In this blog post, we'll look at what the black band that appears when you photograph a TV or monitor screen actually is, the science behind it, and a simple way to avoid it.
Have you ever taken a picture of a TV or computer screen with a camera? Did you notice anything strange? A screen that looks perfectly clear to the naked eye often shows a black band when you photograph it with your phone. It can be frustrating: the camera just won't capture the clean screen sitting right in front of you. What is going on?
To understand this phenomenon, we first need to know that the tiny light sources that make up a display do not emit light continuously. Because the on-screen image changes constantly, the display has to draw a new picture over and over, and to do so its countless light sources, the LEDs, switch on and off in short cycles. They flicker in horizontal rows called “scan lines.” The display divides these scan lines into odd and even rows and alternates between them rapidly to build up each picture. The black band, then, is nothing mysterious: it is a group of scan lines caught in their unlit phase, in other words, part of the actual display itself.
However, our eyes never see this. The reason is a phenomenon called the “afterimage.” An afterimage is a visual impression that persists in the eye for a short time after the light stimulus is removed; in other words, the image you just saw lingers briefly in front of your eyes. It lasts for about 1/16 of a second, and visual changes faster than that cannot be perceived. This phenomenon also plays an important role in visual media such as film and animation. The moving images we watch are really nothing more than a rapid succession of still frames, but thanks to the afterimage effect we do not register the individual frames and instead perceive continuous motion. In the same way, the afterimage effect hides a display's flickering scan lines.
Let’s return to displays for a moment. Displays have a spec called the “refresh rate,” the number of times a new picture is drawn per second. Most displays on the market have a refresh rate of 60Hz, meaning the screen is redrawn every 1/60 of a second. Put the two numbers together: our eyes hold onto a stimulus for about 1/16 of a second after the light disappears, while the display completes a full refresh in just 1/60 of a second. Even during the brief moments when an LED is off, the afterimage fills the gap. That is why our eyes never catch the scan lines mid-flicker and instead perceive a continuous picture.
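To put those two numbers side by side, here is a quick back-of-the-envelope check in Python (a minimal sketch; the 1/16-second persistence figure is the rough value quoted above, not a precise physiological constant):

```python
# Rough comparison: display refresh period vs. persistence of vision.
refresh_hz = 60                    # typical refresh rate of a consumer display
persistence_s = 1 / 16             # approximate afterimage duration (~62.5 ms)

refresh_period_s = 1 / refresh_hz  # one full refresh cycle (~16.7 ms)

print(f"One refresh cycle: {refresh_period_s * 1000:.1f} ms")
print(f"Afterimage lasts:  {persistence_s * 1000:.1f} ms")
print(f"Afterimage outlasts a refresh by {persistence_s / refresh_period_s:.1f}x")
```

Because the afterimage outlasts a refresh cycle almost four times over, the brief dark moments never get a chance to register.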
A camera, however, has no afterimage. It simply records whatever light arrives during one very short moment. So a camera exposing for mere hundredths or thousandths of a second captures the scan lines exactly as they are at that instant: the LEDs that happen to be lit show up as the original picture, and the LEDs that happen to be off show up as the “black band” we saw.
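We can sketch this difference as a toy model in a few lines of Python. Everything here is illustrative (the row count, the timings, and the assumption that the whole sensor exposes at once are all simplifications, not measurements from a real device), but it shows why an exposure far shorter than one refresh cycle catches most rows in their dark phase:

```python
# Toy model: a display lighting rows one at a time vs. a very short exposure.
# All numbers are illustrative, not measured from any real device.
ROWS = 1080              # horizontal scan lines on the display
REFRESH_S = 1 / 60       # one full refresh takes about 16.7 ms
EXPOSURE_S = 1 / 1000    # a 1/1000 s snapshot, far shorter than one refresh

row_time = REFRESH_S / ROWS            # time the scan spends on each row

# Only rows scanned inside the exposure window are caught while lit.
lit_rows = int(EXPOSURE_S / row_time)
dark_fraction = (ROWS - lit_rows) / ROWS
print(f"Rows caught lit: {lit_rows} of {ROWS}")
print(f"The dark band covers about {dark_fraction:.0%} of the frame")
```

In this simplified model, a 1/1000-second exposure catches only a few dozen rows while they are lit; the rest of the frame comes out as the black band.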
So how can we prevent this band? In place of the afterimage, a camera has something called “exposure.” Classic examples of using exposure are a photographer shooting fireworks or an astrophotographer capturing star trails across the night sky. Just as our pupils dilate to gather as much light as possible in the dark, a camera can hold its shutter open for a long time, letting light stream in through the aperture (the opening through which light enters) and accumulate on the sensor. This duration is called the exposure time. Any movement during the exposure time is “accumulated” as light and revealed in the photo. This is what can stand in for the afterimage the camera lacks.
In the end, the fix is simply to increase the exposure time. The longer the shutter stays open, the more chances every LED on the display has to pass through its on-and-off cycle while the sensor is collecting light. Since a 60Hz display completes one full refresh every 1/60 of a second, setting the exposure time to at least 1/60 of a second guarantees that every scan line lights up at some point during the exposure; going a bit longer, to around 1/30 or 1/15 of a second (close to the roughly 1/16 of a second our afterimage lasts), gives a comfortable margin and produces a smooth, clean picture of the display. With a sufficiently long exposure, the black band caused by flickering simply cannot appear. Of course, a longer exposure also gathers more light and brightens the image, so you need to compensate by narrowing the aperture or lowering the ISO, the two settings that control brightness.
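As a concrete rule of thumb, the exposure just needs to span at least one full refresh cycle. The helper below is a hypothetical illustration (pick_shutter_speed is not a real camera API; the list of standard shutter stops, however, matches real camera settings):

```python
# Choose the fastest standard shutter speed that still spans one full refresh.
# Speeds are given as denominators: 1000 means a 1/1000 s exposure.
STANDARD_SHUTTER_DENOMS = [1000, 500, 250, 125, 60, 30, 15, 8]

def pick_shutter_speed(refresh_hz: float) -> str:
    """Return the fastest standard speed no shorter than one refresh period."""
    for denom in STANDARD_SHUTTER_DENOMS:        # ordered fast -> slow
        if 1 / denom >= 1 / refresh_hz:
            return f"1/{denom} s"
    return f"1/{STANDARD_SHUTTER_DENOMS[-1]} s"  # fall back to the slowest

print(pick_shutter_speed(60))    # -> 1/60 s: exactly spans one 60 Hz refresh
print(pick_shutter_speed(120))   # -> 1/60 s: 1/125 s falls just short of 1/120
```

Going one stop slower than the helper suggests (say 1/30 s for a 60Hz screen) leaves extra margin, and you compensate for the added light by stopping down the aperture or lowering the ISO, exactly as described above.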
In other words, the black band our eyes could never see was not some “strange thing” but the real display at work: its flickering “scan lines.” Our eyes were simply tricked into missing it. The human eye is a remarkable all-purpose camera, with contrast, perspective, and focus all effectively set to “AUTO,” but when it comes to displays, isn't the camera seeing better than our eyes?
With the development of the camera, we have entered the world of digital images. The camera records reality in a different way from the human eye, and that difference sometimes offers us a new visual experience. In the end, what we “see” depends not simply on what our eyes register but on how we perceive and interpret it, and devices like the camera are tools that expand that vision.