Flicker (screen)
From Wikipedia, the free encyclopedia
For other uses of the word, see Flicker.
Flicker is a visible fading between image frames displayed on a cathode ray tube (CRT) based monitor. It occurs when the CRT is driven at a low refresh rate, allowing the screen's phosphors to lose their excitation (afterglow) between sweeps of the electron gun.
For example, at a vertical refresh rate of 60 Hz, most CRT computer monitors produce a visible flickering effect unless they use phosphors with long afterglow. Most people find that refresh rates of 70-80 Hz and above enable flicker-free viewing on CRTs. Refresh rates above 120 Hz are uncommon, as they provide no noticeable further flicker reduction.
Flat panel displays do not flicker with the screen update rate, because their active-matrix liquid crystal panels use a transistor at each pixel to hold its state between updates. They may, however, show flicker originating from the backlight, although backlights are normally driven at high enough frequencies for this to be invisible.
The exact refresh rate necessary to prevent the perception of flicker varies greatly with the viewing environment and the display's brightness. In a completely dark room, a sufficiently dim display can run as low as 30 Hz without visible flicker; at normal room and TV brightness, the same rate would produce flicker so severe as to be unwatchable.
Another factor in detecting flicker is peripheral vision. The human eye is most sensitive to flicker at the edges of the field of view and least sensitive at the center of gaze (the area being focused on). As a result, the greater the portion of the field of view occupied by a display, the greater the need for a high refresh rate. This is why computer monitor CRTs, viewed up close, usually run at 70 to 80 Hz, while TVs, which are viewed from further away, are seen as acceptable at 60 or even 50 Hz (see PAL and NTSC).
Refresh rate versus frame rate
In evaluating flicker, it is important to distinguish between refresh rate (how often the screen is illuminated) and frame rate (how often the displayed image can change). Motion picture film has a standardized frame rate of 24 frames per second. To reduce the flicker that such a low refresh rate would cause, the projector illuminates each frame twice before the film advances to the next one, so the viewer sees 48 flashes per second and no longer notices the dark periods between them.
Analog television sets never vary their refresh rate: NTSC always uses 59.94 Hz, so slower source material such as film must have its frames repeated as needed to fill the NTSC refresh rate (a sketch of the standard repetition pattern follows). Thus a low frame rate cannot cause flicker on TV, but it can cause jerky motion, as with old silent films shot at 12-20 frames per second.
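The standard way to fit 24 frame/s film into NTSC's roughly 60 fields per second is 3:2 pulldown, in which successive film frames are held for three fields, then two. The C fragment below is only an illustrative sketch of that cadence, not a telecine implementation:

    #include <stdio.h>

    /* Sketch: 3:2 pulldown maps 4 film frames onto 10 NTSC fields,
     * turning 24 frames/s into 60 fields/s (59.94 in practice).
     * Frames are only repeated, never changed faster than the source,
     * so the refresh rate stays high enough to avoid flicker; the
     * cost is slight judder rather than flicker. */
    int main(void)
    {
        const int fields_per_frame[4] = { 3, 2, 3, 2 };  /* the 3:2 cadence */
        int field = 0;

        for (int frame = 0; frame < 4; frame++)
            for (int i = 0; i < fields_per_frame[frame]; i++)
                printf("field %2d shows film frame %c\n", field++, 'A' + frame);

        return 0;  /* prints 10 fields: A A A B B C C C D D */
    }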
Hardware artifacts
Flicker can also refer to a phenomenon seen on computers and gaming consoles when more objects must be displayed than the hardware can render simultaneously. In these cases, one screen refresh displays the first set of objects while the next refresh displays the remaining ones. Observed in real time, the viewer still sees all of the objects, but they appear and disappear rapidly.
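A minimal sketch of this time-multiplexing, with hypothetical counts standing in for real hardware limits: each refresh draws only as many objects as fit, rotating the starting point so that every object is shown on some refreshes and omitted on others.

    #include <stdio.h>

    #define NUM_OBJECTS 12   /* objects the program wants on screen */
    #define MAX_VISIBLE  8   /* per-refresh hardware limit (hypothetical) */

    /* Sketch: rotate which objects are drawn on each refresh so that all
     * of them are visible over time. The objects omitted from any given
     * refresh are what the viewer perceives as flicker. */
    int main(void)
    {
        for (int refresh = 0; refresh < 3; refresh++) {
            int start = (refresh * MAX_VISIBLE) % NUM_OBJECTS;
            printf("refresh %d draws objects:", refresh);
            for (int i = 0; i < MAX_VISIBLE; i++)
                printf(" %d", (start + i) % NUM_OBJECTS);
            printf("\n");
        }
        return 0;
    }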
Software artifacts
Flicker, a flashing effect displeasing to the eye, also occurs through flaws in software, with no hardware faults involved. Software flicker is caused by a program's failure to maintain a consistent graphical state: for example, as the program draws, video memory may momentarily contain a white rectangle before the text that belongs there is drawn. Normally this form of flicker manifests as random banding and tearing, but if the update period is long enough the entire region may flicker by being frequently and completely blank, or left in some other intermediate state. This form of flicker is endemic on the Windows operating system, in part because of mechanisms that encourage programmers to place intermediate pixel states into video memory, e.g. the WM_ERASEBKGND message. Below are some programming methods to prevent this form of flicker:
Disable CS_VREDRAW and CS_HREDRAW
These window class styles direct Windows to redraw the entire program window whenever it is resized. Disabling them causes only the areas uncovered by resizing to be invalidated for repainting. However, flicker may still occur at the edges, or when the window's contents are updated for other reasons.
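A minimal Win32 sketch of such a registration (the class name and window procedure are placeholders): the style field simply omits CS_HREDRAW and CS_VREDRAW.

    #include <windows.h>

    LRESULT CALLBACK WndProc(HWND, UINT, WPARAM, LPARAM);  /* defined elsewhere */

    ATOM RegisterFlickerFreeClass(HINSTANCE hInstance)
    {
        WNDCLASS wc = { 0 };

        /* Deliberately omit CS_HREDRAW | CS_VREDRAW: a resize then
         * invalidates only the newly exposed region, not the whole window. */
        wc.style         = 0;
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = hInstance;
        wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
        wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
        wc.lpszClassName = TEXT("FlickerFreeWindow");

        return RegisterClass(&wc);
    }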
Avoid WM_ERASEBKGND
Whenever possible, programmers should refrain from the practice of blanking an area and then drawing "on top" of it, because this practice allows the blank region to appear momentarily on screen. Instead, each pixel should be set only once. When this is not feasible, double buffering (below) is a good last resort.
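One common Win32 pattern is to claim the background erase as already handled and then write each pixel exactly once during WM_PAINT; the following window procedure fragment is a minimal sketch:

    #include <windows.h>

    /* Sketch: suppress the separate background erase and paint every pixel
     * exactly once in WM_PAINT, so no intermediate blank state is shown. */
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg) {
        case WM_ERASEBKGND:
            return 1;  /* nonzero: tell Windows the background is handled */

        case WM_PAINT: {
            PAINTSTRUCT ps;
            HDC hdc = BeginPaint(hwnd, &ps);
            /* ETO_OPAQUE fills the rectangle and draws the text in a single
             * operation, so each pixel is written only once. */
            SetBkColor(hdc, GetSysColor(COLOR_WINDOW));
            ExtTextOut(hdc, 10, 10, ETO_OPAQUE, &ps.rcPaint,
                       TEXT("Hello"), 5, NULL);
            EndPaint(hwnd, &ps);
            return 0;
        }
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }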
Employ double buffering
Double buffering is often used for complex GUI surfaces such as web pages, where isolating individual changes would be too difficult. The method involves creating an offscreen drawing surface (in GDI, a bitmap selected into a memory device context), drawing to it, and then blitting it all at once to the screen. While this technique eliminates flicker entirely, it can be inefficient, since it costs an extra copy of the drawing area and an extra blit on every update.
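A minimal GDI sketch of the technique: the whole scene is drawn into a memory bitmap and made visible with a single BitBlt. For brevity the buffer is created and destroyed on every paint; real code would cache it, and would also suppress WM_ERASEBKGND as above so the erase pass does not reintroduce flicker.

    #include <windows.h>

    /* Sketch: double-buffered painting. Everything is drawn into an
     * offscreen bitmap first, then copied to the window in one BitBlt,
     * so the screen never shows a partially drawn state. */
    void PaintDoubleBuffered(HWND hwnd)
    {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);

        RECT rc;
        GetClientRect(hwnd, &rc);

        HDC memDC = CreateCompatibleDC(hdc);
        HBITMAP bmp = CreateCompatibleBitmap(hdc, rc.right, rc.bottom);
        HBITMAP old = (HBITMAP)SelectObject(memDC, bmp);

        /* Draw the whole scene offscreen; intermediate states stay invisible. */
        FillRect(memDC, &rc, (HBRUSH)GetStockObject(WHITE_BRUSH));
        TextOut(memDC, 10, 10, TEXT("Drawn offscreen"), 15);

        /* One blit makes the finished image visible all at once. */
        BitBlt(hdc, 0, 0, rc.right, rc.bottom, memDC, 0, 0, SRCCOPY);

        SelectObject(memDC, old);
        DeleteObject(bmp);
        DeleteDC(memDC);
        EndPaint(hwnd, &ps);
    }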
Slang
"The flicks" is an old English expression for the cinema, or the building where films are shown, as in "Are you going to the flicks tonight?" It is less often used today, having largely been replaced by "the cinema" or "the pictures" (the latter derived from "the picture house").