Interlace


For the method of incrementally displaying raster graphics, see Interlace (bitmaps).
For the decorative motif used in ancient European and Celtic art, see Migration Period art and Celtic knot.

Interlace is a technique for improving the picture quality of a video transmission without consuming any extra bandwidth. It was invented by RCA engineer Randall C. Ballard in the 1930s.[1][2] It was ubiquitous in television until the 1970s, when the needs of computer monitors resulted in the reintroduction of progressive scan. While interlace can improve the resolution of still images, it can cause flicker and various kinds of distortion. Interlace is still used for most standard-definition TVs and for the 1080i HDTV broadcast standard, but not for LCD, micromirror (DLP), or plasma displays, which are inherently progressive scan. These devices require some form of deinterlacing, which can add to the cost of the set. Nevertheless, as of 2006, progressive displays dominate the HDTV market.


Description

With progressive scan, an image is captured, transmitted and displayed in a path similar to text on a page: line by line, from top to bottom.

The interlaced scan pattern in a CRT (cathode ray tube) display also covers the screen from the top left corner to the bottom right corner, but it scans only every second line. The scan is then repeated, this time starting at the second row, to fill in the gaps left by the first pass over the alternate rows.

Such a scan of every second line is called a field. The afterglow of the phosphor of CRT tubes, in combination with the persistence of vision, results in the two fields being perceived as a single continuous image. This allows full horizontal detail to be viewed with half the bandwidth that a full progressive scan would require, while maintaining the CRT refresh rate necessary to prevent flicker.

[Illustrations: odd field, even field, and the interlaced scan pattern]

Only CRTs can display interlaced video directly; other display technologies require some form of deinterlacing.
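The field structure described above can be illustrated with a small sketch. The snippet below splits a frame, stored simply as a list of scan lines, into its two fields and weaves them back together; the function names and toy frame are illustrative only and are not part of any video standard.

```python
# Illustrative sketch: a frame is stored as a plain list of scan lines.
# Splitting it into alternate lines yields the two fields; re-interleaving
# the fields ("weaving") recovers the original frame.

def split_into_fields(frame):
    """Return (odd_field, even_field).

    Counting scan lines from 1, the odd field holds lines 1, 3, 5, ...
    and the even field holds lines 2, 4, 6, ...  Each field contains
    half the lines, so sending one field per refresh halves the bandwidth.
    """
    odd_field = frame[0::2]    # lines 1, 3, 5, ... (indices 0, 2, 4, ...)
    even_field = frame[1::2]   # lines 2, 4, 6, ... (indices 1, 3, 5, ...)
    return odd_field, even_field

def weave(odd_field, even_field):
    """Interleave two fields back into a full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)
        frame.append(even_line)
    return frame

# A toy six-line "frame" whose lines are just labels.
frame = [f"line {n}" for n in range(1, 7)]
odd, even = split_into_fields(frame)
assert weave(odd, even) == frame
```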

History

When motion picture film was developed, it was observed that the movie screen had to be illuminated at a high rate to prevent visible flicker. The exact rate necessary varies by brightness, with 40 Hz being acceptable in dimly lit rooms, while up to 80 Hz may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame of film three times using a three-bladed shutter: a movie shot at 16 frames per second would thus illuminate the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second allowed a two-bladed shutter to maintain the 48-times-per-second illumination, but only in projectors incapable of projecting at the lower speed.

But this solution could not be used for television: storing a full video frame and scanning it twice would require a frame buffer, which did not become feasible until the late 1980s. In addition, the limits of vacuum tube technology required that CRTs for TV be scanned at AC line frequency to prevent interference (60 Hz in the US, 50 Hz in Europe). In 1936, when the analog standards were being set in the UK, CRTs could only scan at around 200 lines in 1/50th of a second. By using interlace, a pair of 202.5-line fields could be superimposed to become a sharper 405-line frame. The vertical scan frequency remained 50 Hz, so flicker was not a problem, but visible detail was noticeably improved. As a result, this system was able to supplant John Logie Baird's 240-line mechanical progressive scan system that was also in use at the time.

After the Second World War, improvements in technology allowed the US and the rest of Europe to adopt systems using progressively more bandwidth to scan higher line counts and achieve better pictures. The fundamentals of interlaced scanning, however, remained at the heart of all of these systems. The US adopted the 525-line system known as NTSC, Europe adopted a 625-line system, and the UK switched from its 405-line system to 625 lines in order to avoid having to develop a unique method of colour TV. France switched from its unique 819-line system to the more common European standard of 625 lines. Although the term PAL is often used to describe the line and frame standard of the TV system, this is strictly incorrect: PAL refers only to the method of superimposing the colour information on the standard 625-line broadcast. France adopted its own SECAM colour system, which was also taken up by some other countries, notably the Soviet Union and its satellites. PAL colour has also been used on some otherwise NTSC-standard broadcasts, notably in Brazil.

Modern monitors and television sets use active-matrix liquid crystal displays or other display technologies which do not need the afterglow characteristics of CRT displays, because the individual pixels are continuously illuminated. Thus they do not flicker, and they can display progressively scanned material without flicker and with smooth motion, thanks to the motion-blur effect used by film. Newer digital CRT displays still offer better picture quality in several respects than LCDs, DLPs, or plasma displays, while emerging technologies such as SED, laser TV, OLED, and carbon nanotube displays are expected to surpass anything currently available to consumers. DVDs, which contain interlaced content when pure video source material is used, have to be converted for such displays. With film-sourced material, adjacent fields come from the same film frame, but the lines are still recorded in even/odd order (reversed from the actual video standard in the case of the PAL format).

Application

Interlacing is used by all the analogue TV broadcast systems in current use:

  • PAL: 50 fields per second, 625 lines, odd field drawn first
  • SECAM: 50 fields per second, 625 lines
  • NTSC: 59.94 fields per second, 525 lines, even field drawn first
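Since two fields make up one frame, the effective frame rates follow directly from the field rates listed above. The trivial check below illustrates this; the dictionary layout is only for illustration.

```python
# Frame rate is half the field rate, because each frame is built from
# two interlaced fields.
systems = {
    "PAL":   (50.0, 625),
    "SECAM": (50.0, 625),
    "NTSC":  (59.94, 525),
}
for name, (fields_per_second, lines) in systems.items():
    frames_per_second = fields_per_second / 2
    print(f"{name}: {frames_per_second:g} frames/s, {lines} lines")
# PAL: 25 frames/s, 625 lines
# SECAM: 25 frames/s, 625 lines
# NTSC: 29.97 frames/s, 525 lines
```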

Benefits of interlacing

With any video system there are trade-offs. One of the most important factors is bandwidth, measured in megahertz (for analog video) or bit rate (for digital video). The greater the bandwidth, the more expensive and complex the entire system: the camera, storage systems such as tape recorders or hard disks, transmission systems such as cable television networks, and displays such as television monitors.

Interlaced video reduces the signal bandwidth by a factor of two, for a given line count and refresh rate.

Alternatively, a given bandwidth can be used to provide an interlaced video signal with twice the display refresh rate for a given line count (versus progressive scan video). A higher refresh rate reduces flicker on CRT monitors, and it improves the portrayal of motion, because objects in motion are captured and their position updated on the display more often. The human visual system averages the rapidly displayed still pictures into a moving image, so interlace artifacts are not usually objectionable when viewed at the intended field rate on an interlaced video display.

For a given bandwidth and refresh rate, interlaced video can be used to provide a higher spatial resolution than progressive scan. For instance, 1920x1080 pixel interlaced HDTV with a 60 Hz field rate (known as 1080i60) has a similar bandwidth to 1280x720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60), but approximately 50% more spatial resolution in each dimension. (This ignores the effects of data compression, which tends to be more efficient when applied to progressive scan video.)
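As a rough back-of-the-envelope check of the 1080i60 versus 720p60 comparison above, the raw active-pixel rates can be computed as follows. Blanking intervals and compression are ignored, so the numbers are only indicative.

```python
# Raw active-pixel rates, ignoring blanking intervals and compression.
# For interlaced video only half the lines are sent per refresh, so the
# rate is halved for a given line count and refresh rate.

def pixel_rate(width, height, refresh_hz, interlaced):
    rate = width * height * refresh_hz
    return rate / 2 if interlaced else rate

rate_1080i60 = pixel_rate(1920, 1080, 60, interlaced=True)   # ~62.2 Mpixel/s
rate_720p60 = pixel_rate(1280, 720, 60, interlaced=False)    # ~55.3 Mpixel/s

print(f"1080i60: {rate_1080i60 / 1e6:.1f} Mpixel/s")
print(f"720p60:  {rate_720p60 / 1e6:.1f} Mpixel/s")
print(f"ratio:   {rate_1080i60 / rate_720p60:.3f}")  # 1.125, i.e. similar bandwidth
```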

Problems caused by interlacing

Freeze-frame of an interlaced transmission displayed on a progressive display with the simple "weave" method. Combing is clearly visible in the full-size picture.

Interlaced video is designed to be captured, transmitted or stored, and displayed in the same interlaced format. Because interlaced video is composed of two fields captured at different moments in time, interlaced video frames exhibit motion artifacts when both fields are combined and displayed at the same moment. Many types of video display, however, such as liquid crystal and plasma displays, are designed as progressive scan monitors: they illuminate every horizontal line of video with each frame. If these progressive scan monitors display interlaced video, the resulting picture can suffer from reduced horizontal resolution or motion artifacts. These artifacts may also be visible when interlaced video is displayed at a slower speed than it was captured, such as when video is shown in slow motion.

Because modern computer video displays are progressive scan systems, interlaced video has visible artifacts when displayed on them. Computers are frequently used to edit video, and this disparity between computer display systems and television signal formats means that the video content being edited cannot be viewed properly unless separate video display hardware is used.

To minimize the artifacts caused by displaying interlaced video on a progressive scan monitor, a process called deinterlacing can be used. This process is not perfect, and it generally results in a lower resolution, particularly in areas with objects in motion. Deinterlacing systems are integrated into progressive scan television displays in order to provide the best possible picture quality for interlaced video signals.
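As a minimal sketch of one such approach, the snippet below implements simple line-doubling ("bob") deinterlacing, assuming each field is a list of equal-length scan lines; production deinterlacers use far more sophisticated, often motion-adaptive, methods.

```python
# Minimal "bob" deinterlacing sketch: each field is expanded to a full
# frame by filling in the missing lines from their neighbours.  A field
# is a list of scan lines; a scan line is a list of pixel values.
# This halves the vertical detail but avoids combing artifacts.

def bob_deinterlace(field, top=True):
    """Expand one field into a full-height frame by line interpolation.

    top=True means the field carries frame lines 0, 2, 4, ...;
    top=False means it carries frame lines 1, 3, 5, ...  Missing lines
    are the average of the lines above and below (or a copy at the edges).
    """
    frame = [None] * (2 * len(field))
    offset = 0 if top else 1
    for i, line in enumerate(field):         # place the transmitted lines
        frame[2 * i + offset] = line
    for i, line in enumerate(frame):         # fill in the missing lines
        if line is None:
            above = frame[i - 1] if i > 0 else None
            below = frame[i + 1] if i + 1 < len(frame) else None
            if above is not None and below is not None:
                frame[i] = [(a + b) / 2 for a, b in zip(above, below)]
            else:
                frame[i] = list(above if above is not None else below)
    return frame

# Example: a two-line field of three-pixel lines becomes a four-line frame.
field = [[10, 10, 10], [20, 20, 20]]
print(bob_deinterlace(field, top=True))
# [[10, 10, 10], [15.0, 15.0, 15.0], [20, 20, 20], [20, 20, 20]]
```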

Interlace introduces a potential problem called interline twitter. This aliasing effect shows up only under certain circumstances, when the subject being shot contains vertical detail that approaches the vertical resolution of the video format. For instance, a person on television wearing a shirt with fine dark and light stripes may appear on a video monitor as if the stripes on the shirt are "twittering". Television professionals are trained to avoid wearing clothing with fine striped patterns for this reason. High-end video cameras and computer-generated imagery systems apply a low-pass filter to the vertical detail of the signal in order to prevent interline twitter.

This animation demonstrates the interline twitter effect. The interlaced images use half the bandwidth of the progressive one. The center image precisely duplicates the pixels of the progressive one, but interlace causes the details to twitter. Real interlaced video blurs such details to prevent twitter, but, as seen on the right, such softening (or anti-aliasing) comes at the cost of resolution. A line doubler could not restore the image on the right to the full resolution of the image on the left. Note: because the frame rate has been slowed down, additional flicker is visible in the simulated interlaced portions of this image.
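The vertical filtering mentioned above can also be sketched simply: blending each scan line with its neighbours softens one-line-high detail before the frame is split into fields. The 3-tap weights below are an illustrative choice, not taken from any particular camera or standard.

```python
# Illustrative 3-tap vertical low-pass filter of the kind used to soften
# fine vertical detail before interlacing, which reduces interline twitter.
# A frame is a list of scan lines; a scan line is a list of pixel values.

def vertical_lowpass(frame, weights=(0.25, 0.5, 0.25)):
    """Blend each scan line with the lines directly above and below it."""
    w_above, w_centre, w_below = weights
    filtered = []
    for i, line in enumerate(frame):
        above = frame[max(i - 1, 0)]               # clamp at the top edge
        below = frame[min(i + 1, len(frame) - 1)]  # clamp at the bottom edge
        filtered.append([
            w_above * a + w_centre * c + w_below * b
            for a, c, b in zip(above, line, below)
        ])
    return filtered

# One-pixel-high alternating stripes (a worst case for twitter) are
# flattened toward mid grey by the filter.
stripes = [[0, 0], [100, 100]] * 3
print(vertical_lowpass(stripes))
```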

Despite the arguments against it, and calls by prominent technology companies such as Microsoft to leave interlacing to history, interlacing continues to be supported by the television standard-setting organizations and is still included in new digital video transmission formats such as DV, DVB (including its HD modifications), and ATSC.

Interlace and computers

In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line NTSC signal was well beyond the graphics abilities of low-cost computers, so these systems used a simplified video signal that caused each video field to scan directly on top of the previous one, rather than offset to fall between the lines of the previous field. This marked the return of progressive scanning, not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this 240p on NTSC sets and 288p on PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video of this kind. Computer monitor standards such as CGA were further simplifications of NTSC, which improved picture quality by omitting color modulation and allowing a more direct connection between the computer's graphics system and the CRT.

By the 1980s computers had outgrown these video systems and needed better displays. Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7 and 8 MHz of bandwidth to which NTSC and PAL signals were confined. Apple built a custom 342p display into the Macintosh's case, and EGA for DOS PCs was 350p. The Commodore Amiga instead generated a true interlaced NTSC signal (as well as RGB variations). This ability resulted in the Amiga dominating the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications. 1987 saw the introduction of VGA, on which Macs and PCs soon standardized.

In the early 1990s, monitor and graphics card manufacturers introduced newer high resolution standards that once again included interlace. These monitors ran at very high refresh rates, in the expectation that this would alleviate flicker problems. Such monitors proved very unpopular: while the flicker was not obvious at first, eyestrain and lack of focus nevertheless became a serious problem. The industry quickly abandoned the practice, and for the rest of the decade monitors carried the assurance that their stated resolutions were "non-interlaced". This experience is why the PC industry today remains opposed to interlace in HDTV and lobbied for the 720p standard.


References

  1. ^ Pioneering in Electronics. David Sarnoff Collection. Retrieved 2006-07-27.
  2. ^ U.S. Patent 2,152,234
