720p
From Wikipedia, the free encyclopedia
720p is the shorthand name for a category of HDTV video modes. The number 720 stands for the 720 lines of vertical display resolution, while the letter p stands for progressive scan, i.e. non-interlaced. When broadcast at 60 frames per second, 720p offers the highest temporal (motion) resolution possible under the ATSC standard. Because progressive scanning does not require fine detail to be filtered out to prevent interline flicker, its spatial (sharpness) resolution is much closer to 1080i's than the scan-line count alone would suggest.
Specifications
720p assumes a widescreen aspect ratio of 16:9, a vertical resolution of 720 pixels and a horizontal resolution of 1280 pixels, for a total of about 0.92 million pixels. The frame rate (in this case equal to the field rate) can be either implied by the context or specified in hertz after the letter p. The five 720p frame rates in common use are 24, 25, 30, 50 and 60 Hz (or frame/s). In general, traditional PAL and SECAM countries (Europe, Australia, much of Asia, Africa, and parts of South America) are or will be using the 25p and 50p frame rates, whereas traditional NTSC countries (North and Central America, Japan, South Korea, Philippines) are using 24p (for movies) and 60p (for high-motion programming). All variants can be transported by both major digital television formats, ATSC and DVB.
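The resolution arithmetic above can be checked with a trivial sketch:

```python
from math import gcd

# 720p frame geometry: 720 lines, 1280 pixels per line
width, height = 1280, 720

total_pixels = width * height
print(total_pixels)        # 921600, i.e. about 0.92 million pixels
print(total_pixels / 1e6)  # 0.9216

# The pixel dimensions reduce exactly to the 16:9 aspect ratio
g = gcd(width, height)
print(f"{width // g}:{height // g}")  # 16:9
```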
Compatibility
720p is directly compatible with newer flat-panel technology such as plasma and LCD, which are inherently progressive and must perform deinterlacing to display 1080i source material. 720p must be scan-converted for display on most CRT-based consumer televisions, which are generally interlaced-only display devices.[1] However, CRTs intended for use as computer monitors are progressive-only devices that are generally able to support 1280×720 at 60 Hz.
History
720p was designed at AT&T Bell Laboratories in the late 1980s, under the supervision of Arun Netravali. The project began when Zenith approached AT&T to partner in the design of an analog HDTV format comparable to the Japanese system. Netravali (in Murray Hill), along with Barry Haskell (in Holmdel) and other image processing experts at Bell Labs, and William Schreiber at MIT, quickly devised a digital standard using DCT block coding. About 50 engineers were hired and a prototype was assembled in Murray Hill using Xilinx programmable logic hardware. The leaders of Zenith and AT&T cancelled the analog-HDTV project after the completion of the digital 720p experimental system, and Zenith agreed to design a radio-frequency modem system for broadcasting digital video. The 720p system was tested against competing standards during FCC trials, and was particularly notable for its lack of flicker and shimmer of moving edges.[citation needed] The conflict between interlaced formats (supported by the television industry) and progressive scan formats (supported by AT&T, Microsoft and others) was extremely contentious in the early days of format proposals.
720p versus 1080i
Some broadcasters use 720p50/60 as their primary high-definition format; others use the 1080i standard. While 720p presents a complete 720-line frame to the viewer between 24 and 60 times each second (depending on the format), 1080i presents the picture as 50 or 60 partial 540-line "fields" per second (although 24 complete 1080-line frames per second, "24p", is also included in the ATSC standard), which the human eye or a deinterlacer built into the display device must visually and temporally combine into a 1080-line picture. To get all 1080 interlaced lines to appear on the screen at the same time on a progressive high-definition display, the processor within the HD set has to weave together both 540-line fields to form the full-resolution frame: it holds the first field in memory, receives the next field, then electronically knits the two together, displaying the combined fields at once as a complete 1080p frame. The main trade-off between the two formats is that 1080i shows more detail than 720p for a stationary shot of a subject, at the expense of a lower effective refresh rate and the introduction of interlace artifacts during motion.
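The weave step described above can be sketched in a few lines of NumPy. This is a minimal illustration of field interleaving, not broadcast-grade deinterlacing (real deinterlacers also detect motion between fields):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two 540-line fields into one 1080-line frame.

    top_field carries scan lines 0, 2, 4, ... of the frame;
    bottom_field carries lines 1, 3, 5, ...
    Each field has shape (540, width); the result is (1080, width).
    """
    lines, width = top_field.shape
    frame = np.empty((2 * lines, width), dtype=top_field.dtype)
    frame[0::2] = top_field      # even-indexed scan lines
    frame[1::2] = bottom_field   # odd-indexed scan lines
    return frame

# Two dummy 540-line fields of a 1920-pixel-wide picture
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
frame = weave(top, bottom)
print(frame.shape)  # (1080, 1920)
```

Weaving works perfectly only when the two fields come from the same instant (as with film-originated 24p material); with true interlaced video, motion between the fields produces the combing artifacts mentioned above.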
While 1080i has more scan lines than 720p, they do not translate directly into greater vertical resolution. Interlaced video is usually blurred vertically (filtered) to prevent twitter: a flickering of fine horizontal lines in a scene, lines so fine that they occupy only a single scan line. Because only half the scan lines are drawn per field, such a line may be missing entirely from one of the fields, causing it to flicker on and off. Images are therefore blurred vertically to ensure that no detail is only one scan line in height, which means 1080i material does not deliver a full 1080 scan lines of vertical resolution. However, 1080i provides a 1920-pixel horizontal resolution, greater than 720p's 1280 pixels.
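The anti-twitter filtering can be illustrated with a simple vertical low-pass filter. The 1/4-1/2-1/4 kernel here is an illustrative choice, not a standard's prescription; the point is that a one-scan-line-high detail gets spread across neighbouring lines so it appears in both fields:

```python
import numpy as np

def vertical_lowpass(frame):
    """Blur each pixel with its vertical neighbours (1/4, 1/2, 1/4 kernel)
    so that no detail is confined to a single scan line."""
    padded = np.pad(frame.astype(np.float64), ((1, 1), (0, 0)), mode='edge')
    return 0.25 * padded[:-2] + 0.5 * padded[1:-1] + 0.25 * padded[2:]

# A one-scan-line-high white line on a black background
frame = np.zeros((8, 4))
frame[4] = 255.0
filtered = vertical_lowpass(frame)
# The line's energy is now spread over scan lines 3, 4 and 5
# (63.75, 127.5, 63.75), so it lands in both interlaced fields
# instead of flickering at half the frame rate.
```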
In the USA, 720p is used by ABC, the Fox Broadcasting Company and ESPN because the smoother image is desirable for fast-action sports telecasts, whereas 1080i is used by CBS, NBC, HBO, Showtime and Discovery HD for its crisper picture, particularly in static shots.
The European Broadcasting Union (EBU) recommends that its members use 720p50, with the possibility of 1080i50 on a programme-by-programme basis and 1080p50 as a future option.[2][3][4] The BBC is one of the EBU members transmitting in HDTV; it has not yet made a final decision on picture scanning format. Sveriges Television in Sweden, Cyfra+ in Poland, SRG SSR idée suisse in Switzerland and ORF in Austria broadcast in 720p50. All other commercial European HDTV services so far use 1080i50.
References
- ^ archive2.avsforum.com/avs-vb/showthread.php?t=489948. Retrieved on 2007-07-06.
- ^ EBU Technical Recommendation R112 - 2004. Retrieved on 2007-07-06.
- ^ EBU Technical Recommendation R112 - 2004. Retrieved on 2007-07-06.
- ^ EBU Technical Review. Retrieved on 2007-07-06.