Image talk:Resolution chart.svg
Link back to Resolution chart.svg
http://en.wikipedia.org/w/index.php?title=Image:Resolution_chart.svg
For some reason, the "image" tab above links to Editing Image:Resolution chart.svg, rather than the original graphic. Tvaughan1 16:54, 1 October 2006 (UTC)
This Resolution chart needs improvement in several areas
Hi all.
I'd like to propose a few improvements, and I'd like to work with whoever the original author of the chart is (it is unclear to me... but I've written to everyone who is listed as having edited this image).
The description for the image currently reads ... "It does not accurately reflect the screen shape (aspect ratio) of these formats, which is always stretched or squeezed to 4:3 or 16:9. The table assumes an average vertical detail loss of .75x due to interlace. The actual loss is variable due to content, motion, opinion on acceptable levels of flicker, and possible success of deinterlacing. 1920x1080i is not included because all common use of 1080i is filtered to 1440 or less."
First, I propose that the resolution chart simply show the resolutions of the different television video specifications. I don't understand why interlace artifacts are discussed or factored into this chart. It would seem to be better to reference interlace artifacts as a separate discussion. The resolution of HDTV is identical for interlaced or progressive scan formats for the same signal type (1920x1080 or 1280x720). The current chart is totally misleading, as it seems to indicate that both the vertical and horizontal resolution of 1080i are less than those of 1080p. This is not the case... the resolution of these two formats is 1920x1080. Motion artifacts are not the issue here. There is no vertical detail loss of 0.75x... HDTV is a digital video standard, and both 1080i and 1080p use 1080 horizontal lines. 1080i is shot AND displayed using interlacing... deinterlacing is only necessary if the display technology is both progressive AND slower than 60 Hz (unlikely).
I don't understand the justification for the statement "all common use of 1080i is filtered to 1440 or less". 1080i is defined by SMPTE 274M, and is broadcast in the US using ATSC. Neither of these specifications defines or filters the horizontal resolution to 1440. Only HDV uses anamorphic pixels to record 1440x1080 as 1920x1080. Most of the broadcast 1080i material is shot and edited with full 1920x1080 resolution equipment. Most consumers don't have HDV camcorders, nor do they care that HDV uses anamorphic pixels to record a video frame using 1440x1080 sensors, despite the fact that the picture will be displayed as 1920x1080. 1440x1080 is a topic for the HDV page only.
The problem with this chart is compounded by the fact that it has been referenced on so many pages.
I look forward to comments from interested Wikipedians. Tvaughan1
Resolution Chart
(copied from User_Talk:Algr)
Algr, Are you the original author of this resolution chart? http://en.wikipedia.org/wiki/Image:Resolution_chart.svg
I have some questions about this chart, and the description... "It does not accurately reflect the screen shape (aspect ratio) of these formats, which is always stretched or squeezed to 4:3 or 16:9. The table assumes an average vertical detail loss of .75x due to interlace. The actual loss is variable due to content, motion, opinion on acceptable levels of flicker, and possible success of deinterlacing. 1920x1080i is not included because all common use of 1080i is filtered to 1440 or less."
I don't understand why interlace artifacts are discussed or factored into this chart. It would seem to be better to reference this as a separate discussion. The resolution of HDTV is identical for interlaced or progressive scan formats for the same signal type (1920x1080 or 1280x720).
I don't understand the justification for the statement "all common use of 1080i is filtered to 1440 or less". 1080i is defined by SMPTE 274M, and is broadcast in the US using ATSC. Neither of these specifications filters the horizontal resolution to 1440. Only HDV uses anamorphic pixels to record 1440x1080 as 1920x1080. Most of the broadcast 1080i material is shot and edited with full 1920x1080 resolution equipment.
This chart has been placed on many different Wikipedia pages, and I am interested in working with the author(s) to improve it.
Thank you!
Tom Vaughan User:Tvaughan1 19:46, 27 September 2006 (UTC)
- Yes, I created that chart. The reason for the .75x interlace loss is that pixels do not automatically translate into detail. While you can COUNT 1080 lines in 1080i, you can't USE them as you would in 1080p - you must blur out details that would cause one field to become brighter than the other. This blurring always happens before broadcast (usually in the camera) and can't be undone by a de-interlacer - the detail is removed, not just hidden. This is why VGA is so much sharper than NTSC, even if the NTSC has component inputs. The chart is intended to show how much detail you can actually SEE with these formats. User:Algr
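For reference, a minimal sketch of the arithmetic the chart is built on (the 0.75 interlace factor is the chart's own assumption, not a figure taken from any standard, and the function name is only illustrative):
<pre>
# Sketch of the chart's assumption: an "interlace factor" of 0.75 applied to
# the nominal line count to estimate usable vertical detail. The 0.75 value
# is the chart author's estimate, not a number from SMPTE or ATSC documents.
def effective_vertical_detail(lines, interlaced, interlace_factor=0.75):
    """Estimate usable vertical detail, in lines."""
    return lines * interlace_factor if interlaced else lines

print(effective_vertical_detail(1080, interlaced=True))   # 810.0
print(effective_vertical_detail(1080, interlaced=False))  # 1080
</pre>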
This isn't true at all. Why would one field become brighter than another, just because there are 2 interlaced fields per frame? Why would blurring reduce brightness? Do you have any references for this idea? Have you ever seen broadcast quality NTSC? It's sharper than VGA... roughly 800 lines of horizontal resolution. Yes... motion of the camera or subject can cause a reduction in detail when the video is shot and displayed with interlacing... but this is a separate subject from the native resolution of the signal format. If the entire capture, storage/transmission and display system is interlaced, motion artifacts are nearly zero (which is what NTSC or 1080i on a CRT are designed to be). I understand that displaying interlaced video on a native progressive scan monitor will introduce deinterlacing... but that is a separate subject.
- As for the horizontal resolution, it is not just HDV, but most HD formats that run at 1440. DVCPRO HD is even lower: 1280 x 1080i pixels. Even uncompressed formats reduce pixel count. Attack of the Clones was shot with 1440 horizontal pixels. Newer high end gear can shoot true 1920, but this is only being used in 24p mode. Even when such is available (Revenge of the Sith), broadcasters have found that they get too many artifacts if they try to broadcast it that way - 19 Mbps just isn't enough data. So they filter any 1080i down to 1440 or less, so that the softer images will look cleaner. Blu-ray and HD DVD run at higher data rates, so in theory they could use 1920x1080i, but in practice, all the content that people want to buy disks of is 1080p, and/or film. Hence 1920x1080i is not "in common usage", and isn't included. User:Algr
HDV and DVCPRO HD are consumer and semi-professional formats. They aren't mainstream professional formats... they are second-tier formats for people who can't afford the real deal. HDTV programs that you watch on TV aren't shot, edited, or broadcast in these formats.
The standard professional high-definition serial interface is HD-SDI (SMPTE 292M). This interface carries the high-definition video signal uncompressed; the data rate of an uncompressed HDTV signal over HD-SDI is 1.485 Gbit/s. 1080i HDTV is broadcast using ATSC.
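As a cross-check, the 1.485 Gbit/s figure can be reconstructed from the SMPTE 274M raster totals for the 30-frame formats; this is only a sketch, assuming 10-bit 4:2:2 sampling as carried over the serial interface:
<pre>
# Where the 1.485 Gbit/s HD-SDI rate comes from, assuming the 1080-line,
# 30-frame raster of SMPTE 274M (2200 total samples per line and 1125 total
# lines, including blanking) carried as 10-bit 4:2:2.
samples_per_line = 2200   # 1920 active + horizontal blanking
lines_per_frame  = 1125   # 1080 active + vertical blanking
frames_per_sec   = 30
words_per_sample = 2      # one luma word plus one multiplexed chroma word (4:2:2)
bits_per_word    = 10

bit_rate = (samples_per_line * lines_per_frame * frames_per_sec
            * words_per_sample * bits_per_word)
print(bit_rate / 1e9)     # 1.485 (Gbit/s)
</pre>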
Star Wars "Attack of the Clones" was shot in 1080p24 with Sony HDW-F900 CineAlta cameras - which use full 1080 resolution sensors... http://bssc.sel.sony.com/Professional/docs/brochures/hdwf900_v11122b.pdf http://www.imdb.com/title/tt0121765/technical
1080i and 1080p are 1920x1080 signals. There is no such thing as 1440x1080 HDTV... even if the sensor is 1440x1080, or if the signal is down-sampled, or if the display is 1440x1080.
True, HDCAM cameras downsample to 1440x1080... but other cameras and storage systems don't down-sample... and live 1080i broadcasts aren't downsampled.
My concern with the resolution chart is that you are showing that 1080i has a native resolution less than 1920x1080... and it doesn't. Done right, 1080i has 1920x1080 resolution.
We should carry on this discussion on the talk page for the image... OK? User:Tvaughan1 19:13, 29 September 2006 (UTC)
Pixel count does not equal sharpness
The image is an accurate representation of how much detail people can expect to see when those systems are used. You don't need a chart just to tell you that one number is larger than another. Read the article on interlace. Here is an important image:
Notice that the three images are all identical at 96 pixels tall. The middle image shows what happens if you don't blur out details when an image is interlaced. Notice the flag and edges shimmering? All real interlaced video has to be blurred as on the right to reduce this flicker.
VHS and DVD are both "480i" but that doesn't mean that both have the same picture quality. The issue is the same with 1440 horizontal pixels. Even though they are technically transmitting 1920 pixels, they can't USE them without choking MPEG, so everything is getting filtered down. The reference you gave is an ad flyer, so you have to be careful how you read it. They do not say the CAMERA provides 1920 pixels, they say that the CIF standard does. If you converted a VHS tape to CIF then it too would have 1920 pixels - but it wouldn't LOOK any sharper. Pixel count does not equal sharpness. Algr 20:42, 29 September 2006 (UTC)
- Your example is not an accurate representation of how interlaced video is displayed... as mentioned by a couple of people on the talk page for the Interlace article... Talk:Interlace. Interlaced video reduces flicker... it doesn't cause flicker. The phenomenon that I think you are trying to describe is called "twitter". This phenomenon is noticed when the vertical detail of an object approaches the vertical resolution of the video capture / display technology. Twitter is not unique to interlaced video... it occurs with progressive scan systems also.
- If you are living in the progressive scan (computer video) world, and you have never worked professionally with NTSC or PAL video systems, it would take many pages of explanation to help you understand how interlaced video systems work... and work well. To understand interlaced video systems, you need to consider the fact that the entire system (camera, storage, transmission, and display) works with the same interlaced signal. The signal is NEVER deinterlaced (at least, that is the way interlaced systems worked for more than 60 years). Interlace artifacts are ONLY an issue if one attempts to view a static frame of video with 2 interlaced fields, or if we have a need to convert interlaced video to progressive scan (for display on an inherently progressive scan monitor, for instance).
- VHS is not "480i"... it is an analog video standard. The horizontal resolution of VHS is limited by its bandwidth (about 3 MHz). The reference I gave on your talk page was the camera's spec sheet, which shows that the camera has 2.2 megapixel sensors.
- "choking MPEG"? MPEG compression uses both spatial and temporal compression. Generally you will have a better quality image if you start with more resolution before you figure out how to throw away bits. MPEG compression quality is highly dependent on the type of video picture... the motion, the detail, the rate of scene changes, and so on. The folks who developed the ATSC 1080i video signal standard knew what they were doing. MPEG-2 compression works just fine at 1920x1080 resolution, 30 frames / 60 fields per second. Tvaughan1 18:44, 30 September 2006 (UTC)
-
- Tvaughan1, I've worked extensively with analog video since the late 1980s starting with 3/4 SP. I was designing and maintaining Betacam SP online editing suites back in 1991, and did my first digital edits in 1995. The information I've posted is accurate based on both trade publications and my own experience.
- I think you are getting caught up with the verbal sense of what some of the terminology sounds like at the expense of what is actually happening. For example, regardless of VHS's bandwidth, it still produces exactly the same 525 NTSC lines of resolution that a DVD's composite out would have. In that sense, it is 480i because it is NTSC. The fact that VHS isn't digital misses the point I was making, which is the title of this section.
- We fixed the tearing issue that the others had with the .gif by slowing it down. Only some computers saw this problem.
- Throwing more and more pixels at the same digital bandwidth and codec is NOT always going to give you a better picture. Beyond a certain point, it will become less efficient. MPEG-2 did not improve as fast as the designers of HDTV thought it would.
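A back-of-the-envelope sketch of that bit-budget argument, assuming a fixed video payload of roughly 19.3 Mbit/s (real MPEG-2 efficiency varies strongly with content, so these are averages only):
<pre>
# Average bits available per coded pixel at a fixed ~19.3 Mbit/s payload.
# The point: more pixels at the same bit rate means fewer bits per pixel.
bitrate = 19.3e6  # bits per second, approximate ATSC video payload (assumed)

formats = {
    "1920x1080i30": (1920, 1080, 30),
    "1440x1080i30": (1440, 1080, 30),
    "1280x720p60":  (1280, 720, 60),
}
for name, (w, h, fps) in formats.items():
    print(f"{name}: {bitrate / (w * h * fps):.2f} bits/pixel")
</pre>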
- The only sense in which interlace reduces flicker (or twitter) is compared to maintaining the same line count, but cutting the refresh rate in half. But no one would ever design an analog video system to work that way, as it would be unwatchable. Progressive 'NTSC' does exist - older videogames like the Super Nintendo generate 240p video at 60 frames per second. It does have less twitter and flicker than legal NTSC, as well as cleaner motion. (But of course you can't broadcast such a signal, and it isn't as sharp as properly interlaced TV.)
- You have to read things carefully in video, as there is a fair amount of bad terminology out there - confusingly named technical issues that are not what they sound like. Not to mention outright misinformation. People tend to believe the first thing they read, and assume that other descriptions they see later must be wrong - you have to examine and experiment with real video closely and see which description matches what you are actually seeing. Otherwise you can be led into an ideological trap - witness the film purists who insist that HD cinema is no sharper than 16mm despite all visual evidence to the contrary. Algr 22:10, 30 September 2006 (UTC)
(Cross post - below was written at the same time as above.)
- One of the first rules that video professionals understand is that you cannot look at individual frames of video, or slowed down video, in order to make a fair comparison of the quality of different systems. You can't evaluate interlaced video on a progressive scan display. If you evaluate an MPEG encoder you might notice lots of artifacts when viewing still frames that are not perceptible when viewed in real time. To understand video systems you must first understand the human visual system, and how detail, color, and motion are perceived. Interlace works because the temporal sensitivity of the human visual system decreases at higher spatial frequencies. The job of the video system is to deliver a good moving image... not great still frames. 1080p is not inherently superior to 1080i if the entire system is optimized for interlaced video (1080i). 1080i30 updates motion 60 times per second while 1080p30 only updates motion 30 times per second... at the same bandwidth. Due to the persistence of the phosphors on a CRT television, and due to the Gaussian spot profile of the beam, the human visual system doesn't perceive the "blurring" of the vertical edges of objects in motion to be objectionable. In fact, this type of distortion is less objectionable than the flicker that occurs with progressive scan display systems that only update an object's position 30 times per second. Perception of flicker is dependent on a number of factors, such as the brightness level of the display and the ambient lighting in the room. But flicker can be perceived at 30 Hz. Tvaughan1 21:46, 30 September 2006 (UTC)
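A quick sketch of the bandwidth-equivalence point above (raw sample rate only; it says nothing about perceived detail, which is the disputed part):
<pre>
# 1080i30 sends half-height fields twice as often as 1080p30 sends frames,
# so the raw pixel rate is identical.
interlaced_rate  = 1920 * 540 * 60    # 60 fields/s of 540 lines each
progressive_rate = 1920 * 1080 * 30   # 30 frames/s of 1080 lines each
print(interlaced_rate, progressive_rate, interlaced_rate == progressive_rate)
</pre>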
-
- " progressive scan display systems that only update an object's position 30 times per second." - No real CRT monitor scans at 30 hz - it would be like staring into a strobe light. If you are talking about 30 fps vrs 60 fps, that is a totally different subject then interlace vrs progressive - both formats can put new images on screen 60 times per second. Algr 22:16, 30 September 2006 (UTC)
-
2
- Algr - I'm glad to hear that I'm discussing this topic with a professional. I am a professional also... I have a B.S. in Electrical and Computer Engineering, and I work in the DVD business. You'll have to trust me when I say that I'm not confused by any of the terminology that we are discussing, nor have I fallen into a trap. I've done my homework on the subject, and I've worked with some of the best experts in the field, including major movie studios and post production houses. I can say with confidence that 1080i HDTV has full 1920x1080 resolution, just as 1080p does. In fact, 1080i can reproduce horizontal motion more smoothly than 1080p, since it has twice the field rate at 30 frames per second. Let me recommend Charles Poynton's excellent book "Digital Video and HDTV".
- My original point was - 1080i has a pixel resolution of 1920x1080. I think that a graphic that shows 1080i having a resolution of 1440 x 720... roughly... is misleading. Referring to a separate article which discusses the pros and cons of interlaced versus progressive scan video is appropriate. Subjectively changing the resolution of the capture and display technology isn't helpful, it's misleading. The point of an article in an encyclopedia is to be factual, not subjective. If you have any sources for your claim that 1080i has a real resolution that is something other than 1920x1080, please reference them. Tvaughan1 02:57, 1 October 2006 (UTC)
- "There is absolutely no doubt among television engineers and viewers that interlaced systems deliver higher perceived spatial resolution: the interlaced systems tested at the ATTC demonstrated substantially better resolution than the progressive systems. It is for this reason that the broadcasting organizations such as the National Association of Broadcasters, the Association of Maximum Service Television, Cable Labs, and so on have been issuing press releases and position papers declaring their view that, at introduction, advanced television must be interlaced."
From [1] Tvaughan1 03:05, 1 October 2006 (UTC)
-
- I'm afraid you are still bringing irrelevant side issues into the discussion. The image says nothing about frame or field rates - those are totally beside the point here. Of course 60i is going to have clearer motion than 30p, but 720/60p is broadcastable now, and Sony's PS3 outputs 1080p with 60 frames per second. (TVs that can display this natively only just came out this year.) The limit against 1080/60p only applies to current over-the-air MPEG-2 broadcasts, not to HD disks or any future standard that might have more bandwidth or use MPEG-4.
-
- The article you link to is comparing 720p with 1080i - THAT is why they say that the interlaced video was sharper, and the current image reflects that. (The 1080i area is clearly larger) But there is a HUGE difference between the 1080p I've seen in a movie theatre and 1080p displays showing 1080i broadcasts.
-
- Please stop insisting that this chart is about numbers or pixels. It is about perceivable sharpness. Algr 07:45, 1 October 2006 (UTC)
Algr - you have not cited one reference for your claim that 1080i has a resolution that is less than 1920x1080. A number of Wikipedians have pointed out that your understanding of resolution and interlace is lacking, and yet you insist you are right, and we are all wrong. Your pulsing GIF "illustration" doesn't illustrate anything about how interlace works, or how analog to digital sampling works. It's so far off the mark... it isn't even close. I'm sure the Advanced Television Systems Committee had no idea what they were doing when they developed the 1080i ATSC specification. Anyhow... I think you need to cite some sources that justify your conclusions. Not "I've seen in a movie theatre" or vague references to "trade magazines"... real articles about the resolution of 1080i.
You need to realize that video professionals use reference monitors to evaluate video quality. While progressive display technologies are improving all the time, the best video quality still comes from CRT monitors... interlaced video displays. Professionals don't view 1080i broadcasts on 1080p monitors, only to conclude that 1080i is inferior to 1080p. In any case, your personal experience isn't justification enough for Wikipedia (nor is mine). We need to cite our sources. So far, I have seen NO justification whatsoever for a graphic that shows 1080i having a resolution less than 1920x1080. Tvaughan1 16:51, 1 October 2006 (UTC)
References:
Tvaughan1, if you want to change this, then you are the one who has to provide references saying that interlace and progressive don't make any difference to sharpness. You can't just walk into an article and start threatening to delete other people's work unless they do your research for you. (As I pointed out above, your lone reference above is out of context and misinterprets what they said. It is also to an article that is 13 years old, and clearly reflects only one side of the debate that went on then.) Nevertheless here are some references:
http://members.aol.com/ajaynejr/kell.htm - See interlace factor.
-
- Who is Albert Jayne Jr.? Maybe I should have been more clear... you need to cite authoritative references. My reference to Charles Poynton's article was not out of context at all... did you actually read it? It's spot on. Tvaughan1 23:42, 1 October 2006 (UTC)
http://www.evansassoc.com/lib/Atsc-dtv.html - ATSC's own chart. Note that both "Limiting Resolution Horiz. Vert." and "Displayed Pixels" list lower values for interlace:
- 1080p=972
- 1080i=756
- 720p= 648
-
- This appears to be Evans Associates' chart... not ATSC's chart. HDTV signals are defined by SMPTE 274M... take a look http://www.smpte.org/smpte_store/standards/pdf/s274m.pdf Note table 1 on page 2. The ATSC Digital Television Spec is here... http://www.atsc.org/standards/a_53e-with-Amend-1-and-2.pdf, but it references SMPTE 274M for the signal format (ATSC is concerned more with transmission of HDTV channels over the air or through cable systems). Both of these specs clearly show that 1080i has a resolution of 1920x1080. Tvaughan1 23:42, 1 October 2006 (UTC)
These numbers clearly place 1080i closer to 720p than to 1080p. It also states "Kell Factor used for interlace is 0.7 and for progressive is 0.9". This is direct from the people who designed these systems.
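For what it's worth, those three "limiting resolution" figures are simply the nominal line counts multiplied by the quoted factors; whether 0.7 and 0.9 are the right factors is exactly what is in dispute here:
<pre>
# Reproducing the quoted figures from the stated factors.
KELL_PROGRESSIVE = 0.9
KELL_INTERLACED  = 0.7

print(1080 * KELL_PROGRESSIVE)  # 972.0 -> quoted for 1080p
print(1080 * KELL_INTERLACED)   # 756.0 -> quoted for 1080i
print(720  * KELL_PROGRESSIVE)  # 648.0 -> quoted for 720p
</pre>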
-
- Kell factor is something different, having to do with the Gaussian spot profile of a CRT monitor. Interlace factor is another issue, having to do only with vertical resolution... not horizontal resolution. But while the vertical resolution of interlaced video signals is low-pass filtered in order to avoid aliasing, the net effect of interlace is to actually improve spatial resolution (while reducing flicker). This is because video is concerned with portraying objects in motion, and interlaced video updates the position of objects twice as often as progressive scan video at the same frame rate. Tvaughan1 23:42, 1 October 2006 (UTC)
And finally, if interlace had the same detail as progressive at the same line count, then why would progressive outputs on DVD players be so popular? Why did the PC industry spend millions to replace NTSC 480i with more expensive and incompatible VGA 480p monitors? Algr 19:00, 1 October 2006 (UTC)
-
- As I explained above, progressive scan output from DVD players is provided to allow consumers to feed a progressive scan signal to suitably equipped monitors. Since movies are shot on film, and since film is progressive scan, it is best to view the movie as a progressive scan signal, displaying 24 frames per second just as the movie was shot and edited. NTSC (region 1) DVDs can only store video as an interlaced video signal, 29.97 frames per second. A process called telecine is used to convert film frames to interlaced video fields, prior to MPEG encoding. Progressive scan DVD players perform inverse telecine to recover the original 24 frames per second from the 60 interlaced fields per second. There is no improvement in resolution that results from this. It is only done to avoid a motion artifact called "judder" which results from the telecine process. Tvaughan1 00:37, 2 October 2006 (UTC)
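A rough sketch of the 2:3 pulldown pattern being described (simplified: it ignores top/bottom field parity, and the exact cadence varies in practice):
<pre>
# 2:3 ("3:2") pulldown: 4 film frames become 10 video fields, turning
# 24 frames/s into 60 fields/s. Inverse telecine regroups the fields to
# recover the original frames; no resolution is gained or lost.
def pulldown_2_3(frames):
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

print(pulldown_2_3(["A", "B", "C", "D"]))
# ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D'] - 10 fields per 4 frames
</pre>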
-
- When PCs were first developed the displays were designed to display text characters only: not still pictures, and not video. Since the PC monitor is just a few feet away from the PC, there isn't the same bandwidth limitation that television engineers had to consider for broadcast video standards. Since there was no motion to be concerned with, progressive scan displays made more sense than interlaced video displays. Tvaughan1 00:37, 2 October 2006 (UTC)
______________________
"A number of Wikipedians have pointed out that your understanding of resolution and interlace is lacking, and yet you insist you are right, and we are all wrong."
You need to read more carefully. The last comment critical of my animation was posted in March, by 221.28.55.68; I addressed all other concerns. The comments by Finest1 refer to Image:Progressive vs interlace.gif, which I also criticized, and the creator ultimately deleted. You are on your own with your criticism. Algr 19:45, 1 October 2006 (UTC)
-
- On my talk page User_Talk:Tvaughan1 you can see another comment from another Wikipedian about this resolution chart.
Compromise
Let's limit the chart to the raw resolution of each format (not dependent on the content, the de-interlacer, the screen, or the perceived vertical resolution loss, which cannot and should not be quantified); anything else is POV. In reality, by pixel count, broadcast 1080i is 1440x540 per field, 1440x1080 per frame (the PAR stretches it to 1920x1080). Of the two, I feel the latter should be used if a comparison is to be made to progressive frames. Noclip 14:05, 4 October 2006 (UTC)
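To illustrate the anamorphic case Noclip describes, a small sketch of how a 1440-sample raster is displayed 1920 wide via the pixel aspect ratio (numbers only; whether broadcast 1080i is actually stored this way is the point under dispute):
<pre>
# 1440 stored samples with a 4:3 pixel aspect ratio fill a 16:9 picture
# that is nominally 1920 square pixels wide.
stored_width = 1440
pixel_aspect_ratio = 4 / 3              # non-square ("anamorphic") pixels
displayed_width = stored_width * pixel_aspect_ratio
print(displayed_width)                  # 1920.0
print(displayed_width / 1080)           # 1.777... i.e. 16:9
</pre>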
- I agree. Pixel count is what should be shown in this graphic. Resolution is a somewhat subjective quantity, described as the number of white and black lines that can be perceived after broadcast and display. For instance, the maximum theoretical horizontal resolution of 1080i or 1080p is 960 lines (if every alternate pixel was white, then black, then white, etc.). Theoretical resolutions are rarely achieved in practice. There are a number of factors that can affect resolution, including the frequency response of analog components, digital compression, and other real-world factors that must be considered and they can be discussed in separate articles or sections. These factors shouldn't be randomly "factored in" to the pixel count of the broadcast standard. However, broadcast 1080i has a pixel count of 1920x1080 pixels per frame. Check the SMPTE 274M standard. Why would 1080i have decreased horizontal pixel resolution versus progressive scan? Answer: it doesn't. Tvaughan1 14:36, 4 October 2006 (UTC)
I disagree. You don't need a chart to explain to people that the number 1080 is larger than 720. The problem here is that Tvaughan1 is insisting that one pixel is always the same as another - but there is overwhelming evidence that this isn't true. The very existence of progressive scan formats such as VGA and 720p is due to the losses inherent to interlace. Consider these images:
Both pictures are 250x250 pixels. Does that prove that both are equally sharp? Of course not - but that is precisely what you are telling people if you just cite pixel count, and leave issues like Kell factor and twitter buried in the fine print. No one cares about pixels - they care about what they can see, so perceivable sharpness is what the chart is designed to display. Algr 20:36, 4 October 2006 (UTC)
- No, I'm insisting that 1080i has a pixel resolution of 1920 x 1080, the same as 1080p. Algr - you can't keep "making up" illustrations that aren't valid. Wikipedia is not a source for original work, and you keep bringing new and interesting original work here... but it isn't accurately representing the subject. Where on earth do you get a horizontal resolution of 1440 for 1080i? Did you read the HDTV spec that defines 1080i? You can't just arbitrarily change the numbers.
- OK... in the interest of trying to get somewhere, let me say that I agree that 1080p30 should have (theoretically... the devil is always in the details of the implementation) more perceived vertical resolution than 1080i60. However... 1080i and 1080p have identical horizontal resolution, and identical pixel resolution. 1080i has twice the field rate of equivalent bandwidth 1080p... fewer motion artifacts (well... no motion artifacts). I think I'm going to go watch one of dozens of available 1080i broadcasts right now on my 1080i CRT HDTV.
- Look... we could argue about perceived resolution all year long and into next year. What "Resolution_chart.svg" illustrates is not the "perceived resolution" of these video formats... it is the pixel resolution. Perceived lines of resolution are less than half the pixel resolution (or "number of picture elements") for ALL of these formats. What is not debatable is that 1080i has a pixel resolution of 1920 x 1080, the same as 1080p. Any other number you "create" for 1080i resolution is arbitrary. There are full 1920x1080 pixel 1080i cameras and recording systems (expensive, but they exist and are in use today). Just because some consumer and professional formats capture 1080i (or 1080p) video using 1440x1080 pixel sensors doesn't mean that 1080i isn't 1920x1080. Tvaughan1 23:18, 4 October 2006 (UTC)
-
- Digital Television Magazine: http://www.digitaltelevision.com/future/future7.shtml
- Relevant quotes:
- What they don't tell you is that 1080 interlaced lines deliver only 540 lines of dynamic resolution. That's the measure of vertical detail when objects are moving. HDTV IS about the delivery of moving pictures...Right?
- Gaggioni then demonstrated what happens when uncompressed 1920 x 1035 source is encoded for emission at 19.3 Mbps in a split screen with filtered HDCam source encoded for emission at 1440 x 1035 resolution. The difference was highly visible, as the uncompressed full resolution source caused the encoder to produce high levels of noise and occasional blocking artifacts. The "lower resolution" HDCam source delivered a significantly higher quality picture!
- What happens when 1280 x 720 progressive scan images are encoded for emission? Less optical pre-filtering is required in the camera--we can have equivalent horizontal and vertical detail, as it is not necessary to limit the vertical resolution to prevent interlace artifacts.
- In my opinion first generation 720P equipment already produces higher quality images than 1080i. But 1080P is even better, especially if the images are acquired on film then digitized via a high quality telecine. The critical factor is the acquisition of frames not interlaced fields.
And there you have it. Clear reasons for reduced vertical and horizontal detail from 1080i, just like in my example above. Please stop insisting that my chart means what you think it should. It says in the first line of the description that it represents detail, not pixel count. Wikipedia has another chart that displays just pixels: [2] Algr 05:27, 5 October 2006 (UTC)
- I enjoyed the article... thanks for the link. I have a great deal of respect for Craig Birkmaier... he knows what he is talking about. Most of the article discusses the idea that it would be great to get the world to standardize on a single image format. It points out how nasty it can be when you need to convert between one format and another. A particularly relevant quote is...
-
- "As we have seen for the past decade, it is possible to justify the continued use of interlace by demonstrating how a well-optimized end-to-end system performs. If we fine-tune every component of the system for a single format, the results can be quite satisfying--we have fifty years of experience with this approach tweaking on PAL, NTSC and more recently ITU-R601. Interlace works reasonably well when the camera, image processing, image encoding and display are all tuned for a single format."
-
- Now you are ready to discuss the crux of the matter... what do the horizontal and vertical numbers on your image represent? You say it represents detail... not pixel count? OK... fine. So, 1080p has 1920x1080 what? Lines of resolution? No... definitely not. Pixels... yes... but then 1080i has 1920x1080 pixels. How do you measure detail? Most video experts use "lines of resolution", where experts view test video on reference monitors. Both vertical and horizontal resolution will be somewhat less than half the pixel resolution for any format. Cite your sources for these measurements. And be sure that you measure 1080p and 1080i using the same criteria. Tvaughan1 14:10, 5 October 2006 (UTC)
- - The numbers 1080i and 1080p are not functioning as numbers in this case, but names. There is nothing else I can call 1080i and still have people know what signal mode I am talking about. The text makes it clear that the size represents visible sharpness, not pixel count. (Have you been asking me to remove the numbers on the top all this time? If so, you need to work on your communication skills.)
- - The other questions you've asked are already answered in my above quotes, which you have ignored.
- - Read your own quote, and what follows, and you will see that the author is saying the opposite of what you claim he is. You have said that an interlaced signal cannot be presented properly on a progressive display. He says that the success of interlace is based on "a well-optimized end-to-end system" without format conversions. He then goes on to show that in the future, format conversions will be inevitable, and progressive has a huge advantage there. Now go to your local appliance store and TRY to find an interlaced HD display. They practically don't exist! Everything is DLP, LCD and Plasma. Everything is progressive! A big high end store near me has over 100 HD displays, and only three of them are interlaced - they are small direct view CRTs. Even Best Buy's web site has only two CRT sets, compared to 80 hits for just LCD: [3] [4]
- -A movie director would be foolish to monitor his program on screens of a type that no one is using. It is best to monitor how people actually see your program.
- -"Both vertical and horizontal resolution will be somewhat less than half the pixel resolution for any format." - Wrong again! DVD's are always advertised as having "500 lines of resolution." What is their horizontal pixel count? 720 pixels. Is 500 "half of" 720? Measuring optical resolution is quite complex, and I'd hoped to sidestep the issue with the graphic by representing a typical result visually rather then getting bogged down in procedural issues. The sources I've provided would easily justify making the 1080i box smaller then the 720p box, but that isn't what people seem to be observing. Algr 18:46, 5 October 2006 (UTC)
I stand by my statement..."Both vertical and horizontal resolution will be somewhat less than half the pixel resolution for any format." This is another topic that is rife with confusion.
I am, as a matter of fact, a DVD expert. I've been involved with DVD encoding, mastering, and manufacturing from the very beginning of the format. I managed the production of some of the first DVDs made in the US. I also have a B.S. in Electrical and Computer Engineering, specializing in Digital Signal Processing. DVDs do not have and can not have "500 lines of resolution". This would violate the Nyquist Theorem. Someone may have come up with some marketing materials that stated this, but if you understand video, you would know that this is not true, nor is it possible to accomplish with a horizontal pixel resolution of 720.
TV resolution is defined by the number of black and white lines that can be clearly resolved per picture height (TV lines per height, or TVL/PH, or simply "TV lines"). This is also known as cycles per height, or CPH, since in the analog domain this black and white signal is a sine wave where the trough is black and the peak is white (luma only). To measure horizontal resolution you display vertical stripes, and to measure vertical resolution you display horizontal stripes. So, a 4:3 aspect ratio signal (NTSC DVD) that has 720 horizontal pixels has 540 pixels per picture height. The maximum theoretical horizontal resolution of a digital video system is one half of the pixel resolution. In other words, every other pixel would be white, then black, and so on, creating 360 vertical stripes. Per picture height, the theoretical horizontal resolution of DVD is 360 * 3/4 = 270 lines. In the real world, 480i has a horizontal resolution of about 221 CPH (according to Poynton).
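A compact restatement of that arithmetic, following the paragraph's own convention of counting cycles (line pairs) per picture height; this is the theoretical Nyquist ceiling only, and real systems resolve less:
<pre>
# Maximum cycles per picture height from a horizontal sample count,
# using 2 samples per cycle (Nyquist) and normalizing by the aspect ratio.
def max_cycles_per_picture_height(h_samples, aspect_w, aspect_h):
    cycles_per_width = h_samples / 2
    return cycles_per_width * aspect_h / aspect_w

print(max_cycles_per_picture_height(720, 4, 3))     # 270.0 (NTSC DVD, 4:3)
print(max_cycles_per_picture_height(1920, 16, 9))   # 540.0 (1080-line HD, 16:9)
</pre>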
In other words, to distinguish a white line, it needs to have black on either side. I've never tried encoding such an image, but I doubt that any MPEG-2 encoder could create a DVD-Video compliant MPEG video stream that could display 360 lines of resolution. I might have to try this some time... but of course Wikipedia is not a place for original research.
Now, I realize that you will be able to find dozens of sources on the Internet stating that DVD has 500 lines of resolution (or more!). But I'm 100% sure that this is wrong, and it can be demonstrated physically and mathematically. Again... I would highly recommend Charles Poynton's excellent book, "Digital Video and HDTV: Algorithms and Interfaces" [5], which I referred to for this response (chapter 7 - Resolution, pages 65 - 74). Tvaughan1 20:27, 5 October 2006 (UTC)
This image is not named Detail_chart.svg. Your example images both have the same number of pixels despite differing levels of detail. For purposes of pixel count those images are the same size. The only NPOV way to present this data is with raw pixel counts, period. It doesn't matter whether they are faked in an upconvert or fully utilized from a 4K scan, the number of horizontal pixels in a 1080i stream is either 1440 or 1920 and the number of vertical pixels is 1080. Noclip 21:18, 5 October 2006 (UTC)
- It isn't named PixelCount.jpg either, it is called Resolution chart.svg. Analog NTSC has no pixel structure, and yet it has a resolution. BTW, are you arguing against the numbers, or the size of the boxes? Algr 21:42, 5 October 2006 (UTC)
- I am saying that the resolutions should be measured objectively, not subjectively. The format is not called 810i for a reason. For encyclopedic purposes, this chart should only deal with theoretical perfect conditions. 1080i has 1080 lines per field couple, that's what the specification says, whether the lines are real or fake doesn't matter. As for optical resolution, it should never be used for these purposes in the first place. Noclip 22:27, 5 October 2006 (UTC)
- It IS measured objectively here: [6] 1080p = 972 visible, 1080i = 756 visible, 720p = 648 visible. And it is explained here: [7] Many people consider even 720p to be superior to 1080i. THAT is POV, but absolutely no one other than Tvaughan1 is saying that 1080i is as sharp as 1080p. No one is claiming that 1080i has 810 pixels. But a screen with 810 progressive pixels would look about as sharp as one with 1080 interlaced ones. Here is another reference, which says "An interlace image is two thirds the vertical resolution (measured on a resolution chart) of an equal number of proscan lines." [8] That's three references. What does Tvaughan1 have that says that 1080i is as sharp as 1080p? I've been saying all along - pixel count alone does NOT tell you resolution. Algr 08:28, 6 October 2006 (UTC)
Horizontal Resolution - No Original Research
...clearly shows the "horizontal resolution" of 1080p as 1920x1080. 1080p has a pixel resolution of 1920x1080. So does 1080i... as per the SMPTE 274M specification www.smpte.org/smpte_store/standards/pdf/s274m.pdf.
The image shows 1080i as having a horizontal pixel resolution of 1440. This is plainly misleading, and incorrect. It is the original work of User:Algr, derived from other works and theories. This violates WP:NOR. Let me quote from Wikipedia:No original research...
- "Wikipedia is not the place for original research. Citing sources and avoiding original research are inextricably linked: the only way to demonstrate that you are not doing original research is to cite reliable sources which provide information that is directly related to the topic of the article and to adhere to what those sources say."
- "Wikipedia is not the place to insert your own opinions, experiences, or arguments — all editors must follow our no original research policy and strive for accuracy.
- Wikipedia has a neutral point of view, which means we strive for articles that advocate no single point of view. Sometimes this requires representing multiple points of view; presenting each point of view accurately; providing context for any given point of view, so that readers understand whose view the point represents; and presenting no one point of view as "the truth" or "the best view". It means citing verifiable, authoritative sources whenever possible, especially on controversial topics."
While Algr can cite theories for his ideas, he can cite no source for the specific details of the image (the dimensions of the rectangles and the specific numbers assigned to the "resolution" of each standard). The SMPTE 274M 1920x1080 HDTV specification is authoritative and verifiable.
There are many factors that can affect the perceived resolution of a video signal.
- The quality of the camera lens.
- The quality of the camera's sensor.
- Motion blur during capture.
- Noise during capture (inadequate lighting, small or poor quality sensor)
- Compression used to store the signal
- Noise or signal loss during editing or format conversion.
- Lossy compression when encoding for transmission
- Signal loss during transmission
- Motion artifacts due to conversion between interlaced and progressive scan (or vice versa)
- Resolution scaling to match the monitor
- Quality of the imaging / display design and components
- Ambient lighting conditions when viewing the display
... just to name a few. All of these factors affect all video capture and display formats. None of these factors should be arbitrarily "factored in" to the native pixel resolution of the format in a Wikipedia article or image. Perhaps a separate article on the pros and cons of interlace versus progressive scan would be appropriate.
The same is true for the vertical resolution of each display format. No arbitrary dimensions or numbers should be used. Tvaughan1 19:39, 5 October 2006 (UTC)
Too much information in one illustration, and two different bases
Hi everyone. Algr asked for help on my discussion page, and I would like to help if I can. First of all, I am German, and I hope I will understand everything on this page; otherwise please let me know and be indulgent with me ;). Concerning the picture: the first time I saw it, I liked it, and I checked whether I could use it in the German Wikipedia. But I could not work out its point, its basic essence. My main concern was about how broadcast 1080i is shown in it (which I also explained on Tvaughan1's discussion page). For these reasons I did not use it in the German Wikipedia. I have read the facts here and on my discussion page, and I think the reason I did not understand the picture is that it contains too many facts and a mixture of "bases". One basis is the count of pixels, which is defined in ISO, ITU, or wherever; the other is the result of processing. The standards are fixed, that is clear, but the results on our own TV sets and screens depend on a lot of things, some of which we can influence and some of which we cannot. For example, we cannot influence the source, so if a channel uses interlace, we can only hope that our deinterlacer can handle it and display a good result. The better the source (more resolution, progressive, no compression, etc.), the better the result. In my opinion, your dispute for or against this picture could be solved by rethinking it and making clear to yourselves WHAT message you want to convey to the person seeing it. Do you want to make a comparison of "resolution standards" or a comparison of "results of methods"? Do not mix the two; I think that would lead the viewer to a wrong conclusion. Split the picture, use just the pure essence for each version, and then find a good place in the article and put both back in. And leave the "experts" aside. I know that some channels or some hardware show a crappy HD image, but that is not the fault of the standard; it is the fault of the channel with its low bitrate, or of the cost-efficient manufacturer of our screens ;). If I am needed, feel free to ask me anytime! But you have to settle it, because the en.wiki is not my "home wiki", although I like this one and its Wikipedians a lot. Greets, --Andreas -horn- Hornig 06:50, 9 October 2006 (UTC)
- Hi Andreas. I agree with you, the resolution chart previously showed different resolutions using different bases. I agree that the basis should be pixel resolution for this chart, and that other discussions of issues that affect resolution are appropriate for other articles. A new image is being used at the moment, with the pixel resolution of different television signal standards as the only basis for the image. The discussion continues on Template talk:TV resolution. Tvaughan1 17:28, 9 October 2006 (UTC)