Talk:Frame rate
Accuracy
The flicker fusion section is very good and accurate. It conflicts, though, with an inaccurate introductory paragraph for the topic:
The top says, "Frame rate...is the measurement of how quickly an imaging device can produce unique consecutive images called frames. The term applies equally well to computer graphics, video cameras, film cameras, and motion capture systems." We need to change "equally well" and find truthful wording that allows for the unusual and significant reality of interlaced video, in which the term "frame" means something different from what it does in film.
Here's the issue. It has everything to do with the "look" of video versus film, and it is important information for conversions between film and video. Traditional interlaced video (NTSC, PAL, and SECAM, all the major formats still in use, though HD is taking over) has "frames". However, each "frame" actually contains TWO unique consecutive images, called fields. So the definition above is not true. For instance, PAL has a "frame" rate of 25 "frames" per second, BUT it has 50 unique images every second. Video is very strange: thanks to interlacing, full resolution only resolves on the unmoving parts of images over the span of two fields. So full resolution sometimes resolves 25 times per second (on unmoving or overlapping parts), but movement always resolves 50 times per second. Strange, huh? That's why interlaced video looks so hyper-real compared to film.
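The frame-versus-field distinction above boils down to a factor of two. A minimal sketch (the `field_rate` helper is my own name, not anything from the article):

```python
# In interlaced video, each frame carries two fields, so the number
# of unique images per second (the field rate) is twice the nominal
# frame rate.
def field_rate(frame_rate: float) -> float:
    """Return the field (unique-image) rate for interlaced video."""
    return frame_rate * 2

print(field_rate(25))     # PAL: 25 frames/s -> 50 fields/s
print(field_rate(29.97))  # NTSC: ~29.97 frames/s -> ~59.94 fields/s
```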
Film movement is abstract, and often considered pleasing for it, because it sits on one cognitive perceptual edge, between discrete images and animation, at 15 to 25 images every second; video is on the other cognitive edge, just beyond the brain's ability to distinguish simulated movement from real-world movement, at 50 to 60 images every second. So we have to find a way to address this in this topic. The following statement in the flicker fusion section hints at the basic issue: "...since a conventional television camera will scan the scene again for each field, in many circumstances it may be useful to think of the frame rate as being equal to the field rate."
Cheers! worldpoop.com 20:05, 13 November 2005 (UTC)
The recent addition
An anonymous editor just added a bunch of good information to the article, but it really could use tighter integration with what was already here rather than simply being tacked on at the bottom. Some of the new information is redundant, and all of it could be better "factored" into the article. Volunteers? If not, I'll eventually get to it.
Atlant 13:01, 23 Apr 2005 (UTC)
The whole thing was pasted in from whatis.com. I'll revert it. You can still add any facts from there to the current article. --Dtcdthingy 14:46, 23 Apr 2005 (UTC)
Frame rate conversion
How difficult is it to convert between different frame rates? The article should answer that.
In the particular case of increasing frame rates by 2.5 times, we have 3:2 pulldown. And also the reverse of that. But how about going from 25 or 50 fps to 30 or 60 fps, or the other way around? It's often done, of course, but how?
-- ABostrom 20:24, September 5, 2005 (UTC)
- That whole topic area is called Standards conversion, and it encompasses scaling video between resolutions as well. There are lots of different ways to do it: merging adjacent frames; dropping and doubling frames; intelligent motion compensation; etc. It's a big topic. --Dtcdthingy 22:01, 5 September 2005 (UTC)
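The 3:2 pulldown mentioned above can be sketched in a few lines: each film frame is alternately held for 3 and then 2 video fields, so 24 film frames fill exactly 60 fields (one second of 60i video). This is my own rough illustration, not a production conversion routine:

```python
# 3:2 pulldown sketch: hold each 24 fps film frame for 3 fields,
# then 2 fields, alternating, to produce 60 fields per second.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        hold = 3 if i % 2 == 0 else 2  # 3, 2, 3, 2, ...
        fields.extend([frame] * hold)
    return fields

fields = pulldown_32(list(range(24)))  # one second of 24 fps film
print(len(fields))                     # 60 fields = one second of 60i
```

Going between 25/50 fps and 30/60 fps has no such tidy integer pattern, which is why the blending and motion-compensation methods listed above exist.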
24.976 and 29.97 vs 24 and 30
I have done a lot of work with interlaced video, and I am pretty sure that the correct frame rates for interlaced TV are 24.976 for PAL and 29.97 for NTSC. The numbers 24 and 30 are used for simplicity. I may be wrong about this, please correct me if this is so. If I am not corrected, and if nobody has done it, I will change it tomorrow. HighInBC 00:27, 19 March 2006 (UTC)
- Afterthought: since this technical detail is not directly related to the subject matter, perhaps the word "approximately" should be put in front of it instead of using awkward numbers. HighInBC 00:29, 19 March 2006 (UTC)
- Well, no. Wikipedia should use the actual data at least once; then the rest of the article could simply state "approx 24". --Procrastinating@talk2me 11:34, 1 June 2006 (UTC)
- (Presumably, you mean 24.976 or some such. Atlant 12:39, 1 June 2006 (UTC))
- Yes, I meant 24.976. oops. HighInBC 12:58, 1 June 2006 (UTC)
- Though 60i and 50i are accurate as well (I think) HighInBC 13:00, 1 June 2006 (UTC)
- Well, this is further confused by the 24 vs 25 issue. PAL MPEG-2 (PAL DVD) is 25 fps, period. But NTSC MPEG-2 is either 29.97 or 23.976 (with 3:2 pulldown), i.e. about 24. The 3:2 pulldown allows a match with cinema standards and simplifies DVD mastering (I assume). The ~30 (vs 25) is originally derived from the differing power standards in different countries (e.g. 50 Hz vs 60 Hz mains current). NTSC used to be exactly 60i (vs the current approx 59.94, or, to be exact, 60/1.001, which is 59.94059...), half of which is (about) 29.97. But all this stuff is written about extensively elsewhere on Wikipedia, so I think this page should (a) be technically correct and (b) have some suitable links to more details. I will adjust accordingly. --Psm 19:50, 1 August 2006 (UTC)
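The awkward NTSC numbers in this thread all come from one operation, dividing the nominal rate by 1.001. A quick check of the arithmetic from the comment above:

```python
# NTSC "drop-frame" rates: nominal rate divided by 1.001.
for nominal in (60, 30, 24):
    actual = nominal / 1.001
    print(f"{nominal} -> {actual:.5f}")
# 60 -> 59.94006 (field rate), 30 -> 29.97003, 24 -> 23.97602
```

PAL needs no such factor: it is exactly 25 fps (50 fields/s).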
Another explanation for "choppiness"
Should something like this be appended to (or replace) the discussion of "choppiness?"
Choppiness can also occur if the rendering rate is not the same as the monitor's frame rate. For example, assume the video card is redrawing a scene depicting a smoothly moving object 65 times per second, and the monitor's refresh rate is 60Hz. Every 13th frame will be dropped, resulting in the object appearing to jerk forward 5 times per second. Limiting the redraw rate to the refresh rate helps to eliminate this source of "chop."
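The dropped-frame arithmetic in that paragraph can be stated directly (a sketch of the 65 fps on 60 Hz example, with variable names of my own choosing):

```python
# Rendering 65 frames/s onto a 60 Hz display: 5 rendered frames per
# second never reach the screen, i.e. every 13th frame is dropped.
render_rate, refresh_rate = 65, 60
dropped_per_second = render_rate - refresh_rate    # 5
drop_interval = render_rate // dropped_per_second  # every 13th frame
print(dropped_per_second, drop_interval)           # 5 13
```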
However, the graphics card redraw rate will fall below the monitor's refresh rate if the scene is complex enough, raising the possibility of choppy motion again. This is where double buffering and triple buffering come into play. Display output is delayed by one or more frames, so the graphics subsystem can draw one or more frames in advance; if extra time is needed to draw a frame, it is still ready when it should be displayed. The rendering software can then skip one or more frames and render to depict a time 2 (or more) frames later. (It's OK to leave a frame on the display for two frame periods as long as the new placement is correct; this is quite different from the 65 Hz rendering example.) In 3D games or simulations, this introduces some "lag" because user input is not applied instantly, but one or more frames later. As long as this lag is fairly constant and relatively small, the user can adapt to it without really noticing. Shyland 09:49, 22 March 2006 (UTC)
You can add the jerk part in a few sentences, and also that triple buffering means 3 frames of lag from input. Keep it short; they're just details.
- This is great, put it in as a new section ! :) Procrastinating@talk2me
- Triple buffering doesn't mean 3 frames of lag. It means 2 frames (1/30 sec at 60Hz, less at higher refresh rates.) One buffer holds the frame currently shown; it's the other two that hold advance frames and produce delay. Furthermore if your mouse/button input is ALWAYS 1/30 second(!) behind, you can easily adapt to such a small delay. It's unpredictable lag (i.e. not using double/triple buffering) that screws up your timing. --Shyland 19:50, 10 October 2006 (UTC)
Just wondering.
What's the frame rate of reality? What will happen when games get a higher frame rate than real life? --Planetary 03:28, 12 September 2006 (UTC)
- Your brain isn't a synchronous computer; it has no central clock from which to measure its "frame rate".
- Atlant 13:57, 12 September 2006 (UTC)
- Thanks for the info. I wasn't expecting an answer like that. :)--Planetary 04:04, 26 September 2006 (UTC)
- Fun idea...Reality isn't a synchronous computer either. It has no framerate; or infinite framerate, if you like. Frames are just our crude way of approximating reality. Higher frame rates only get closer to simulating reality's utter smoothness. Above a certain rate it stops mattering because you can't see the difference. At that point the display system exceeds the capabilities of the input system (your eyes and brain), so there's no real point in any further frame-rate improvements. --Shyland 20:33, 10 October 2006 (UTC)
- Yeah, it would be funny if there was a maximum frame rate of real life. Sort of a stand-up comedy time gag. --Planetary 00:04, 11 October 2006 (UTC)
- Atlant 00:41, 11 October 2006 (UTC)
- Interesting. Thanks for the link. Looks like this tale hasn't been told...--Planetary 00:48, 11 October 2006 (UTC)
[edit] "Extra" frames are not always dropped
The solution to this problem would be to interpolate the extra frames together in the back-buffer (field multisampling), or simulate the motion blur seen by the human eye in the rendering engine. Currently most video cards can only output a maximum frame rate equal to the refresh rate of the monitor. All extra frames are dropped.
I've heard of triple buffering, but with double buffering usually at least some of each rendered frame makes it to the screen: with vsync, nothing but potential processing power goes to waste; without it, a new frame can replace one that wasn't completely sent to the monitor yet, so parts of both are shown (tearing). I think the last claim should be either adjusted or removed. --62.194.128.232 02:33, 26 September 2006 (UTC)