Talk:Flicker fusion threshold
Request for clarification
In the article it says:
-
- "In actual practice, movies are recorded at 24 frames per second, and TV cameras operate at 25 or 30 frames per second, depending on the TV system used. Even though motion may seem to be continuous at 25 or 30 fps, the brightness may still seem to flicker objectionably. By showing each frame twice in cinema projection (48 Hz), and using interlace in television (50 or 60 Hz), a reasonable margin for error or unusual viewing conditions is achieved in minimising subjective flicker effects."
Questions:
- If the human threshold is perceived to be 16, why would someone use a 24 frame rate?
- What is the difference between chemical film and digital imagery that makes it leap from 24 to 30?
- If you show each of the 24 frames twice (each frame for the same duration as normal 24 fps), then you are actually showing the film more slowly, at half the speed (each frame being static on the screen for twice the time). This makes no sense.
Thank you. --Procrastinating@talk2me 14:33, 30 May 2006 (UTC)
Answers: In response to your last question, you must understand the concept of a shutter. A cinema film projector shutter operates at 48 Hz, meaning an image flashes on screen 48 times in a second. Since there are 24 frames a second being projected from the film, each frame is flashed onto the screen twice in a row, giving the illusion that there are 48 different frames being projected, even though half the flashes repeat the same frame. It's just to trick the mind. 24 flashes per second actually looks very jerky and flickery; 48 is more aesthetically pleasing.
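For concreteness, here is a minimal sketch of the arithmetic in that answer (the helper function and its name are purely illustrative, not anything from the article):

```python
# Minimal sketch: the film still advances at 24 frames per second, but a
# two-bladed shutter flashes each frame twice, so the light on screen
# flickers at 48 Hz while the motion rate stays at 24 distinct images/s.

def flash_rate(frames_per_second, flashes_per_frame):
    """Light flashes per second seen by the audience (illustrative helper)."""
    return frames_per_second * flashes_per_frame

print(flash_rate(24, 2))  # 48 -> two-bladed shutter
print(flash_rate(24, 3))  # 72 -> three-bladed shutter
```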
In response to your second-to-last question, 30 fps versus 24 fps dates back to the 1930s, when the NTSC (National Television System Committee) standard was being developed. It has to do with the analog bandwidth allowed by the technology of that time. Digital has nothing to do with it.
- To expand on this point, NTSC's 30 fps rate (actually 29.97 fps) was based on the 60 Hz alternating current of North American electricity systems. The AC provides a built-in oscillation that set the field rate for NTSC (60 half-frame fields per second = 30 full frames per second). Europe, which had a 50 Hz AC system, chose 25 fps television for a similar reason.--NWoolridge 13:48, 6 June 2006 (UTC)
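A rough sketch of that relationship (the 1000/1001 adjustment behind the 29.97 figure, made when colour was added to NTSC, is noted here for context; it is not asserted anywhere in this thread):

```python
# Illustrative only: interlaced TV draws one field per AC mains cycle,
# and two interleaved fields make one full frame.

def interlaced_rates(mains_hz):
    fields_per_second = mains_hz               # one field per AC cycle
    frames_per_second = fields_per_second / 2  # two fields per frame
    return fields_per_second, frames_per_second

print(interlaced_rates(60))  # (60, 30.0) -> North American NTSC
print(interlaced_rates(50))  # (50, 25.0) -> European PAL/SECAM

# Colour NTSC later slowed this by a factor of 1000/1001, giving ~29.97 fps.
print(30 * 1000 / 1001)      # 29.97002997...
```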
To the first question: why would one want to make a film at 16 fps? While a human may be able to perceive it as fluid motion, it may be very uncomfortable to watch. Filmmakers wanted the rate low so they would use less film stock and not run out as fast (today, a roll of Panavision film stock lasts only 3 minutes, even at 24 fps; imagine how little it would last at 30 or higher!). But they discovered that as you go below about 24, it becomes uncomfortable to watch. 24 became the standard.
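To put rough numbers on the "running out of film" point: 35 mm film carries 16 frames per foot, so running time falls in direct proportion to the frame rate. This is only a sketch; the 1000 ft magazine length below is a hypothetical example, not a figure from this discussion.

```python
# Rough sketch: how long a film magazine lasts at different frame rates.
# Assumes 35 mm 4-perf film (16 frames per foot); the 1000 ft magazine
# length is a hypothetical example, not a figure from the discussion.

FRAMES_PER_FOOT = 16

def runtime_minutes(magazine_feet, fps):
    total_frames = magazine_feet * FRAMES_PER_FOOT
    return total_frames / fps / 60

for fps in (16, 24, 30, 48):
    print(f"{fps:2d} fps: {runtime_minutes(1000, fps):.1f} min")
# Higher frame rates burn through the same magazine proportionally faster.
```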
- There are two concepts being conflated in the article and we should probably try to disentangle them. One concept is how often a light has to flash before humans can no longer perceive any flickering of the light. That number varies a lot depending upon at least the following factors:
- The particular subject you're testing (you and I probably vary in our sensitivity to flicker)
- How fatigued the subject is (flicker sensitivity goes up as you get more tired)
- No, it goes down.
- How bright the light source is (flicker is seen more easily with brighter sources)
- Whether the subject is viewing the flicker with their central or peripheral vision (peripheral vision is much more sensitive to flicker)
- Many folks can see flicker up to about 75 Hz under typical "viewing the computer monitor" conditions. Monitors tend to be bright, viewed at least partially with your peripheral vision, and used by tired users, so they tend to get the worst of all worlds, leading people to choose refresh (flicker) rates of 75 or even 85 Hz. Television, by comparison, tends to be viewed with your central vision, so 60 Hz was deemed to be "good enough", and there were other good technical reasons to choose 50 Hz in parts of the world with 50 Hz power, even though nearly everyone can perceive the 50 Hz flicker.
- The other concept is how many discrete images must be presented each second before we perceive them as representing continuous motion. This number is a lot lower than flickers/second. Most people see 24 frame/second movies as being "continuous", and many cartoons were only drawn at 12 frames/second. TV, at 25 or 30 frames per second (shown as 50 or 60 half-frame fields), is seen as fine, and if you've ever seen 50 frame/second movies, you'd say "Wow! That's great!"* And even though a movie is only showing you 24 discrete images per second, the projector shutter is arranged so the light source flashes 2 or 3 times per image, leading to a 48 or 72 Hz flicker rate.
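A small side-by-side sketch of those two quantities, using only the figures mentioned in this thread (the labels and layout are illustrative):

```python
# Two different quantities, per the comment above (illustrative figures only):
#   image_rate - distinct pictures per second (governs smoothness of motion)
#   flash_rate - light pulses per second (governs visibility of flicker)

examples = {
    "cartoon drawn on twos, film projection": {"image_rate": 12, "flash_rate": 24 * 2},
    "cinema film, two-bladed shutter":        {"image_rate": 24, "flash_rate": 24 * 2},
    "cinema film, three-bladed shutter":      {"image_rate": 24, "flash_rate": 24 * 3},
    "NTSC television (60 fields/s)":          {"image_rate": 30, "flash_rate": 60},
    "computer monitor at 85 Hz refresh":      {"image_rate": 85, "flash_rate": 85},
}

for name, r in examples.items():
    print(f"{name}: {r['image_rate']} images/s shown as {r['flash_rate']} flashes/s")
```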
- So maybe we need to re-edit the article to make this all clear.
- What does everyone else think?
- Atlant 15:54, 30 May 2006 (UTC)
- *This leads to a third confounding factor. Computer gamers tend to talk about "frame rates" as a bragging point of how studly their computer hardware is. It's perceived as "better" if their computer hardware can generate, say, 150 frames/second rather than 37 frames per second and, up to a point, they're correct: as with cartoons, movies, and TV, more discrete images per second leads you to perceive motion as being smoother. But there's a technological limit: once they exceed the refresh (flicker) rate of their monitors, the additional frames they're generating aren't even displayed, so all those "go fasts" just go to waste. If the monitor is running at 85 Hz, there's no point generating more than 85 discrete images per second because it just means portions of the additional images will be thrown away.
- Even if entire frames are never displayed, there still are reasons why very high framerates are desirable. Most of them could be seen as rooted in suboptimal or outright bad programming, e.g. the various "magic framerates" in id Software engines, or the higher responsiveness at higher framerates. Running at a consistent 150 fps or even 333 fps can give you a competitive edge in some games; it isn't necessarily done for bragging rights.
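A toy model of the "wasted frames" point above; it deliberately ignores tearing, vsync, and the input-latency effects mentioned in the reply, and the function is just an illustration:

```python
# Toy model: a monitor can only show one image per refresh, so any frames
# rendered beyond the refresh rate never reach the screen in full.
# (Ignores tearing, vsync and the latency effects discussed above.)

def displayed_and_wasted(rendered_fps, refresh_hz):
    displayed = min(rendered_fps, refresh_hz)
    wasted = max(0, rendered_fps - refresh_hz)
    return displayed, wasted

print(displayed_and_wasted(37, 85))   # (37, 0)  - every rendered frame is shown
print(displayed_and_wasted(150, 85))  # (85, 65) - 65 frames/s never reach the screen
```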
Great reply, please revise this article
Thank you Atlant and others for your wonderful reply and for putting things in order! It is SO much easier to read the points annotated like this instead of the clumsy, ambiguous text presently in the article. Please do feel free to add your information, and possibly rewrite those sections. I still do not understand
- why cartoons use such a low rate,
- why we are using peripheral vision with computer screens (which we sit closer to),
- why a 48 Hz shutter makes a smoother image, when all it adds to a 24 fps film is more "dark" intervals from the periods when the shutter is closed.
I've put a re-write tag on the article. Procrastinating@talk2me
- Answers:
-
- Cartoons: Remember, every frame needed to be drawn, so doubling the frame rate (roughly) doubled the labor costs, at least in the days prior to computer animation. 12 frames/second was "good enough" to be an acceptable trade-off between the cost and the visual appearance of the finished product. Nowadays, the computers can certainly tween between the hand-drawn animation frames.
-
- By sitting closer to the computer screen, portions of the screen fall into your "peripheral vision" rather than your central (foveal) vision. The parts of the screen that fall into your peripheral vision are seen to flicker more than the parts observed by your central vision. I've always assumed this was because, in your retina, the rod cells (more common at the periphery of your retina) are more sensitive to flicker than the cone cells (more common in your central-vision area).
-
- 48/72 Hz is actually seen as two/three flashes per frame, so it doesn't stimulate your "flicker" response as much, and your brain doesn't really notice that it's the same image two or three times in a row. Meanwhile, modern projectors have enough brightness that they can afford to waste the light that doesn't come through during those periods when the shutter is closed.
- Atlant 14:15, 12 June 2006 (UTC)
- "Meanwhile, modern projectors have enough brightness that they can afford to waste the light that doesn't come through during those periods when the shutter is closed."
-
- Why don't they just leave each frame on for a longer percentage of time? — Omegatron 19:13, 12 December 2006 (UTC)
-
- Nevermind. It's because we're talking about film. :-) — Omegatron 19:14, 12 December 2006 (UTC)
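For what it's worth, a back-of-the-envelope sketch of that light loss: while the shutter is closed the film is being pulled down to the next frame, so the lamp's output during that time never reaches the screen. The 50% open fraction and the lamp figure below are only ballpark assumptions for illustration, not numbers from this thread.

```python
# Back-of-the-envelope sketch: light lost to the closed shutter.
# While the shutter is closed, the film is pulled down to the next frame,
# so that portion of the lamp's output never reaches the screen.
# The 0.5 open fraction is only a ballpark assumption for illustration.

def screen_lumens(lamp_lumens, shutter_open_fraction=0.5):
    return lamp_lumens * shutter_open_fraction

lamp = 10_000.0  # hypothetical lamp output, arbitrary units
print(screen_lumens(lamp))         # 5000.0 -> roughly half the light is "wasted"
print(screen_lumens(lamp, 0.75))   # 7500.0 -> a wider shutter opening wastes less
```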