Frame rate

From Wikipedia, the free encyclopedia

Frame rate, or frame frequency, is the measurement of how quickly an imaging device produces unique consecutive images called frames. The term applies equally to computer graphics, video cameras, film cameras, and motion-capture systems. Frame rate is most often expressed in frames per second (fps) or in hertz (Hz).


Flicker fusion frequency

The frame rate is related to, but not identical to, a physiological concept called the flicker fusion threshold or flicker fusion rate. Light that pulsates below this rate is perceived by humans as flickering; light that pulsates above this rate is perceived as continuous. The exact rate varies depending on the person, their level of fatigue, the brightness of the light source, and the area of the retina being used to observe it. Few people perceive flicker above about 75 hertz.

These rates would be impractical for the actual frame rate of most film mechanisms so the shutter in the projection devices is actually arranged to interrupt the light two or three times for every film frame. In this fashion, the common frame rate of 24 fps (frames per second) produces 48 or 72 pulses of light per second, the latter rate being above the flicker fusion rate for most people most of the time.
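The shutter arithmetic above can be sketched in a few lines (a minimal illustration; the function name is hypothetical, the numbers are those given in the text):

```python
# Flicker rate produced by a multi-bladed projector shutter: each film
# frame is shown two or three times, multiplying the pulse rate seen
# by the audience without changing the film's frame rate.
def flicker_rate(frame_rate, pulses_per_frame):
    """Light pulses per second reaching the screen."""
    return frame_rate * pulses_per_frame

print(flicker_rate(24, 2))  # 48 pulses per second (two-bladed shutter)
print(flicker_rate(24, 3))  # 72 pulses per second (three-bladed shutter)
```

A three-bladed shutter thus pushes 24 fps film above the flicker fusion rate for most viewers, without requiring more film.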

Video systems frequently use a more complex approach referred to as interlaced video. Broadcast television systems such as NTSC, PAL, and SECAM produce an image using two passes called fields. Each field contains half of the lines in a complete frame (the odd-numbered lines or the even-numbered lines). Thus, while only using the bandwidth of 25 or 30 complete frames per second, they achieve a flicker fusion frequency of 50 or 60 Hz, at the expense of some vertical judder and additional system complexity. The "frame rate" of interlaced systems is usually defined as the number of complete frames (pairs of fields) transmitted each second (25 or 30 in most broadcast systems). However, since a conventional television camera will scan the scene again for each field, in many circumstances it may be useful to think of the frame rate as being equal to the field rate.
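The field-versus-frame relationship can be stated as a one-line calculation (an illustrative sketch, not broadcast-standard code):

```python
# Interlaced video: each complete frame is split into two fields
# (odd-numbered lines, then even-numbered lines), so the flicker
# rate is twice the frame rate at no extra bandwidth cost.
def field_rate(frame_rate):
    """Fields per second for an interlaced system."""
    return frame_rate * 2

print(field_rate(25))  # 50 fields/s for 25-frame systems (PAL/SECAM)
print(field_rate(30))  # 60 fields/s for 30-frame systems (NTSC, nominally)
```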

In contrast to televisions, computer monitors generally use progressive scan, and therefore internet video formats generally do also. The "P" versions of HDTV (i.e., 720p or 1080p) also support progressive scan, as do modern DVD players.


Frame rates in film and television

There are three main frame rate standards in the TV and movie-making business.

  • 60i (50i in Europe and Australia) is the standard interlaced video field rate: 60 fields per second combine into 29.97 complete frames (for 50i, 50 fields combine into 25 frames). It has been used for television for decades, whether the source is a broadcast signal, a rented DVD, or a home camcorder.
  • 30p, or 30-frame progressive, is a noninterlaced format that produces video at 30 complete frames per second. Progressive (noninterlaced) scanning mimics a film camera's frame-by-frame image capture, giving clarity on high-speed subjects and a more cinematic appearance. Shooting in 30p mode yields video free of interlace artifacts. This frame rate originated in the 1980s in the music video industry. [citation needed]
  • The 24p frame rate is also a noninterlaced format, now widely adopted by those planning to transfer a video signal to film. Film- and video-makers also turn to 24p for its "cine" look even when their productions will never be transferred to film.

35 mm movie cameras use a standard exposure rate of 24 frames per second.

Computer science

Frame rate is also a term used in real-time computer systems. In a fashion somewhat analogous to the moving-picture definition presented above, a real-time frame is the time it takes to complete a full round of the system's processing tasks. If the frame rate of a real-time system is 60 hertz, the system reevaluates all necessary inputs and updates the necessary outputs 60 times per second under all circumstances.
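Such a fixed-rate loop can be sketched as follows (a hypothetical illustration; the function and variable names are invented for this example, not part of any real-time standard):

```python
import time

def run_fixed_rate(step, frame_rate_hz, num_frames):
    """Call step() once per frame, then sleep out the rest of the
    frame period so each round starts on a fixed schedule."""
    period = 1.0 / frame_rate_hz
    next_deadline = time.monotonic()
    for _ in range(num_frames):
        step()                      # one full round of processing tasks
        next_deadline += period
        sleep_for = next_deadline - time.monotonic()
        if sleep_for > 0:
            time.sleep(sleep_for)   # idle until the next frame boundary

# Example: a 60 Hz loop run for 6 frames (about 0.1 s of wall time)
ticks = []
run_fixed_rate(lambda: ticks.append(time.monotonic()), 60, 6)
print(len(ticks))  # 6
```

A real hard real-time system would also have to detect and handle frames that overrun their deadline, which this sketch omits.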

The designed frame rates of real-time systems vary depending on the equipment. For a system that is steering an oil tanker, a frame rate of 1 Hz may be sufficient. For a real-time system steering a guided missile, a frame rate of 100 Hz may not be adequate. The designer must choose a frame rate appropriate to the application's requirements.

Frame rates in video games

Frame rates are considered important in video games. The frame rate can make the difference between a game that is playable and one that is not. The first 3D first-person adventure game for a personal computer, 3D Monster Maze, had a frame rate of approximately 6 fps, and was still a success, being playable and addictive. In modern action-oriented games where players must visually track animated objects and react quickly, frame rates of approximately 25 to 30 fps are considered minimally acceptable.

A culture of competition has arisen among game enthusiasts with regard to frame rates, with players striving to obtain the highest fps count possible. Indeed, many benchmarks released by the marketing departments of hardware manufacturers and published in hardware reviews focus on the fps measurement. Modern video cards, often featuring NVIDIA or ATI chipsets, can exceed 160 fps in intensive games such as F.E.A.R. Not all games behave this way: some impose a limit on the frame rate. In the Grand Theft Auto series, for example, Grand Theft Auto III and Grand Theft Auto: Vice City run at a fixed 30 fps (Grand Theft Auto: San Andreas runs at 25 fps), and this limit can only be removed at the cost of graphical and gameplay stability. It is also doubtful whether striving for such high frame rates is worthwhile. A typical 17" monitor can reach 85 Hz, so any frames the game produces beyond 85 fps are discarded. For that reason it is not uncommon to limit the frame rate to the refresh rate of the monitor, a process called vertical synchronization. However, many players feel that not synchronizing every frame produces better in-game performance, at the cost of some "tearing" of the image.
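The effect of the monitor's refresh rate as a ceiling can be expressed as a simple minimum (an illustrative sketch; the function name is invented here):

```python
# Frames rendered above the monitor refresh rate are discarded, so the
# rate the player actually sees is the lesser of the two. Vertical
# synchronization caps rendering at the refresh rate to avoid the waste.
def displayed_rate(render_fps, refresh_hz):
    """Frames per second that actually reach the screen."""
    return min(render_fps, refresh_hz)

print(displayed_rate(160, 85))  # 85: frames beyond the refresh rate are dropped
print(displayed_rate(30, 85))   # 30: here rendering, not the monitor, is the bottleneck
```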

There is also considerable debate over the "feel" of a game's frame rate. It is argued that games with extremely high frame rates "feel" smoother than those that are merely getting by, especially in genres such as first-person shooters. A noticeable choppiness is often perceived in computer-rendered video even when its frame rate is above the flicker fusion frequency.

This choppiness is not perceived flicker, but a perceived gap between a moving object and the afterimage left in the eye by the previous frame. A computer samples a single instant in time, and nothing is sampled until the next frame is rendered, so a visible gap appears between the moving object and its afterimage. Many driving games exhibit this problem, such as NASCAR 2005: Chase for the Cup for Xbox and Gran Turismo 4: the polygon count per frame may be too high to sustain a smooth frame rate, since the processing power spent on polygons is taken away from the frame rate.

The reason computer-rendered video has a noticeable afterimage-separation problem while camera-captured video does not is that a camera shutter interrupts the light two or three times per film frame, exposing the film to two or three samples at different points in time; alternatively, light can enter for the entire time the shutter is open, exposing the film to a continuous sample over that interval. These multiple samples are naturally blended together on the same frame. This produces a small amount of motion blur between one frame and the next, which allows them to transition smoothly.

An example of afterimage separation can be seen when taking a quick 180 degree turn in a game in only 1 second. A still object in the game would render 60 times evenly on that 180 degree arc (at 60 Hz frame rate), and visibly this would separate the object and its afterimage by 3 degrees. A small object and its afterimage 3 degrees apart are quite noticeably separated on screen.
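The geometry of that example reduces to dividing the sweep by the number of frames rendered during it (a worked check of the figures in the text; the function name is invented):

```python
# Angular gap between a moving object and its previous-frame afterimage:
# the total angle swept, divided by the number of frames rendered in
# the time the sweep takes.
def separation_deg(sweep_deg, duration_s, frame_rate_hz):
    """Degrees of separation between consecutive rendered positions."""
    frames = frame_rate_hz * duration_s
    return sweep_deg / frames

print(separation_deg(180, 1, 60))  # 3.0 degrees, as in the example above
```

Doubling the frame rate halves the gap, which is one reason very high frame rates can still "feel" smoother.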

The solution to this problem would be to interpolate the extra frames together in the back-buffer (field multisampling), or simulate the motion blur seen by the human eye in the rendering engine. Currently most video cards can only output a maximum frame rate equal to the refresh rate of the monitor. All extra frames are dropped.

High frame rates also provide a performance "reserve", since certain elements of a game may be more GPU-intensive than others. A game may achieve a fairly consistent 60 fps yet drop below that during intensive scenes; a higher rendering frame rate may prevent the displayed frame rate from dropping in those moments.

In some cases, such as the fast, artificial motion of video games, viewers can nevertheless perceive differences in smoothness above 60 fps.
