Frame rate

Frame rate, or frame frequency, is the frequency (rate) at which an imaging device produces unique consecutive images called frames. The term applies equally well to computer graphics, video cameras, film cameras, and motion capture systems. Frame rate is most often expressed in frames per second (FPS) and, for progressive-scan monitors, in hertz (Hz).

Frame rates in film and television

There are several frame rate standards in the TV and movie-making business.

  • 60i (actually 59.94, or 60 × 1000/1001 to be more precise; 60 interlaced fields = 29.97 frames) is the standard video field rate per second, whether from a broadcast signal, DVD, or home camcorder. This interlaced field rate was developed separately by Farnsworth and Zworykin in 1934,[1] and was part of the NTSC television standards effective in 1941. When NTSC color was introduced in 1953, the older rate of 60 fields per second was reduced by a factor of 1000/1001 to avoid interference between the chroma subcarrier and the broadcast sound carrier.
  • 50i (50 interlaced fields = 25 frames) is the standard video field rate per second for PAL and SECAM television.
  • 30p, or 30-frame progressive, is a noninterlaced format and produces video at 30 frames per second. Progressive (noninterlaced) scanning mimics a film camera's frame-by-frame image capture, gives clarity on high-speed subjects, and lends a cinematic appearance. Shooting in 30p mode offers video with no interlace artifacts. The widescreen film process Todd-AO used this frame rate in 1954–1956.[2]
  • The 24p frame rate is also a noninterlaced format, and is now widely adopted by those planning to transfer a video signal to film. Film- and video-makers also turn to 24p for its "cine" look even when their productions will not be transferred to film. When transferred to NTSC television, the rate is effectively slowed to 23.976 frame/s, and when transferred to PAL or SECAM it is sped up to 25 frame/s. 35 mm movie cameras use a standard exposure rate of 24 frames per second, though many cameras offer rates of 23.976 frame/s for NTSC television and 25 frame/s for PAL/SECAM. The 24 frame/s rate became the de facto standard for sound motion pictures in the mid-1920s.[3]
  • 25p is a video format which runs twenty-five progressive frames per second. This frame rate is derived from the PAL television standard of 50i (or 50 interlaced fields per second). While 25p captures only half the motion that normal 50i PAL registers, it yields a higher vertical resolution on moving subjects. It is also better suited to progressive-scan output (e.g., on LCD displays, computer monitors and projectors) because the interlacing is absent. Like 24p, 25p is often used to achieve a "cine" look.
  • 50p and 60p are progressive formats used in high-end HDTV systems. While they are not technically part of the ATSC or DVB broadcast standards, they are rapidly gaining ground in the areas of set-top boxes and video recordings.[citation needed]
  • 72p is currently an experimental progressive scan format. Major institutions such as Snell & Wilcox have demonstrated 720p72 pictures as a result of earlier analogue experiments, in which 768-line television at 75 Hz looked subjectively better than 1150-line 50 Hz progressive pictures with higher shutter speeds available (and a correspondingly lower data rate).[4] Modern TV cameras such as the Red can use this frame rate for creative effects such as slow motion (replaying at 24 fps). 72 fps was also the frame rate at which emotional impact on the viewer peaked, as measured by Douglas Trumbull,[5] which led to the Showscan film format.

Even higher frame rates (around 300 Hz) have been tested by BBC R&D, prompted by concerns that fast motion on large HD displays in sports and other broadcasts could affect viewers.[6] 300 fps can be converted to both 50 and 60 Hz transmission formats without major issues.
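
The rates above are exact ratios, so they can be checked with simple arithmetic. The following Python sketch (purely illustrative; the variable names are chosen here, not part of any standard) prints the NTSC-related fractional rates and shows why a 300 fps master divides cleanly into both 50 Hz and 60 Hz transmission formats.

```python
from fractions import Fraction

# NTSC-related rates are the nominal rate scaled by exactly 1000/1001.
ntsc = Fraction(1000, 1001)
print("60i field rate :", float(60 * ntsc))   # 59.9400599...
print("NTSC frame rate:", float(30 * ntsc))   # 29.9700299...
print("24p for NTSC   :", float(24 * ntsc))   # 23.9760239...

# A 300 fps master divides evenly into both common transmission rates,
# so conversion can simply keep every Nth frame (or average groups of N frames).
for target in (50, 60):
    print(f"300 fps -> {target} Hz: keep 1 frame in {300 // target}")
```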

Owing to their flexibility, software-based video formats can specify arbitrarily high frame rates, and some consumer PC monitors can refresh at rates well above 100 Hz, depending on the selected video mode.

Computing

Frame rate is also a term used in real-time computing. In a fashion somewhat comparable to the moving-picture definition presented above, a real-time frame is the time it takes to complete a full round of the system's processing tasks. If the frame rate of a real-time system is 60 hertz, the system reevaluates all necessary inputs and updates the necessary outputs 60 times per second under all circumstances.

The designed frame rates of real-time systems vary depending on the equipment. For a real-time system that is steering an oil tanker, a frame rate of 1 Hz may be sufficient, while a rate of even 100 Hz may not be adequate for steering a guided missile. The designer must choose a frame rate appropriate to the application's requirements.
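
As an illustration of the idea rather than any particular system's implementation, the Python sketch below runs one "frame" of input-process-output work at a fixed, designer-chosen rate; the placeholder functions and constant names are hypothetical.

```python
import time

FRAME_RATE_HZ = 60                 # designer-chosen rate for the application
FRAME_PERIOD = 1.0 / FRAME_RATE_HZ

def read_inputs():
    """Placeholder: sample sensors, read operator commands, etc."""
    return {}

def update_outputs(inputs):
    """Placeholder: compute and write the control outputs for this frame."""
    pass

def run(frames=None):
    next_deadline = time.monotonic()
    done = 0
    while frames is None or done < frames:
        update_outputs(read_inputs())          # one full round of processing
        next_deadline += FRAME_PERIOD          # schedule the next frame start
        time.sleep(max(0.0, next_deadline - time.monotonic()))
        done += 1

run(frames=3)   # run three frames of the loop as a demonstration
```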

Frame rates in video games

Frame rate in video games refers to the speed at which the image is refreshed (typically measured in frames per second, or FPS). Many underlying processes, such as collision detection and network processing, run at different or inconsistent frequencies, or in different physical components of a computer. FPS affects the experience in two ways: a low FPS cannot give the illusion of motion effectively and impairs the user's capacity to interact with the game, while an FPS that varies substantially from one second to the next, depending on the computational load of the simulation, produces uneven, "choppy" animation. Many games lock their frame rate at a lower but more sustainable level to give consistently smooth motion.

The first 3D first-person game for a personal computer, 3D Monster Maze, had a frame rate of approximately 6 FPS and was still a success. In modern action-oriented games where players must visually track animated objects and react quickly, frame rates between 30 and 100+ FPS are considered acceptable by most, though this can vary significantly from game to game. Some modern action games, including popular console shooters such as Halo 3, are locked at a maximum of 30 FPS, while others, such as Unreal Tournament 3, can run well in excess of 100 FPS on sufficient hardware. The frame rate within a game varies considerably depending on what is happening at a given moment and on the hardware configuration (especially in PC games). When the computation of a frame consumes more time than is allotted between frames, the frame rate decreases.
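
The relationship between per-frame work and frame rate can be made concrete with a small sketch (the numbers are hypothetical): each frame has a time budget of 1/target FPS, and once the work exceeds that budget the achieved rate falls below the target.

```python
def achieved_fps(target_fps: float, frame_work_ms: float) -> float:
    """Achieved frame rate when each frame takes frame_work_ms to compute,
    assuming the game never runs faster than target_fps (a frame cap)."""
    budget_ms = 1000.0 / target_fps
    frame_time_ms = max(budget_ms, frame_work_ms)  # a frame cannot finish before the cap allows
    return 1000.0 / frame_time_ms

print(round(achieved_fps(60, 10.0), 2))  # work fits in the 16.7 ms budget -> stays at 60 fps
print(round(achieved_fps(60, 25.0), 2))  # work overruns the budget        -> drops to 40 fps
```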

A culture of competition has arisen among game enthusiasts with regard to frame rates, with players striving to obtain the highest FPS possible, due to its utility in demonstrating a system's power and efficiency. Indeed, many benchmarks (such as 3DMark) released by the marketing departments of hardware manufacturers and published in hardware reviews focus on the FPS measurement. Even though typical LCD monitors are locked at 60 Hz, making extremely high frame rates impossible to see in real time, playthroughs of game "timedemos" at hundreds or thousands of FPS for benchmarking purposes remain common.

Beyond measurement and bragging rights, such exercises do have practical bearing in some cases. A certain amount of discarded "headroom" frames is beneficial for the elimination of uneven ("choppy" or "jumpy") output, and prevents FPS from plummeting during the intense sequences when players need smooth feedback most.

Aside from frame rate, a separate but related factor unique to interactive applications such as gaming is latency. Excessive preprocessing can result in a noticeable delay between player commands and computer feedback, even when the full frame rate is maintained; this delay is often referred to as input lag.
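
A rough, back-of-the-envelope way to see the relationship (with assumed figures, not taken from any specific game): if frames are queued ahead of display, the extra delay is the number of buffered frames divided by the frame rate.

```python
def added_input_lag_ms(buffered_frames: int, fps: float) -> float:
    """Extra latency contributed by frames queued ahead of display."""
    return buffered_frames / fps * 1000.0

print(added_input_lag_ms(3, 60))  # 3 pre-rendered frames at 60 fps -> 50 ms of added lag
print(added_input_lag_ms(3, 30))  # the same queue at 30 fps        -> 100 ms of added lag
```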

Without realistic motion blurring, motion in video games and computer animations does not look as fluid as on film, even at the same frame rate. When a fast-moving object appears on two consecutive frames, there is inevitably a gap between its image in the two frames, which can contribute to a noticeable separation of the object and the afterimage it leaves in the eye. Motion blurring mitigates this effect, since it reduces the gap when the two frames are strung together: the effect of motion blurring is essentially to superimpose multiple images of the fast-moving object on a single frame. The result is that the motion appears more fluid to the human eye, even though the image of the object becomes blurry in each individual frame.
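
The superimposition described above can be sketched in a few lines of Python (a toy one-dimensional example with made-up numbers): the object's position is sampled several times within one frame interval, and the samples are averaged into that frame, smearing the object across the gap it would otherwise jump.

```python
def blurred_frame_positions(speed_px_per_s, fps, subsamples, frames):
    """For each frame, return the sub-frame positions that would be
    superimposed (averaged) to simulate motion blur for a 1-D moving object."""
    frame_time = 1.0 / fps
    result = []
    for f in range(frames):
        t0 = f * frame_time
        samples = [speed_px_per_s * (t0 + i * frame_time / subsamples)
                   for i in range(subsamples)]
        result.append(samples)
    return result

# An object moving 600 px/s rendered at 30 fps: without blur it jumps 20 px
# per frame; with 4 sub-samples, each frame instead contains a 15 px smear.
for frame in blurred_frame_positions(600, 30, 4, 2):
    print([round(p, 1) for p in frame])
```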

A high frame rate still does not guarantee fluid movement, especially on hardware with more than one GPU; this effect is known as micro stuttering.

How many frames per second can the human eye see?

The human visual system does not see in terms of frames; it works with a continuous flow of light/information.[citation needed] A related question is, "How many frames per second are needed for an observer not to see artifacts?" This question, too, does not have a single straightforward answer. If an image switches between black and white on each frame, it can appear to flicker when shown at rates slower than about 100 frames per second; more typically, the flicker-fusion point, where the eyes see steady gray instead of flickering, is around 60 Hz. However, for fast-moving objects, frame rates may need to be even higher to avoid judder (non-smooth motion) artifacts, and the retinal fusion point varies between people and with lighting conditions.

Although human vision has no "frame rate", it may be possible to investigate the consequences of changes in frame rate for human observers. The most famous example may be the wagon-wheel effect, a form of aliasing in time, where a spinning wheel suddenly appears to change direction when its speed approaches the frame rate of the image capture/reproduction system.
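
Temporal aliasing of this kind is easy to reproduce numerically. The sketch below (illustrative values only; the spoke count and speeds are assumptions) computes the per-frame movement a camera would record for a spoked wheel and shows the apparent reversal once the spoke-passing rate gets close to the frame rate.

```python
def apparent_step_deg(rotations_per_s, fps, spokes=4):
    """Per-frame movement of a wheel with evenly spaced identical spokes,
    folded into the range (-period/2, +period/2]; a negative value means
    the wheel appears to spin backwards on screen."""
    spoke_period = 360.0 / spokes                  # identical spokes repeat every 90 degrees
    step = (rotations_per_s * 360.0 / fps) % spoke_period
    return step if step <= spoke_period / 2 else step - spoke_period

print(apparent_step_deg(1.0, 24))   # 15.0 degrees forward per frame: looks normal
print(apparent_step_deg(5.9, 24))   # -1.5 degrees: the wheel appears to creep backwards
```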

Different capture/playback systems may operate at the same frame rate, and still give a different level of "realism" or artifacts attributed to frame rate. One reason for this may be the temporal characteristics of the camera and display device.
