Optical flow

From Wikipedia, the free encyclopedia

Figure 1: The optical flow vector of a moving object in a video sequence.

Optical flow or optic flow is the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer (an eye or a camera) and the scene.[1][2] Optical flow techniques such as motion detection, object segmentation, time-to-collision and focus-of-expansion calculations, motion-compensated encoding, and stereo disparity measurement exploit this motion of the objects' surfaces and edges.[3][4]

Applications of optical flow include inferring not only the motion of the observer and of objects in the scene, but also the structure of those objects and of the environment. Since awareness of motion and the generation of mental maps of the structure of our environment are critical components of animal (and human) vision, the conversion of this innate ability to a computer capability is similarly crucial in the field of machine vision.[5]


Estimation of the optical flow and its uses

Two-dimensional image motion is the projection of the three-dimensional motion of objects, relative to the observer, onto the observer's image plane. Sequences of ordered images allow the estimation of projected two-dimensional image motion as either instantaneous image velocities or discrete image displacements. The result is usually called the optical flow field or the image velocity field.[6]

Fleet and Weiss provide a tutorial introduction to gradient-based optical flow.[7] John L. Barron, David J. Fleet, and Steven Beauchemin provide a performance analysis of a number of optical flow techniques, emphasizing the accuracy and density of the measurements.[8]

Optical flow has been used by robotics researchers in many areas, such as object detection and tracking, image dominant plane extraction, movement detection, robot navigation, and visual odometry.[9]

Some methods for determining optical flow

  • Phase correlation (inverse of normalized cross-power spectrum)
  • Block-based methods (minimizing sum of squared differences or sum of absolute differences, or maximizing normalized cross-correlation)
  • Differential methods of estimating optical flow, based on partial derivatives of the image signal and/or the sought flow field and higher-order partial derivatives, such as:
    • Lucas–Kanade method (regarding image patches and an affine model for the flow field)
    • Horn–Schunck method (optimizing a functional based on residuals from the brightness constancy constraint, and a particular regularization term expressing the expected smoothness of the flow field)
    • Buxton–Buxton method (based on a model of the motion of edges in image sequences)
    • Black–Jepson method (determines coarse optical flow via a correlation method[10])
    • general variational methods (a range of modifications/extensions of Horn–Schunck, using other data terms and other smoothness terms)
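
The differential approach above can be illustrated with a minimal sketch of the Lucas–Kanade idea: assume all pixels in a patch share one velocity (vx, vy), and solve the brightness constancy constraint Ix·vx + Iy·vy + It = 0 in the least-squares sense from the image gradients. This sketch assumes a single patch, pure translation, and no coarse-to-fine pyramid, so it is far from a production implementation.

```python
import numpy as np

def lucas_kanade_patch(prev, curr):
    """Estimate one (vx, vy) flow vector for a whole patch by
    least-squares on the brightness constancy constraint
    Ix*vx + Iy*vy + It = 0 (a minimal sketch: single patch,
    small displacement, no pyramid)."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    # Spatial gradients (rows = y, columns = x) and temporal gradient.
    Iy, Ix = np.gradient(prev)
    It = curr - prev
    # Stack one constraint equation per pixel: A @ [vx, vy] = -It.
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # (vx, vy)

# Toy example: a horizontal intensity ramp shifted one pixel right.
x = np.arange(16, dtype=float)
prev = np.tile(x, (16, 1))
curr = np.tile(x - 1.0, (16, 1))  # content moved +1 px along x
vx, vy = lucas_kanade_patch(prev, curr)
# vx is close to 1.0, vy is close to 0.0
```

Real implementations window the constraints per pixel neighborhood and use image pyramids to handle large displacements; the least-squares core, however, is the same.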

Optical flow and motion estimation

Motion estimation and video compression have developed as a major aspect of optical flow research. While the optical flow field is superficially similar to a dense motion field derived from the techniques of motion estimation, optical flow is the study of not only the determination of the optical flow field itself, but also of its use in estimating the three-dimensional nature and structure of the scene, as well as the 3D motion of objects and the observer relative to the scene.

Consider Figure 1 as an example. Motion estimation techniques can determine that, on a two-dimensional plane, the ball is moving up and to the right, and vectors describing this motion can be extracted from the sequence of frames. For the purposes of video compression (e.g., MPEG), the sequence is now described as well as it needs to be. However, in the field of machine vision, the question of whether the ball is moving to the right or the observer is moving to the left is unknowable from the motion vectors alone, yet it is critical information. If a static, patterned background were present in the five frames, we could confidently state that the sphere is moving to the right.
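
The motion-estimation step used by such codecs can be sketched as exhaustive block matching: for a block in the current frame, search a small window in the previous frame for the position that minimizes the sum of absolute differences (SAD). This is a simplified illustration; real encoders use fast search strategies and sub-pixel refinement rather than this brute-force scan.

```python
import numpy as np

def block_match(prev, curr, top, left, size=8, radius=4):
    """Find the displacement of one block between two frames by
    minimizing the sum of absolute differences (SAD) over a small
    search window (a minimal sketch of codec-style motion
    estimation, not a full encoder)."""
    block = curr[top:top + size, left:left + size].astype(float)
    best, best_sad = (0, 0), np.inf
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + size > prev.shape[0] or x + size > prev.shape[1]:
                continue  # candidate block falls outside the frame
            cand = prev[y:y + size, x:x + size].astype(float)
            sad = np.abs(block - cand).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    # The block came FROM (top+dy, left+dx) in prev, so it moved by (-dy, -dx).
    dy, dx = best
    return -dy, -dx

# Toy example: the whole scene translates two pixels to the right.
rng = np.random.default_rng(0)
prev = rng.random((32, 32))
curr = np.roll(prev, shift=(0, 2), axis=(0, 1))
my, mx = block_match(prev, curr, top=12, left=12)
# (my, mx) is (0, 2): two pixels of rightward motion
```

As the surrounding text notes, these vectors describe 2D image motion only; deciding whether the scene or the camera moved requires the further interpretation that optical flow research addresses.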

References

  1. ^ Andrew Burton and John Radford (1978). Thinking in Perspective: Critical Essays in the Study of Thought Processes. Routledge.
  2. ^ David H. Warren and Edward R. Strelow (1985). Electronic Spatial Sensing for the Blind: Contributions from Perception. Springer. ISBN 9024726891.
  3. ^ Kelson R. T. Aires, Andre M. Santana, and Adelardo A. D. Medeiros (2008). Optical Flow Using Color Information. ACM New York, NY, USA. ISBN 978-1-59593-753-7.
  4. ^ S. S. Beauchemin and J. L. Barron (1995). The computation of optical flow. ACM New York, NY, USA.
  5. ^ Christopher M. Brown (1987). Advances in Computer Vision. Lawrence Erlbaum Associates.
  6. ^ S. S. Beauchemin and J. L. Barron (1995). The computation of optical flow. ACM New York, NY, USA.
  7. ^ David J. Fleet and Yair Weiss (2006). "Optical Flow Estimation", in Paragios et al.: Handbook of Mathematical Models in Computer Vision. Springer.
  8. ^ John L. Barron, David J. Fleet, and Steven Beauchemin (1994). "Performance of optical flow techniques". International Journal of Computer Vision. Springer.
  9. ^ Kelson R. T. Aires, Andre M. Santana, and Adelardo A. D. Medeiros (2008). Optical Flow Using Color Information. ACM New York, NY, USA. ISBN 978-1-59593-753-7.
  10. ^ S. S. Beauchemin and J. L. Barron (1995). The computation of optical flow. ACM New York, NY, USA.
