Color image pipeline
An image pipeline or video pipeline is the set of components commonly used between an image source (such as a camera, a scanner, or the rendering engine in a computer game) and an image renderer (such as a television set, a computer screen, a computer printer, or a cinema screen), or used to perform any intermediate digital image processing consisting of two or more separate processing blocks. An image/video pipeline may be implemented as computer software, in a digital signal processor, on an FPGA, or as a fixed-function ASIC. In addition, analog circuits can perform many of the same functions.
Typical components include image sensor corrections (including "debayering", i.e. demosaicing the raw data sampled through the sensor's Bayer filter), noise reduction, image scaling, gamma correction, image enhancement, colorspace conversion (between formats such as RGB, YUV or YCbCr), chroma subsampling, framerate conversion, image compression/video compression (such as JPEG), and computer data storage/data transmission.
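As a rough illustration of how such stages compose, the following Python sketch chains white balance, gamma correction, and an R'G'B'-to-Y'CbCr conversion (using NumPy; the stage implementations and parameter values are simplified assumptions, not a production design):

```python
import numpy as np

# Toy pipeline sketch: each stage maps a float image in [0, 1]
# to a float image in [0, 1]. Real pipelines add many more stages.

def white_balance(rgb, gains=(2.0, 1.0, 1.5)):
    """Per-channel gains; hypothetical values, normally estimated from the scene."""
    return np.clip(rgb * np.asarray(gains), 0.0, 1.0)

def gamma_encode(rgb, gamma=2.2):
    """Simple power-law encoding (sRGB actually uses a piecewise curve)."""
    return rgb ** (1.0 / gamma)

def rgb_to_ycbcr(rgb):
    """Full-range BT.601 R'G'B' -> Y'CbCr matrix, as used by JPEG/JFIF."""
    m = np.array([[ 0.299,     0.587,     0.114   ],
                  [-0.168736, -0.331264,  0.5     ],
                  [ 0.5,      -0.418688, -0.081312]])
    ycc = rgb @ m.T
    ycc[..., 1:] += 0.5   # centre the chroma channels around 0.5
    return ycc

def pipeline(linear_rgb):
    """Balance in the linear domain, then encode, then convert."""
    return rgb_to_ycbcr(gamma_encode(white_balance(linear_rgb)))

img = np.random.rand(4, 4, 3)   # stand-in for demosaiced sensor data
ycbcr = pipeline(img)
```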
Typical goals of an imaging pipeline include perceptually pleasing end results, colorimetric precision, a high degree of flexibility, low cost (low CPU utilization and long battery life), and reduction in bandwidth/file size.
Some functions may be algorithmically linear. Mathematically, such elements can in principle be connected in any order without changing the end result. Because digital computers use finite-precision approximations in numerical computing, this is not true in practice. Other elements may be non-linear or time-variant. In both cases, there is often one sequence of components (or a few) that makes sense for optimum precision as well as minimum hardware cost/CPU load.[1]
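This effect can be seen with two per-pixel gains, which commute in exact arithmetic but not once each stage rounds its output to 8 bits. The following Python sketch (the gain values are hypothetical) counts the output codes that end up different:

```python
import numpy as np

def quantize8(x):
    """Round to the nearest 8-bit code, as a fixed-point hardware stage would."""
    return np.round(np.clip(x, 0.0, 1.0) * 255.0) / 255.0

x = np.linspace(0.0, 1.0, 1001)   # a ramp of input levels
a, b = 0.7, 0.9                   # two linear gains (chosen so nothing clips)

# In exact arithmetic (x * a) * b == (x * b) * a, but with 8-bit
# rounding between stages the two orderings can land on different codes.
order_ab = quantize8(quantize8(x * a) * b)
order_ba = quantize8(quantize8(x * b) * a)

print("codes that differ:", int(np.count_nonzero(order_ab != order_ba)))
```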
The figure shows a simplified, typical use of two imaging pipelines. The upper half shows components that might be found in a digital camera. The lower half shows components that might be used in an image viewing application on a computer for displaying the images produced by the camera.
Note that operations mimicking physical, linear behaviour, such as image scaling, are ideally carried out on the left-hand side, working on linear RGB signals. Operations that should appear "perceptually uniform", such as lossy image compression, should on the other hand be carried out on the right-hand side, working on "gamma-corrected" R'G'B' or Y'CbCr signals.
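A minimal Python sketch of why this ordering matters, assuming the standard sRGB transfer function: averaging a black and a white pixel, as a downscaler effectively does, gives a physically correct light level only when done in the linear domain; averaging the gamma-encoded values directly produces a visibly darker result.

```python
import numpy as np

def srgb_to_linear(v):
    """Inverse sRGB transfer function (IEC 61966-2-1)."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    """Forward sRGB transfer function."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

black, white = 0.0, 1.0   # two neighbouring gamma-encoded pixel values

# Correct: average the light intensities in the linear domain, then re-encode.
linear_avg = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

# Common shortcut: average the gamma-encoded values directly.
gamma_avg = (black + white) / 2

print(round(float(linear_avg), 3))   # ~0.735
print(gamma_avg)                     # 0.5, a visibly darker grey
```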
See also
- Image processing engine