Reyes rendering

From Wikipedia, the free encyclopedia

[Figure: Reyes rendering pipeline]

Reyes rendering is a computer software architecture used in 3D computer graphics to render photo-realistic images. It was developed in the mid-1980s by Lucasfilm's Computer Graphics Research Group, which is now Pixar. It was first used in 1982 to render images for the Genesis effect sequence in the movie Star Trek II: The Wrath of Khan. Pixar's PhotoRealistic RenderMan is one implementation of the Reyes algorithm. According to the original paper describing the algorithm, the Reyes image rendering system is "an architecture ... for fast high-quality rendering of complex images." Reyes was proposed as a collection of algorithms and data processing systems; however, the terms "algorithm" and "architecture" have come to be used synonymously and are used interchangeably in this article.

Reyes is an acronym for Renders Everything You Ever Saw (the name is also a pun on Point Reyes, California, near where Lucasfilm was located) and is suggestive of processes connected with optical imaging systems.

The architecture was designed with a number of goals in mind:

Model Complexity/Diversity: In order to generate visually complex and rich images, users of a rendering system must be free to model large numbers (hundreds of thousands) of complex geometric structures, possibly generated using procedural models such as fractals and particle systems.

Shading Complexity: Much of the visual complexity in a scene is generated by the way in which light rays interact with solid object surfaces. Generally in computer graphics this is modelled using textures: colored arrays of pixels that describe surface color, displacement, transparency or reflectivity. Reyes allows users to incorporate procedural shaders, whereby surface structure and optical interaction are achieved using computer programs implementing procedural algorithms rather than simple look-up tables. A good portion of the algorithm is aimed at minimising the time spent by processors fetching textures from data stores.
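As an illustration of the idea (not the RenderMan Shading Language itself), a procedural shader computes surface color from parametric coordinates algorithmically rather than reading a stored texture; all names here are illustrative:

```python
def checker_shader(u, v, scale=8.0):
    """Procedural surface shader: computes a color from (u, v) surface
    coordinates algorithmically instead of fetching it from a texture map."""
    # Alternate two colors in a checkerboard pattern across the surface.
    if (int(u * scale) + int(v * scale)) % 2 == 0:
        return (0.9, 0.9, 0.9)   # light square
    return (0.1, 0.1, 0.1)       # dark square

# Evaluate at a few surface points; no look-up table is involved.
print(checker_shader(0.05, 0.05))
print(checker_shader(0.20, 0.05))
```

Because the pattern is computed on demand, its resolution is unlimited and no texture data needs to be fetched from storage.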

Minimal Ray Tracing: At the time that Reyes was proposed, computer systems were not capable of ray tracing scenes of photo-realistic complexity. The time required to ray trace a scene grows rapidly with scene complexity, so techniques and algorithms such as Reyes, which manipulate geometric primitives one at a time rather than en masse, run faster on realistic scene descriptions.

Speed: Rendering a 2 hour movie at 24 frames per second in one year allows 3 minutes rendering time per frame, on average.
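The arithmetic behind that budget can be checked directly:

```python
# Rendering-time budget implied by the speed goal: a 2-hour film at
# 24 frames per second, rendered in one year.
frames = 2 * 60 * 60 * 24           # 172,800 frames in the film
seconds_per_year = 365 * 24 * 3600  # 31,536,000 seconds
budget = seconds_per_year / frames  # seconds available per frame
print(round(budget / 60, 1))        # ~3 minutes per frame
```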

Image Quality: Any artefacts introduced by the rendering algorithm are considered unacceptable.

Flexibility: The architecture should be flexible enough to incorporate new techniques as they become available, without the need for a complete reimplementation of the algorithm.

Reyes efficiently achieves several effects that were deemed necessary for film-quality rendering: smooth curved surfaces, surface texturing, motion blur, and depth of field.

Reyes renders curved surfaces, such as those represented by parametric patches, by dividing them into micropolygons, small quadrilaterals each less than one pixel in size. Although many micropolygons are necessary to approximate curved surfaces accurately, they can be processed with simple, parallelizable operations. A Reyes renderer tessellates high-level primitives into micropolygons on demand, dividing each primitive only as finely as necessary to appear smooth in the final image.
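The on-demand dicing idea can be sketched in Python: the dicing rate is chosen from the patch's projected size in pixels, so a patch is subdivided only as finely as its screen footprint requires. The surface evaluator here is a stand-in, not renderer code:

```python
import math

def dice(patch_screen_width, patch_screen_height, max_mp_size=1.0):
    """Choose a dicing rate so each micropolygon spans at most one pixel.
    The patch_screen_* arguments are the patch's projected size in pixels,
    assumed already computed by an earlier bounding step."""
    nu = max(1, math.ceil(patch_screen_width / max_mp_size))
    nv = max(1, math.ceil(patch_screen_height / max_mp_size))
    # Evaluate the parametric surface at (nu+1) x (nv+1) grid points; a
    # hypothetical surface evaluator is stood in by the (u, v) pair itself.
    return [[(u / nu, v / nv) for u in range(nu + 1)] for v in range(nv + 1)]

# A patch covering 40 x 20 pixels dices into a 40 x 20 micropolygon grid,
# i.e. 21 rows of 41 vertices.
g = dice(40, 20)
print(len(g), len(g[0]))
```

A distant patch with a small screen footprint yields far fewer micropolygons, which is what keeps the approach efficient.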

Next, a shader system assigns a color and opacity to each micropolygon. Most Reyes renderers allow users to supply arbitrary lighting and texturing functions written in a shading language. Micropolygons are processed in large grids which allow computations to be vectorized.
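As a sketch of why grids help, the following NumPy fragment shades every vertex of a grid in one vectorized operation; the Lambertian model and all names are illustrative, not part of any actual shading language:

```python
import numpy as np

def shade_grid(normals, light_dir, albedo=(0.8, 0.3, 0.2)):
    """Simple Lambertian shading applied to an (H, W, 3) array of unit
    normals in one vectorized operation, rather than one vertex at a time."""
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    # Dot product of every normal with the light direction, clamped at 0.
    ndotl = np.clip(normals @ l, 0.0, None)
    return ndotl[..., None] * np.asarray(albedo)  # (H, W, 3) colors

normals = np.zeros((16, 16, 3))
normals[..., 2] = 1.0  # a grid of normals, all facing +z
colors = shade_grid(normals, light_dir=(0.0, 0.0, 1.0))
print(colors.shape)
```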

Shaded micropolygons are sampled in screen space to produce the output image. Reyes employs an innovative hidden-surface algorithm or hider which performs the necessary integrations for motion blur and depth of field without requiring more geometry or shading samples than an unblurred render would need. The hider accumulates micropolygon colors at each pixel across time and lens position using a Monte Carlo method called stochastic sampling.
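A minimal sketch of stochastic sampling, assuming stratified jitter in the sub-pixel, time, and lens dimensions (the dimensions the hider integrates over); the sample layout is illustrative, not PRMan's:

```python
import random

def pixel_samples(n, shutter=(0.0, 1.0), seed=0):
    """Stochastic (jittered) sampling for one pixel: each subsample gets a
    jittered sub-pixel position, a random time within the shutter interval,
    and a random point on the lens. Averaging the micropolygon colors seen
    by these samples yields motion blur and depth of field without extra
    geometry or shading samples."""
    rng = random.Random(seed)
    samples = []
    for i in range(n):
        sx = (i + rng.random()) / n          # stratified sub-pixel x
        sy = rng.random()                    # sub-pixel y
        t = shutter[0] + rng.random() * (shutter[1] - shutter[0])
        lens = (rng.random() - 0.5, rng.random() - 0.5)  # point on lens
        samples.append((sx, sy, t, lens))
    return samples

for s in pixel_samples(4):
    print(s)
```

Because each sample carries its own time and lens position, a moving micropolygon is simply tested against each sample at that sample's time, and the blur falls out of the averaging.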

The basic Reyes pipeline has the following steps:

  1. Bound. Calculate the bounding volume of each geometric primitive.
  2. Split. Split large primitives into smaller, diceable primitives.
  3. Dice. Convert the primitive into a grid of micropolygons, each approximately the size of a pixel.
  4. Shade. Calculate lighting and shading at each vertex of the micropolygon grid.
  5. Bust. Break the grid into individual micropolygons, each of which is bounded and checked for visibility.
  6. Hide. Sample the micropolygons, producing the final 2D image.
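The steps above can be condensed into a runnable miniature, with axis-aligned screen-space rectangles standing in for real parametric primitives; all names and the rectangle format are illustrative assumptions, not renderer code:

```python
MAX_DICE = 16  # dice only primitives at most this many pixels across

def render(primitives, width, height):
    framebuffer = [[0.0] * width for _ in range(height)]
    work = list(primitives)
    while work:
        x0, y0, x1, y1, color = work.pop()
        # 1. Bound: cull primitives wholly off-screen.
        if x1 <= 0 or y1 <= 0 or x0 >= width or y0 >= height:
            continue
        # 2. Split: halve primitives too large to dice, and requeue.
        if x1 - x0 > MAX_DICE or y1 - y0 > MAX_DICE:
            if x1 - x0 >= y1 - y0:
                xm = (x0 + x1) // 2
                work += [(x0, y0, xm, y1, color), (xm, y0, x1, y1, color)]
            else:
                ym = (y0 + y1) // 2
                work += [(x0, y0, x1, ym, color), (x0, ym, x1, y1, color)]
            continue
        # 3.-4. Dice into pixel-sized micropolygons and "shade" them.
        grid = [(x, y, color) for y in range(y0, y1) for x in range(x0, x1)]
        # 5.-6. Bust and hide: write each visible micropolygon's sample.
        for x, y, c in grid:
            if 0 <= x < width and 0 <= y < height:
                framebuffer[y][x] = c
    return framebuffer

fb = render([(2, 2, 30, 20, 1.0)], width=32, height=24)
print(sum(v for row in fb for v in row))  # 28 x 18 pixels covered
```

A real implementation dices parametric patches rather than rectangles and resolves visibility with depth-sorted stochastic samples, but the bound/split/dice/shade/bust/hide control flow is the same.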

In this design, the renderer must store the entire frame buffer in memory since the final image cannot be output until all primitives have been processed. A common memory optimization introduces a step called bucketing prior to the dicing step. The output image is divided into a coarse grid of "buckets," each typically 16 by 16 pixels in size. The objects are then split roughly along the bucket boundaries and placed into buckets based on their location. Each bucket is diced and drawn individually, and the data from the previous bucket is discarded before the next bucket is processed. In this way only a frame buffer for the current bucket and the high-level descriptions of all geometric primitives must be maintained in memory. For typical scenes, this leads to a significant reduction in memory usage compared to the unmodified Reyes algorithm.
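A sketch of the bucketing loop under these assumptions (16-by-16 buckets, a caller-supplied draw routine, and a hypothetical primitive format):

```python
BUCKET = 16  # bucket edge length in pixels

def render_bucketed(primitives, width, height, draw):
    """Yield one finished 16 x 16 tile at a time; only the current
    bucket's framebuffer is ever live in memory."""
    nbx = (width + BUCKET - 1) // BUCKET
    nby = (height + BUCKET - 1) // BUCKET
    # Sort primitives into every bucket their screen bounds overlap.
    buckets = {}
    for prim in primitives:
        x0, y0, x1, y1 = prim["bounds"]
        for by in range(max(0, y0 // BUCKET), min(nby, (y1 - 1) // BUCKET + 1)):
            for bx in range(max(0, x0 // BUCKET), min(nbx, (x1 - 1) // BUCKET + 1)):
                buckets.setdefault((bx, by), []).append(prim)
    # Dice and draw one bucket at a time; each tile can be written out and
    # discarded before the next bucket starts.
    for by in range(nby):
        for bx in range(nbx):
            tile = [[0.0] * BUCKET for _ in range(BUCKET)]
            for prim in buckets.get((bx, by), []):
                draw(prim, tile, bx * BUCKET, by * BUCKET)
            yield (bx, by), tile

tiles = list(render_bucketed([{"bounds": (0, 0, 10, 10)}], 64, 48,
                             lambda prim, tile, ox, oy: None))
print(len(tiles))  # one tile per bucket of a 64 x 48 image
```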

Reyes renderers

The following renderers use the Reyes algorithm in one way or another, or at least allow users to select it, to produce their images:
