In a computer system, a blitter is a circuit, sometimes as a coprocessor or a logic block on a microprocessor, that is dedicated to the rapid movement and modification of data within that computer's memory. A blitter is capable of copying large quantities of data from one memory area to another relatively quickly, and in parallel with the CPU.
The name derives from the bit blit ("bit-block transfer") operation. A typical use for a blitter is the movement of a large bitmap in a 2D computer game or demo.
In early computers with raster-graphics output, the screen buffer was normally held in main memory and drawn under the control of the CPU. Many simple graphics routines, such as sprite support or flood-filling polygons, required large amounts of memory to be manipulated. A typical reason to move large data areas arose when drawing bitmaps, such as when drawing the next frame of a computer game or demo. An image that moves smoothly across the screen requires its bitmap to be redrawn at a new position every frame. Game designers had to design their game's graphics so that the total amount of bitmap data transferred per frame stayed within the capacity of the CPU.
As graphics hardware became more sophisticated, framebuffers grew larger, with higher resolutions and colour depths. Game designers wanted to use larger images to represent game elements while maintaining acceptable frame rates, so graphics operations demanded ever more data-transfer speed. This work tied up the CPU, preventing it from performing other tasks, or made it infeasible to transfer large images within the finite time available between frames.
Computer manufacturers introduced blitters to help lessen this graphics burden on the CPU, or to allow more complex graphics. Several home computers manufactured in the 1980s included a graphics coprocessor that contained a blitter. The CPU would send a description of the necessary bit blit operations to the blitter, which would then carry out the operation much faster than the CPU could, and in parallel.
The Commodore Amiga was the first personal computer to use a full-featured blitter, and the first US patent filing to use the term blitter was "Personal computer apparatus for block transfer of bit-mapped image data," assigned to Commodore-Amiga, Inc.[1] On top of the ability to copy and manipulate large areas of graphics, the hardware that contained the Amiga's blitter also included line drawing and area-filling hardware.
Later models of the Atari ST also included a blitter co-processor, named in all capitals as the BLITTER chip. One story states that manufacturing delays deferred its introduction until after the first STs had shipped. Another is that the Atari ST's main competitor, the Amiga, was famous for its blitter, and so Atari introduced one as well. Although Atari planned an upgrade to allow dealers to install the blitter chip, this plan was later dropped. Instead, the BLITTER was introduced on the Mega series and then supported on most later machines (except the Atari TT).
Graphics-oriented software (especially games) running on systems without a blitter needed other methods of transferring large bitmaps. Some games were written for multiple platforms, only some of which had a blitter. A typical example would be a game written for both the Atari ST and the Amiga. The two machines contained similar hardware, including the MC68000 processor, but the Amiga had a blitter and the early Ataris did not, so such a game could not rely on one being present. One approach was to load all available 68000 data registers with data from the bitmap in memory, and then push the data into the frame buffer in as few operations as possible.
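The register-loading trick can be approximated in portable C as a heavily unrolled copy loop. This is only an illustrative sketch: the function name and chunk size are invented here, and a real 68000 routine would use `movem.l` to move many registers' worth of data per instruction.

```c
#include <stddef.h>
#include <stdint.h>

/* Copy a span of bitmap data in large unrolled chunks rather than one
   word at a time, mimicking the 68000 technique of filling several data
   registers with movem.l and storing them all in one burst. */
void copy_line_unrolled(uint32_t *dst, const uint32_t *src, size_t words)
{
    /* Eight 32-bit words per iteration, analogous to eight data registers. */
    while (words >= 8) {
        dst[0] = src[0]; dst[1] = src[1];
        dst[2] = src[2]; dst[3] = src[3];
        dst[4] = src[4]; dst[5] = src[5];
        dst[6] = src[6]; dst[7] = src[7];
        dst += 8; src += 8; words -= 8;
    }
    while (words--)          /* remaining tail, one word at a time */
        *dst++ = *src++;
}
```

The point of the unrolling is to amortise loop overhead (the decrement and branch) over many transfers, which was the main cost of a naive per-word copy on the 68000.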
Blitting was not the only solution for high-performance graphics on performance-limited machines. A more common solution in early machines was the use of sprites: small bitmaps positioned on screen independently of the normal bitmap background, drawn by separate graphics pathways and combined in the video display circuitry into a single image. A sprite could be moved on-screen simply by adjusting the values of a few timers; the video circuitry started drawing it when the timer expired, so motion cost almost nothing and required no memory to be moved. The downside of this approach was that sprite systems generally had hard-coded limits on the number of sprites they could display, often between two (the Atari VCS) and eight (the Commodore 64). Blitters could handle any number of objects, limited only by the performance of the blitter and its memory bandwidth. As the performance of these circuits increased, the flexibility of the blitter overwhelmed any performance advantage of the sprite approach, and many sprite systems became blitters in disguise.
Typically, a computer program would put information into certain registers describing what memory transfer needed to be completed and the logical operations to perform on the data, then trigger the blitter to begin operating. The CPU is then free to begin some other operation while the blitter operates.
The destination for the transfer is usually the frame buffer. However, a blitter can also be used for non-graphics work. For example, an area of memory might be zeroed (filled with zeroes) using a blitter more quickly than can be accomplished with the CPU. Additionally, simple mathematical operations can be built from basic logical operations.
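The programming model described above can be sketched as a software model of the register interface. This is purely illustrative: the structure fields, names, and the two operation modes are invented for the example and do not correspond to any real chip's register layout (the Amiga's, for instance, is considerably more elaborate).

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical model of a blitter's register file. */
enum blit_op { BLIT_COPY, BLIT_ZERO };   /* logical operation to apply */

struct blitter_regs {
    const uint8_t *src;   /* source address register        */
    uint8_t       *dst;   /* destination address register   */
    size_t         len;   /* transfer length in bytes       */
    enum blit_op   op;    /* operation mode                 */
};

/* Stand-in for "trigger the blitter": on real hardware this write would
   start the transfer, which then runs in parallel with the CPU. */
void blitter_go(const struct blitter_regs *r)
{
    for (size_t i = 0; i < r->len; i++)
        r->dst[i] = (r->op == BLIT_ZERO) ? 0 : r->src[i];
}
```

The `BLIT_ZERO` mode illustrates the non-graphics use mentioned above: the same engine that copies bitmaps can clear a block of memory without CPU involvement.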
A blitter may use a 'mask' to decide which pixels to transfer and which to leave untouched. The mask operates like a stencil, marking which pixels of the source image will be written to destination memory. The logical operation would be Dest = ((Background) AND (Mask)) OR (Sprite).
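This masked transfer can be sketched one word at a time (the function name and word width are illustrative). The mask is 0 wherever the sprite is opaque, so the AND cuts a sprite-shaped hole in the background, and the OR fills that hole with the sprite's pixels; the sprite data is 0 in its transparent areas.

```c
#include <stddef.h>
#include <stdint.h>

/* Dest = (Background AND Mask) OR Sprite, applied word by word.
   Mask bits are 0 where the sprite is opaque, 1 where the background
   should show through; sprite bits are 0 in transparent areas. */
void masked_blit(uint16_t *dest, const uint16_t *background,
                 const uint16_t *mask, const uint16_t *sprite,
                 size_t words)
{
    for (size_t i = 0; i < words; i++)
        dest[i] = (background[i] & mask[i]) | sprite[i];
}
```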
Blitters have evolved into the modern graphics processing unit. The modern GPU is essentially a very advanced blitter and shares with earlier blitters the goal of rapidly copying, transforming, and writing bitmaps (more generally, textures) to a framebuffer. But while early blitters were limited to simple logical operations on the target data, modern GPUs can modify these bitmaps in mathematically advanced ways that produce more interesting effects, such as shading for illumination and blending for transparency. Additionally, modern GPUs can write bitmaps to destination memory so as to provide the illusion of depth. While earlier blitters were limited to a linear one-to-one mapping of source pixels to destination pixels, modern GPUs can map source pixels to destination pixels non-linearly to produce the appearance of perspective. Support for more advanced mathematical transformations also allows the modern GPU to transform blocks of coordinates in 3D space. This ability, combined with shading and non-linear mapping between source and destination pixels, helps create the fluid 3D experience found in many of today's games.