GeForce 3 Series
The GeForce3 (codenamed NV20) was NVIDIA's third-generation GeForce chip. The range included three related chips: the GeForce3, the GeForce3 Ti500, and the GeForce3 Ti200. The derived NV2A was used in the Xbox, and the professional version of the GeForce3 was sold as the Quadro DCC.
Programmable shaders
The first GeForce3 chips were released in March 2001, three months after NVIDIA bought out the near-defunct 3dfx. The chip differed from the previous-generation GeForce 256 and GeForce 2 in three main areas. The first was the addition of programmable vertex and pixel shaders: specialised units, required under the DirectX 8.0 specification, designed to execute small custom transform-and-lighting and per-pixel programs, greatly increasing the card's flexibility. (The GeForce 2 offered only fixed-function per-pixel operations, which were little used outside of tech demos.)
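The key shift was that per-vertex transform and lighting, previously a fixed-function stage, became a small program supplied by the developer. The sketch below is an illustration only: it is plain C with invented type and function names, not actual vs_1_1 shader assembly, but it shows roughly the per-vertex work such a program performs.

```c
#include <stdio.h>

/* Illustrative C sketch of the per-vertex work a DirectX 8 vertex
 * shader performs: a 4x4 matrix transform followed by a diffuse
 * lighting term. Types and names are invented for illustration;
 * real GeForce3 shaders were written in vs_1_1 assembly and ran
 * on the GPU. */

typedef struct { float x, y, z, w; } Vec4;
typedef struct { float m[4][4]; } Mat4;

/* Position transform: four dot products, one per output component,
 * mirroring the dp4 instructions of a vs_1_1 program. */
static Vec4 transform(const Mat4 *mvp, Vec4 v) {
    Vec4 r;
    r.x = mvp->m[0][0]*v.x + mvp->m[0][1]*v.y + mvp->m[0][2]*v.z + mvp->m[0][3]*v.w;
    r.y = mvp->m[1][0]*v.x + mvp->m[1][1]*v.y + mvp->m[1][2]*v.z + mvp->m[1][3]*v.w;
    r.z = mvp->m[2][0]*v.x + mvp->m[2][1]*v.y + mvp->m[2][2]*v.z + mvp->m[2][3]*v.w;
    r.w = mvp->m[3][0]*v.x + mvp->m[3][1]*v.y + mvp->m[3][2]*v.z + mvp->m[3][3]*v.w;
    return r;
}

/* Per-vertex diffuse lighting: clamp(N . L, 0, 1). */
static float diffuse(Vec4 n, Vec4 l) {
    float d = n.x*l.x + n.y*l.y + n.z*l.z;
    return d < 0.0f ? 0.0f : (d > 1.0f ? 1.0f : d);
}

int main(void) {
    Mat4 identity = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
    Vec4 pos = {1.0f, 2.0f, 3.0f, 1.0f};
    Vec4 n = {0.0f, 1.0f, 0.0f, 0.0f};
    Vec4 light = {0.0f, 1.0f, 0.0f, 0.0f};
    Vec4 out = transform(&identity, pos);
    printf("pos=(%g,%g,%g,%g) diffuse=%g\n",
           out.x, out.y, out.z, out.w, diffuse(n, light));
    return 0;
}
```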
The second was the Lightspeed Memory Architecture (LMA), which excluded overdrawn objects (those obscured from view) from processing, conserved memory bandwidth by compressing the Z-buffer (depth buffer), and managed the memory bus more effectively, whereas the earlier GeForce chips lacked hardware memory management. The third was a change in anti-aliasing from supersampling to multisampling, which was more efficient because each pixel is shaded only once, however many depth and coverage samples it carries.
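The saving from Z-occlusion culling is easiest to see in code. The following sketch, a deliberate simplification in plain C with invented names rather than a description of NVIDIA's actual hardware logic, shows the core idea: a fragment's depth is tested against the Z-buffer before any shading work is spent on it.

```c
#include <stdio.h>

/* Simplified illustration of Z-occlusion culling, one part of LMA.
 * Fragments are depth-tested *before* shading, so pixels hidden
 * behind already-drawn geometry never consume fill rate or texture
 * bandwidth. Invented names; not NVIDIA's actual hardware logic. */

#define W 4
#define H 4

static float zbuf[H][W];
static int shaded, rejected;

static void shade_and_write(int x, int y, float z) {
    zbuf[y][x] = z;       /* expensive texturing would happen here */
    shaded++;
}

static void submit_fragment(int x, int y, float z) {
    if (z >= zbuf[y][x]) { rejected++; return; }  /* occluded: skip shading */
    shade_and_write(x, y, z);
}

int main(void) {
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            zbuf[y][x] = 1.0f;               /* clear to far plane */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            submit_fragment(x, y, 0.5f);     /* near surface: drawn */
            submit_fragment(x, y, 0.8f);     /* behind it: rejected early */
        }
    printf("shaded=%d rejected=%d\n", shaded, rejected);  /* 16 and 16 */
    return 0;
}
```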
Performance
In terms of raw performance, the GeForce3 sometimes lost to the GeForce2 Ultra, which had a similar pixel pipeline configuration but a higher core clock speed and therefore a higher pixel fillrate. However, the GeForce3 took the lead when anti-aliasing was enabled, thanks to its new memory bandwidth and fillrate efficiency mechanisms, and its vertex and pixel shaders, important for upcoming games, enabled it to replace the GeForce2 Ultra as NVIDIA's top-of-the-line product. The GeForce3 easily outperformed the GeForce2 GTS, ATI Radeon DDR, and 3dfx Voodoo5 cards (even the 6000).
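The fillrate gap follows directly from the specifications: both chips have four pixel pipelines, each able to output one pixel per clock, so the theoretical rate is simply pipelines multiplied by core clock. A back-of-the-envelope calculation using the commonly published clock figures:

```c
#include <stdio.h>

/* Theoretical pixel fillrate = pipelines x core clock. Both chips
 * have four pixel pipelines; the clock speeds below are the
 * commonly published figures for each board. */
int main(void) {
    const int pipelines = 4;
    const int gf2_ultra_mhz = 250;   /* GeForce2 Ultra core clock */
    const int gf3_mhz = 200;         /* original GeForce3 core clock */

    printf("GeForce2 Ultra: %d Mpixels/s\n", pipelines * gf2_ultra_mhz); /* 1000 */
    printf("GeForce3:       %d Mpixels/s\n", pipelines * gf3_mhz);       /*  800 */
    return 0;
}
```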
The second-revision chips, the GeForce3 Ti500 and Ti200, were released in October 2001, at around the same time as ATI's top-of-the-line Radeon 8500 and mid-range Radeon 7500. The Ti500 had higher core and memory clocks (240 MHz core/500 MHz memory) and was designed to outperform the Radeon 8500; initially the Ti500 held the upper hand, but the 8500's driver improvements, combined with its higher fillrate, eventually enabled it to match and even surpass the Ti500 in some cases. In addition, the 8500 was significantly cheaper than the Ti500 and carried dual-monitor support, which the entire GeForce3 series lacked. As a result, the Ti500 was never widespread, and it was replaced in spring 2002 by the GeForce4 Ti 4200, which had slightly better performance, included dual-monitor support, and, most importantly, could be produced at lower cost.
The Ti200 was a cheaper chip clocked lower (175 MHz core/400 MHz memory) than the original GeForce3 (200 MHz core/460 MHz memory), and it easily surpassed the Radeon 7500 in speed and feature set apart from dual-monitor support. It was speculated, though never confirmed, that the Ti500's yields were unexpectedly poor and that the Ti200 was a way of recouping the cost of making the chips. At half the price of the Ti500, while delivering the same features and much of the performance, the Ti200 proved popular with enthusiasts, as it could be overclocked to run at GeForce3 speeds (and, in some cases, Ti500 speeds). ATI rolled out the Radeon 8500LE in early 2002 to compete in the niche occupied by the Ti200.
Product positioning
The GeForce3 remained a top performer throughout its lifetime. Unlike most NVIDIA graphics products, it was always aimed at the high-end and upper-midrange gaming market, and at no time was there a cheap, entry-level version of it; nor was there any need for one, as the numerous GeForce 2 variants were well placed to serve in the mass-market role. Even the Ti200, released relatively late in the chip's life, was priced squarely at the upper end of the mid-range, and had performance to match.
The GeForce 2 and GeForce3 lines were replaced in early 2002 by the GeForce4 MX and Ti lines, respectively. The GeForce4 Ti was very similar to its predecessor; the main differences were higher core and memory speeds, a revised memory controller, and improved vertex and pixel shaders, anti-aliasing, and DVD playback. Proper dual-monitor support was also brought over from the GeForce 2 MX. With the GeForce4 Ti 4600 as the new flagship product, this was the beginning of the end for the GeForce3 Ti500, which was already difficult to produce due to poor yields, and it was later completely replaced by the Ti 4200.
However, the GeForce3 Ti200 was kept in production for a short while, as it occupied a spot between the (delayed) Ti 4200 and the MX 460 in performance. Despite this positioning, which would have kept the chip going until the end of 2002, it was discontinued because of naming confusion with the GeForce4 MX and Ti lines. The discontinuation of the GeForce3 Ti200 and Radeon 8500LE disappointed many enthusiasts, because the performance-oriented Ti 4200 had not yet fallen to midrange prices, while the mass-market Radeon 9000 was not as fast as the Ti200 and 8500LE.
The original GeForce3 and the Ti500 derivative were only released in 64 MiB configurations throughout their lifetimes. This was mostly true of the Ti200 as well; a handful of third parties sold 128 MiB versions, without much success, since the GeForce3 gained little performance from the extra VRAM. The Radeon 8500 series, on the other hand, benefited significantly from 128 MiB, which was one reason why NVIDIA quickly replaced the GeForce3 with the GeForce4 Ti.
References
- "A Fallen Titan's Final Glory, Part II: The Voodoo5 6000 Reviewed". Sudhian Media.
External links
- NVIDIA: GeForce3 - The Infinite Effects GPU
- Anandtech: NVIDIA GeForce3
- Anandtech: NVIDIA's Fall Product Line: GeForce3 Titanium