GeForce 2 Series

The GeForce2 (codenamed NV15) was the second generation of NVIDIA's GeForce line of graphics processors and the successor to the GeForce 256.

GeForce2 GTS

The first model, the GeForce2 GTS (also known as NV15), took its name, "GigaTexel Shader", from its texel fillrate of 1.6 billion texels per second. Thanks to the addition of a second TMU (texture map unit) to each of its four pixel pipelines, and a higher core clock (200 MHz vs. 120 MHz), the GTS's texel fillrate was roughly 3.3 times that of its predecessor, the GeForce 256 (480 Mtexels/s). Other hardware enhancements included an upgraded video-processing pipeline called HDVP (High-Definition Video Processor). HDVP supported motion-video playback at HDTV resolutions (MPEG-2 MP@HL), although playback of high-resolution video still required a powerful CPU. The GeForce2 also introduced the NVIDIA Shading Rasterizer (NSR), a primitive form of what are known today as pixel shaders (the GeForce 256 also had this feature, but it was never publicly announced).
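
As a rough illustration (a back-of-the-envelope sketch, not figures from the cited reviews), the texel fillrates quoted above follow directly from the pipeline configuration and core clock:

    # Peak texel fillrate = pixel pipelines x TMUs per pipeline x core clock.
    # Illustrative arithmetic only; configurations are as described above.

    def texel_fillrate_mtexels(pipelines, tmus_per_pipe, core_mhz):
        """Peak texel fillrate in megatexels per second."""
        return pipelines * tmus_per_pipe * core_mhz

    geforce256 = texel_fillrate_mtexels(4, 1, 120)    # 480 Mtexels/s
    geforce2_gts = texel_fillrate_mtexels(4, 2, 200)  # 1600 Mtexels/s
    print(geforce2_gts / geforce256)                  # ~3.3x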

In 3D benchmarks and gaming applications, the GTS outperformed its predecessor, the GeForce 256, by up to 40%.[1] In OpenGL games (such as Quake III), the GTS outperformed the ATI Radeon and 3dfx Voodoo 5 cards in both 16-bpp and 32-bpp (true-color) display modes. In Direct3D games, however, the Radeon was sometimes able to take the lead in 32-bit color modes.[2]

As studies of the architecture used for the GeForce 256 and GeForce2 progressed, it was determined to be severely memory-bandwidth constrained.[3] The chips wasted memory bandwidth and pixel fillrate on unoptimized z-buffer usage, drawing of hidden surfaces, and relatively inefficient RAM controllers. The main competitor to these two chips, the ATI Radeon DDR, had effective optimizations (HyperZ) that combated these issues.[4] Because of the memory bandwidth constraints, the GeForce chips could not approach their theoretical performance potential, and the Radeon, despite its smaller pipeline count, offered strong competition simply through greater efficiency. The later NV17 revision of the NV10 design, used for the GeForce4 MX, was far more efficient in memory management; although the GeForce4 MX 460 was a 2x2-pipeline design, it outperformed the GeForce2 Ultra.
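
To see why bandwidth, rather than raw fillrate, was the bottleneck, a simplified estimate helps. Assuming (as a deliberate simplification) that each pixel written in 32-bit color costs one color write plus a Z read and a Z write, and ignoring texture traffic entirely:

    # Simplified bandwidth-limited fillrate estimate for the GeForce2 GTS.
    # The 12 bytes/pixel figure is an assumption for illustration; real
    # traffic (texture fetches, overdraw, blending) is higher still.

    mem_bandwidth = 5.3e9          # GTS memory bandwidth in bytes/s (see table below)
    bytes_per_pixel = 4 + 4 + 4    # 32-bit color write + Z read + Z write

    limit = mem_bandwidth / bytes_per_pixel / 1e6
    print(f"~{limit:.0f} Mpixels/s")  # ~442 Mpixels/s

Even this optimistic estimate lands near 440 Mpixels/s, barely half the chip's 800 Mpixels/s theoretical rate, consistent with reviewers' findings that 32-bit performance was bandwidth-bound.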

Additionally, the state of PC gaming software at the time, and the then-new DirectX 7 API, likely limited the number of games able to take advantage of hardware multitexturing (the most significant difference between the GeForce 256 and the GeForce2). Most games emphasized single-layer texturing on surfaces, which did not benefit from the multitexturing hardware found on the GeForce2 or Radeon.
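
For intuition, the sketch below contrasts the two approaches in a toy software rasterizer: with two TMUs per pipeline, a diffuse texture and a lightmap can be combined in a single pass, whereas single-TMU hardware must draw the scene twice and blend. This is a hypothetical illustration, not actual driver or game code:

    # Toy illustration of single-pass multitexturing vs. two-pass rendering.
    # 'diffuse' and 'lightmap' stand in for sampled texture values per pixel.

    def shade_multitexture(diffuse, lightmap):
        # One pass: both textures sampled and modulated per pixel (2 TMUs).
        return [[d * l for d, l in zip(dr, lr)]
                for dr, lr in zip(diffuse, lightmap)]

    def shade_two_pass(diffuse, lightmap):
        # Two passes: write the diffuse layer, then re-rasterize and
        # modulate with the lightmap, doubling framebuffer traffic.
        framebuffer = [row[:] for row in diffuse]      # pass 1
        for y, row in enumerate(lightmap):             # pass 2 (blend)
            for x, light in enumerate(row):
                framebuffer[y][x] *= light
        return framebuffer

    diffuse = [[0.8, 0.6], [0.4, 1.0]]
    lightmap = [[0.5, 1.0], [1.0, 0.25]]
    assert shade_multitexture(diffuse, lightmap) == shade_two_pass(diffuse, lightmap)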

GeForce2 cores

There were three more revisions of the GeForce2 GTS core. The first was the GeForce2 Ultra, launched in late 2000. Architecturally identical to the GTS, the Ultra simply shipped with higher core and memory clocks. Some speculate that the Ultra was intended to defeat 3dfx's Voodoo 5 6000; later tests showed the Ultra outperforming it, though the Voodoo 5 6000 never reached the consumer market. Nonetheless, the Ultra opened a clear lead over the Radeon and Voodoo 5, and even outperformed the first GeForce3 products. The GeForce3 initially had a lower texture fillrate than the GeForce2 Ultra; only the GeForce3 Ti 500, in late 2001, finally overtook it.

The other two GTS revisions were the GeForce2 Pro and the GeForce2 Ti ("titanium"). Both parts fell between the GTS and the Ultra in performance and were positioned as cheaper, but less feature-rich, alternatives to the high-end GeForce3, which lacked a mass-market version. The GeForce2 Ti, the final incarnation of the GTS core, performed competitively with the Radeon 7500, although the 7500 had the advantage of dual-display support. The GeForce2 Ti was released in the summer of 2001 but did not last long, being replaced by the faster, dual-monitor-capable GeForce4 MX 440 in January 2002.

GeForce2 MX

The most successful GeForce2 part was the budget-model GeForce2 MX. Like its predecessor, the RIVA TNT2 M64, the GeForce2 MX was extremely popular with OEM system builders because of its low cost and relatively competent 3D feature set. The MX retained the GTS's core 3D architecture and features, but with two of the four pixel pipelines removed and half of the GTS's memory bandwidth. NVIDIA also added true dual-display support to the MX: the GTS and subsequent non-MX models could drive a separate TV encoder, but that second display was always tied to the primary desktop.

ATI's competing Radeon VE (later Radeon 7000) had better dual-monitor display software but did not offer hardware T&L. The Radeon SDR was released late, lacked multi-display support, and was not priced competitively with the MX. In addition to arriving early and achieving the best price/performance ratio, the MX and the rest of the GeForce2 line were backed by a single, reliable driver, unlike ATI, whose products suffered from unreliable drivers at the time.

The MX performed well enough to be a viable mainstream alternative to the GTS (and its later revisions). Among gamers, the MX effectively replaced the older RIVA TNT2 cards. NVIDIA eventually split the MX product line into performance-oriented (MX400) and cost-oriented (MX200/MX100) versions. The MX400, like the original MX, had a 128-bit SDR memory bus that could also be configured as a 64-bit DDR bus. The MX200 had a 64-bit SDR memory bus, greatly limiting its gaming potential, and the lowest model, the MX100, was equipped with a mere 32-bit SDR memory bus.[5]
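
The bus widths translate directly into the bandwidth gap between the MX variants. A small sketch, assuming a typical 166 MHz memory clock (an assumption for illustration; board vendors varied), also shows why the MX400's optional 64-bit DDR configuration lands at roughly the same bandwidth as its 128-bit SDR one:

    # Peak memory bandwidth = bus width x clock (x2 for DDR).
    # The 166 MHz clock is assumed; actual boards varied slightly.

    def bandwidth_gb_s(bus_bits, clock_mhz, ddr=False):
        """Peak memory bandwidth in GB/s."""
        return bus_bits / 8 * clock_mhz * (2 if ddr else 1) / 1000

    print(bandwidth_gb_s(128, 166))           # MX / MX400, 128-bit SDR: ~2.7 GB/s
    print(bandwidth_gb_s(64, 166, ddr=True))  # MX400, 64-bit DDR: also ~2.7 GB/s
    print(bandwidth_gb_s(64, 166))            # MX200, 64-bit SDR: ~1.3 GB/s
    print(bandwidth_gb_s(32, 166))            # MX100, 32-bit SDR: ~0.7 GB/s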

The GeForce2 MX was later used by NVIDIA as the integrated graphics of its nForce line of motherboard chipsets for AMD Athlon and Duron processors. It was also used in notebooks as the GeForce2 Go, a reduced-power variant drawing only 2.6 W at peak.[6]

Successor

The successor to the GeForce2 (non-MX) line was the GeForce3. The MX line was replaced in its market position by the GeForce4 MX. The new MX was codenamed NV17, meaning it was generally of the same family as the GeForce2; however, the GeForce4 MX gained a far superior memory controller, more efficient memory-bandwidth usage, and a multisampling anti-aliasing unit, all from the GeForce4 Ti line.[7] As a result, the GeForce4 MX 440 and 460 achieved performance similar to or better than the GeForce2 GTS line (even beating the "brute-force" GeForce2 Ultra), and the GeForce4 MX also offered dual-monitor support.

Models

Chip      Triangles/s (millions)   Pixel fillrate (Mpixels/s)   Texel fillrate (Mtexels/s)   Memory bandwidth (GB/s)
Ultra     31                       1000                         2000                         7.3
Ti        31                       1000                         2000                         6.4
Pro       25                       800                          1600                         6.4
GTS       25                       800                          1600                         5.3
MX DDR    20                       350                          700                          2.3
MX400     20                       400                          800                          2.7
MX200     20                       350                          700                          1.35
MX100     < 20                     286                          572                          0.68
Go        < 20                     286                          572                          ~1.35

References

  1. ^ Lal Shimpi, Anand. NVIDIA GeForce 2 GTS, Anandtech, April 26, 2000.
  2. ^ Witheiler, Matthew. ATI Radeon 64MB DDR, Anandtech, July 17, 2000.
  3. ^ Lal Shimpi, Anand. NVIDIA GeForce2 Ultra, Anandtech, August 14, 2000.
  4. ^ Lal Shimpi, Anand. ATI Radeon 256 Preview (HyperZ), Anandtech, April 25, 2000: p.5.
  5. ^ Worobyev, Andrey. Leadtek WinFast GeForce2 MX 64 MBytes, Leadtek WinFast GeForce2 MX64 and Leadtek WinFast GeForce2 MX MAX on NVIDIA GeForce2 MX, MX200 and MX400, Digit-Life, accessed September 6, 2006.
  6. ^ Smith, Rick. NVIDIA GeForce2 Go: High performance graphics to go, Reviews OnLine, December 13, 2000.
  7. ^ Freeman, Vince. VisionTek Xtasy GeForce4 MX 440 Review, Sharky Extreme, April 19, 2002: p.3.

NVIDIA Gaming Graphics Processors
Early Chips: NV1 · NV2
DirectX 5/6: RIVA 128 · RIVA TNT · RIVA TNT2
DirectX 7.x: GeForce 256 · GeForce 2
DirectX 8.x: GeForce 3 · GeForce 4
DirectX 9.x: GeForce FX · GeForce 6 · GeForce 7
Direct3D 10: GeForce 8

Other NVIDIA Technologies
nForce: 220/415/420 · 2 · 3 · 4 · 500 · 600 · SoundStorm
Professional Graphics: Quadro · Quadro Plex
Graphics Card Related: TurboCache · SLI
Software: Gelato · Cg
Consumer Electronics: GoForce
Game Consoles: Xbox · PlayStation 3