GeForce 4 Series

NVIDIA GeForce 4 Series
Codenames: NV17 (MX), NV25 (Ti)
Release date: 2002
Entry-level GPUs: GeForce4 MX line
Mid-range GPUs: GeForce4 Ti 4200, Ti 4400
High-end GPUs: GeForce4 Ti 4600, Ti 4800
DirectX version: 8.1

A GeForce4 (codenames below) is a fourth-generation graphics processing unit (GPU) manufactured by NVIDIA which forms the basis of many computer graphics cards. Strictly speaking, the GeForce4 is the chip, not the entire card, but in common usage this distinction tends to be ignored.

There are two different GeForce4 families, the high-performance Ti family, and the budget MX family. The MX family spawned a mostly identical GeForce4 Go (NV17M) family for the laptop market. All three families were announced in early 2002. There was a short-lived attempt to form a fourth family in late 2002, with the GeForce4 4200 Go (NV28M) derived from the Ti line.

GeForce4 cores

  • NV17 - GeForce4 MX (AGP4X)
  • NV18 - GeForce4 MX (NV17 with AGP8X)
  • NV19 - GeForce4 MX (NV18 with PCI Express via AGP-PCIe bridge)
  • NV25 - GeForce4 Ti (AGP4X)
  • NV28 - GeForce4 Ti (NV25 with AGP8X)

GeForce4 Ti

The GeForce4 Ti (NV25) was launched in April 2002 as a revision of the GeForce 3. It was very similar to its predecessor; the main differences were higher core and memory clock speeds, a revised memory controller, improved vertex and pixel shaders, and hardware-accelerated anti-aliasing and DVD playback. Proper dual-monitor support was also brought over from the GeForce 2 MX. The GeForce4 Ti outperformed the older GeForce 3 by a noticeable, though not dramatic, margin. ATI had planned an update to the Radeon 8500 known as the 8500XT (codenamed R250), which would have been on par with the GeForce4 Ti, but this was dropped in favor of the Radeon 9700 (R300).

The initial two models were the Ti4400 and the top-of-the-range Ti4600. At the time of their introduction, NVIDIA's main products were the entry-level GeForce 2 MX, the midrange GeForce4 MX models (released at the same time as the Ti4400 and Ti4600), and the older but still high-performance GeForce 3 (demoted to the upper mid-range or performance niche). However, ATI's Radeon 8500LE was somewhat cheaper than the Ti4400 and outperformed its price competitors, the GeForce 3 Ti200 and GeForce4 MX 460. The GeForce 3 Ti500 filled the performance gap between the Ti200 and the Ti4400, but it could not be produced cheaply enough to compete with the Radeon 8500.

In consequence, NVIDIA rolled out a slightly cheaper model: the Ti4200. Although the Ti4200 was initially supposed to be part of the GeForce4 launch, NVIDIA delayed its release to sell off the soon-to-be-discontinued GeForce 3 chips. In an attempt to prevent the Ti4200 from damaging the Ti4400's sales, NVIDIA set the Ti4200's memory speed at 222 MHz on models with a 128 MiB frame buffer, a full 53 MHz slower than the Ti4400 (all of which had 128 MiB frame buffers). Models with a 64 MiB frame buffer were set to a 250 MHz memory speed. This tactic failed, for two reasons. First, the Ti4400 was perceived as adequate neither for those who wanted top performance (who preferred the Ti4600) nor for those who wanted good value for money (who typically chose the Ti4200), causing the Ti4400 to fade into obscurity. Second, many graphics card makers simply ignored NVIDIA's guidelines for the Ti4200 and set the memory speed at 250 MHz on the 128 MiB models anyway.

By delaying the Ti4200's release and failing to roll out 128 MiB models quickly enough, NVIDIA also missed a chance to dominate the upper-range/performance segment, even though the Ti4200 was cheaper and faster than the previous top-line GeForce 3 and Radeon 8500. Compounding this, the limited-release 128 MiB models of the GeForce 3 Ti200 proved unimpressive, letting the Radeon 8500LE, and even the full 8500, dominate the upper range for a while.

In late 2002 the NV25 core was replaced by the NV28 core, which differed only in its addition of AGP 8X support. The NV28-based Ti4200s all had their memory set at 250 MHz regardless of frame buffer size. NVIDIA also, rather surprisingly, chose to make a Ti4400 equivalent, the Ti4800SE. The top-end NV28 core was the Ti4800, which, despite a name suggesting higher performance than the Ti4600, was clocked identically to it.

Performance-wise, all the GeForce4 Ti chips were faster than GeForce 3 or Radeon 8500 based chips. Despite its delayed introduction, the Ti4200 remained the best balance of price and performance until the launch of the Radeon 9500 Pro at the end of 2002. The Ti4200 still managed to hold its own against the next-generation DirectX 9 chips released in late 2003, beating out the lackluster GeForce FX 5200 and the midrange FX 5600 and performing at parity with the midrange Radeon 9600. The Ti4600 was generally beaten by the Radeon 9700 but maintained an advantage in OpenGL software.

The only mobile derivative of the Ti series was the GeForce4 4200 Go (NV28M), launched in late 2002. It offered a feature set and performance similar to the NV28-based Ti4200, although the mobile variant was clocked lower. Although it outperformed the Mobility Radeon 9000 by a large margin and was NVIDIA's first DirectX 8 laptop graphics solution, the 4200 Go had tremendous overheating problems, in part because the Ti line, unlike the MX line, had not been designed with a mobile variant in mind. In contrast to the Ti4200, the 4200 Go was a short-lived product that never caught on.

GeForce4 Ti chip table

NOTE: These are the official specifications dictated by NVIDIA; in practice the speeds tended to vary. All GeForce4 Ti chips use a 128-bit memory bus. The table is in ascending order, from the slowest to the fastest.

Chip        Core   Core config   Core clock (MHz)   Memory clock (MHz)   Memory config     Bandwidth (GB/s)   Interface
Ti4200      NV25   4x2           250                250*/222             64/128 MiB DDR    8.0/7.1            AGP 4X
Ti4200-8X   NV28   4x2           250                256                  128 MiB DDR       8.2                AGP 8X
Ti4400      NV25   4x2           275                275                  128 MiB DDR       8.8                AGP 4X
Ti4800SE    NV28   4x2           275                275                  128 MiB DDR       8.8                AGP 8X
Ti4600      NV25   4x2           300                325                  128 MiB DDR       10.4               AGP 4X
Ti4800      NV28   4x2           300                325                  128 MiB DDR       10.4               AGP 8X
  • * The 64 MiB version of the Ti4200 has a higher memory clock (250 MHz) than the 128 MiB version (222 MHz).
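The bandwidth column follows directly from the memory clock and the 128-bit bus: peak bandwidth is the clock rate times two transfers per cycle (for DDR) times 16 bytes per transfer. A minimal sketch of that arithmetic (peak_bandwidth_gbps is a hypothetical helper for illustration, not part of any NVIDIA tooling):

```cpp
#include <cstdio>

// Peak memory bandwidth in GB/s from clock (MHz), bus width (bits),
// and whether the memory transfers data on both clock edges (DDR).
double peak_bandwidth_gbps(double clock_mhz, int bus_bits, bool ddr) {
    double transfers_per_clock = ddr ? 2.0 : 1.0;
    double bytes_per_transfer = bus_bits / 8.0;
    return clock_mhz * transfers_per_clock * bytes_per_transfer / 1000.0;
}

int main() {
    // All GeForce4 Ti chips use a 128-bit DDR bus.
    printf("Ti4600/Ti4800:    %.1f GB/s\n", peak_bandwidth_gbps(325, 128, true)); // 10.4
    printf("Ti4400/Ti4800SE:  %.1f GB/s\n", peak_bandwidth_gbps(275, 128, true)); // 8.8
    printf("Ti4200 (128 MiB): %.1f GB/s\n", peak_bandwidth_gbps(222, 128, true)); // 7.1
    return 0;
}
```

The same arithmetic reproduces most of the MX table below; the SDR parts use one transfer per clock, and the lower MX 4000/PCX 4300 figures imply an effective bus narrower than 128 bits on those parts.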

GeForce4 MX

[Image: GeForce4 MX440-SE GPU]

If the capabilities of the GeForce4 family are defined by the GeForce4 Ti, then the GeForce4 MX (NV17) is a GeForce4 in name only. Many criticized the GeForce4 MX name as a misleading marketing ploy, since the chip was less advanced than the preceding GeForce 3. On its release, disappointed enthusiasts described the GeForce4 MX as a "GeForce 2 on steroids": essentially a GeForce 2 Ti with the 128-bit DDR memory controller taken from the GF4 Ti series.

The GeForce4 MX lacked the programmable vertex and pixel shaders of its bigger brother, the GeForce4 Ti. While this did not directly impact speed, advanced DirectX 8 rendering effects were not possible. The MX also owed a good deal of its design heritage to NVIDIA's high-end CAD products, and in performance-critical non-game applications it was remarkably effective. (The most notable example is AutoCAD, in which the GeForce4 MX returned results within a single-digit percentage of Ti cards six or seven times its price.) The GeForce4 MX 440 was able to outperform the old GeForce 2 Ultra, and the MX had a more efficient and cost-effective design than the Ultra's "brute-force" approach.
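The missing shaders are exactly what DirectX 8 games tested for at startup. A minimal sketch of such a check using the Direct3D 8 caps structure (error handling trimmed; the thresholds reflect the GeForce4 Ti's vertex shader 1.1 and pixel shader 1.3 support):

```cpp
#include <d3d8.h>

// Query shader support the way a DirectX 8 game might at startup.
// A GeForce4 Ti reports VS 1.1 / PS 1.3 and passes this check;
// a GeForce4 MX reports 0.0 for both, forcing a fixed-function fallback.
bool SupportsDX8Shaders() {
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS8 caps;
    HRESULT hr = d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    d3d->Release();

    return SUCCEEDED(hr)
        && caps.VertexShaderVersion >= D3DVS_VERSION(1, 1)
        && caps.PixelShaderVersion  >= D3DPS_VERSION(1, 1);
}
```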

As the MX line was launched along with the rest of the GeForce4 family in early 2002, id Software technical director John Carmack worried about the GeForce4 MX's potential success. Fearing that widespread adoption of the MX would set back the development of advanced games using DirectX 8 vertex and pixel shaders, he warned gamers not to buy the chip. However, in mid-2004, Carmack's Doom 3 was released with support for the GeForce4 MX; notably, the MX is the only chip on the supported list without DirectX 8 vertex and pixel shaders. In addition, the number of advanced games did not grow as quickly as expected.

Despite harsh criticism from gaming enthusiasts, the GeForce4 MX was a market success. Priced about 30% above the GeForce 2 MX, it provided marginally better performance and the ability to play (however slowly) a number of popular games that the GeForce 2 was not compatible with; above all else, to the average non-specialist it sounded like a "real" GeForce4, i.e., a GeForce4 Ti. Although it was frequently outperformed by the older and more expensive GeForce 3, many buyers were unaware of this, particularly as NVIDIA was quick to pull the GeForce 3 from the market (even discontinuing the successful Ti200). The MX was particularly successful in the PC OEM market, and it rapidly replaced the GeForce 2 MX as the best-selling GPU.

There were three initial models: the MX420, the MX440, and the MX460. The MX420 was designed for very low-end PCs, the MX440 was a mass-market OEM solution, and the MX460 was a midrange solution. While the MX460 was not slow by any means, it was priced not far below the GeForce4 Ti4200, the GeForce 3 Ti200, and the Radeon 8500LE/9100 (even the full 8500 in some cases), all of which easily outperformed it while also offering DirectX 8.0 support. The end result was that the MX460 never had anywhere to go in the market, and it flopped.

In terms of 3D performance, the MX420 performed only slightly better than the GeForce 2 MX400 and below the GeForce 2 GTS, but this was never much of a problem given its target audience. The nearest thing to a direct competitor the MX420 had was ATI's Radeon 7000. In practice, however, its main competitors were chipset-integrated graphics solutions, such as Intel's 845G and NVIDIA's own nForce 2.

The MX440 performed reasonably well for its intended audience, outperforming its closest competitor, the ATI Radeon 7500, as well as the discontinued GeForce 2 Ti and Ultra. When ATI launched its Radeon 9000 Pro in September 2002, it performed about the same as the MX440 but had crucial advantages in better single-texturing performance and proper support for vertex and pixel shaders. However, the 9000 was unable to break the MX440's entrenched hold on the OEM market. NVIDIA's answer to the Radeon 9000 was the GeForce FX 5200, but despite the 5200's DirectX 9 features, it did not have the performance to match the MX440 even in the games of the day. This kept the MX440 in production while the 5200 was discontinued; ironically, the MX440 outlived the very chip that was supposed to replace it.

In motion-video applications, the GeForce4 MX did offer new functionality. The GeForce4 MX (and not the GeForce4 Ti) was the first GeForce member to feature the VPE (video processing engine). It was likewise the first GeForce to offer hardware iDCT (inverse discrete cosine transform) and VLC (variable-length code) decoding, making VPE a major upgrade over NVIDIA's previous HDVP. In MPEG-2 playback, VPE could finally compete head-to-head with ATI's outstanding video engine.
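For context, the iDCT that VPE offloads is the 8x8 inverse discrete cosine transform applied to every coefficient block of an MPEG-2 frame. A naive reference sketch of that computation (for illustration only; real decoders, and fixed-function hardware like VPE, use fast factorizations rather than this O(n^4) form):

```cpp
#include <cmath>

// Reconstruct an 8x8 block of samples f from DCT coefficients F,
// per the IDCT definition used by MPEG-2.
void idct8x8(const double F[8][8], double f[8][8]) {
    const double PI = 3.14159265358979323846;
    for (int x = 0; x < 8; ++x) {
        for (int y = 0; y < 8; ++y) {
            double sum = 0.0;
            for (int u = 0; u < 8; ++u) {
                for (int v = 0; v < 8; ++v) {
                    const double cu = (u == 0) ? 1.0 / std::sqrt(2.0) : 1.0;
                    const double cv = (v == 0) ? 1.0 / std::sqrt(2.0) : 1.0;
                    sum += cu * cv * F[u][v]
                         * std::cos((2 * x + 1) * u * PI / 16.0)
                         * std::cos((2 * y + 1) * v * PI / 16.0);
                }
            }
            f[x][y] = sum / 4.0;
        }
    }
}
```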

The GeForce4 Go was derived from the MX line and was announced along with the rest of the GeForce4 lineup in early 2002. The models were the 420 Go, 440 Go, and 460 Go. However, ATI had beaten NVIDIA to market with the Mobility Radeon 7500. (Despite its name, the short-lived 4200 Go is not part of this lineup; it was derived from the Ti line.)

Like the Ti series, the MX was updated in late 2002 to support AGP 8X with the NV18 core. The two new models were the MX440-8X, which was clocked slightly faster than the original MX440, and the MX440SE, which had a narrower memory bus and was intended as a replacement of sorts for the MX420. The MX460 was never updated; in fact, it had been discontinued several months earlier. Another variant followed in late 2003: the MX 4000, a GeForce4 MX440SE with a slightly higher memory clock.

Surprisingly, the GeForce4 MX line received a third update in 2004 with the PCX 4300, an MX 4000 with support for PCI Express and a wider memory bus. In spite of its new codename (NV19), the PCX 4300 is in fact simply an NV18 core with a bridge chip connecting the NV18's native AGP interface to the PCI Express bus.

GeForce4 MX chip table

NOTE: These are the official specifications dictated by NVIDIA; in practice the speeds tended to vary. The table is ordered from slowest to fastest.

Chip       Core    Core config   Core clock (MHz)   Memory clock (MHz)   Memory config        Bandwidth (GB/s)   Interface
MX420      NV17    2x2           250                166                  64/128 MiB SDR       2.7                AGP 4X
MX440SE    NV18    2x2           250                166                  64/128 MiB SDR/DDR   2.7/5.3            AGP 8X
MX440      NV17    2x2           270                200                  64/128 MiB DDR       6.4                AGP 4X
MX440-8X   NV18    2x2           275                256                  64/128 MiB DDR       8.2                AGP 8X
MX460      NV17    2x2           300                275                  128 MiB DDR          8.8                AGP 4X
MX4000     NV18B   2x2           275                200                  128 MiB DDR          5.3                AGP 8X
PCX4300    NV19    2x2           275                200                  128 MiB DDR          5.3                PCIe x16

GeForce4 Go

This family is a derivative of the GeForce4 MX family, produced for the laptop market. Performance-wise, the GeForce4 Go family is comparable to the MX line. In terms of support, however, some users have been irritated by an uncharacteristic lack of driver support from NVIDIA: instead of supporting this family of chips itself, NVIDIA redirects users to the laptop manufacturer's website.

One possible workaround for the lack of driver support for the Go family is the third-party Omega Drivers, which are essentially stock drivers modified to deliver performance increases of up to 30-40% without overclocking. However, installing them carries risks: using third-party drivers can, among other things, invalidate warranties, and the Omega Drivers are supported neither by laptop manufacturers and ODMs nor by NVIDIA, which has also recently attempted legal action against them. Among the expert user base, the warranty invalidation is usually seen as a corporate safety net rather than a genuine warning that devices will fail.

NVIDIA's own solution to the problem is to direct users to drivers from www.laptopvideo2go.com. This website hosts desktop display drivers that have been modified to install on a notebook. The drivers found there contain no laptop-specific modifications and thus may or may not work better than the drivers provided by the laptop's manufacturer.

Known problems

Some users have experienced overheating problems because the card's on-board fan has slowed or stopped entirely due to dust. Users whose 3D games crash or hang momentarily should upgrade to the latest drivers and check that the fan is running. If it is not spinning, the graphics card should be removed and the fan spun manually; it should spin freely. If it does not, it may need replacement or lubrication (Dan's Data Fan Maintenance).

Older driver versions (53.xx, for example) for the NV18-based GeForce4 MX and GeForce4 Go supported vertex shader model 1.1 via hardware-assisted software emulation, but at some point this support was dropped completely; newer drivers report vertex shader model 0.0. In games that can take advantage of vertex shading, using the older drivers can therefore yield a significant performance increase, and some games that require pixel and vertex shaders will not run at all on the newer drivers.
