NVIDIA
From Wikipedia, the free encyclopedia
| NVIDIA Corporation | |
|---|---|
| Type | Public (NASDAQ: NVDA) |
| Founded | 1993 |
| Headquarters | Santa Clara, California, USA |
| Key people | Jen-Hsun Huang (co-founder, president and CEO) |
| Industry | Semiconductors (specialized) |
| Products | Graphics processing units, motherboard chipsets |
| Revenue | US$3.07 billion (2006) |
| Net income | US$448.8 million (2006) |
| Employees | Over 3,000 (2006) |
| Slogan | The Way It's Meant to Be Played |
| Website | www.nvidia.com |
NVIDIA Corporation (NASDAQ: NVDA) (pronounced /ɛnˈvɪdɪə/) is an American corporation and a major supplier of microchips used in personal computer motherboard chipsets, graphics processors (graphics processing units, or GPUs), graphics cards, and media and communications devices for PCs and game consoles such as the original Xbox and the PlayStation 3. NVIDIA's most popular product lines are the GeForce series for gaming and the Quadro series for professional workstation graphics, as well as the nForce series of chipsets. Its headquarters are located at 2701 San Tomas Expressway, Santa Clara, California.
Company history
The name NVIDIA suggests "envy" (Spanish envidia or Latin/Italian invidia) and is designed to sound like the word video.
The company was co-founded in 1993 by Jen-Hsun Huang, the present CEO, together with Curtis Priem and Chris Malachowsky.
In 2000, it acquired the intellectual assets of one-time rival 3dfx, one of the biggest graphics companies of the mid-to-late 1990s.
On December 14, 2005, NVIDIA acquired ULi Electronics, which at the time was also supplying third-party southbridge parts for chipsets from NVIDIA's competitor ATI. In March 2006, NVIDIA acquired Hybrid Graphics,[1] and on January 5, 2007, it announced that it had completed the acquisition of PortalPlayer, Inc.[2]
Products
NVIDIA's product portfolio includes graphics processors, wireless communications processors, PC platform (motherboard core-logic) chipsets, and digital media player software. Within the Mac/PC user community, NVIDIA is best known for its "GeForce" product line, which is not only a complete line of "discrete" graphics chips found in AIB (add-in-board) video cards, but also a core technology in both the Microsoft Xbox game console and nForce motherboards.
In many respects NVIDIA is similar to its competitor ATI: both companies began with a focus on the PC market and later expanded into chips for non-PC applications. NVIDIA does not sell graphics boards into the retail market, focusing instead on the development and manufacture of GPU chips. As part of their operations, both ATI and NVIDIA create "reference designs" (board schematics) and provide manufacturing samples to board partners such as ASUS.
In December 2004, it was announced that NVIDIA would assist Sony with the design of the graphics processor (RSX) in the PlayStation 3 game console. As of March 2006, NVIDIA was to deliver the RSX to Sony as an IP core, with Sony alone responsible for manufacturing it. Under the agreement, NVIDIA will provide ongoing support to port the RSX to Sony's fabs of choice (Sony and Toshiba), as well as die shrinks to 65 nm. This is a departure from NVIDIA's business arrangement with Microsoft, in which NVIDIA managed production and delivery of the Xbox GPU through its usual third-party foundry contracts. (Microsoft has since chosen ATI to provide the IP design for the Xbox 360's graphics hardware, as has Nintendo for the Wii console that supersedes the ATI-based GameCube.)
- "Discrete" usually refers to the graphic chip's boundary/proximity to other PC hardware. A discrete piece of hardware can be physically plugged/unplugged from the motherboard, the opposite term being "integrated graphics" where the piece of hardware is inseparable from the motherboard. However in the PC graphics architecture context, "discrete" means graphics-hardware is encapsulated in a dedicated (separate) chip. The chip's physical location, whether soldered on the motherboard PCB (as in most laptops) or mounted on an aftermarket add-in-board, has no bearing on this designation.
Graphics chipsets
- NV1 – NVIDIA's first product based upon quadratic surfaces
- RIVA 128 and RIVA 128ZX – DirectX 5 and OpenGL 1 support; NVIDIA's first DirectX-compliant hardware
- RIVA TNT, RIVA TNT2 – DirectX 6 and OpenGL 1 support; the series that made NVIDIA a market leader
- NVIDIA GeForce
- GeForce 256 – DirectX 7 support, OpenGL 1 support, hardware transform and lighting, introduces DDR memory support
- GeForce 2 – DirectX 7 support, OpenGL 1 support
- GeForce 3 Series – DirectX 8.0 shaders, OpenGL 1.2 support, features memory bandwidth saving architecture
- GeForce 4 Series – DirectX 8.1 parts (except for MX), OpenGL 1.4 and a new budget core (known as MX) that was based on the GeForce 2
- GeForce FX series – DirectX 9 support, OpenGL 1.5 and claimed to offer 'cinematic effects'
- GeForce 6 Series – DirectX 9.0c support, OpenGL 2.0 support, features improved shaders, reduced power consumption and Scalable Link Interface-operation
- GeForce 7 Series – DirectX 9.0c support, WDDM (Windows Display Driver Model) support, OpenGL 2.0 support, improved shading performance, Transparency Supersampling (TSAA) and Transparency Multisampling (TMAA) anti-aliasing, Scalable Link Interface (SLI)
- GeForce 8 Series – DirectX 9.0c, 9.0 EX and DirectX 10 support, Unified Shader Architecture consisting of Pixel, Vertex and Geometry shaders (SM 4.0), Luminex Engine features Coverage Sampled Antialiasing (CSAA), Quantum Effects Technology
- NVIDIA Quadro – high-quality workstation solutions
- NVIDIA GoForce – media processors for PDAs, smartphones, and mobile phones, featuring nPower technology
- GoForce 2150 – 1.3 Megapixel camera support, JPEG support, and 2D speed enhancement
- GoForce 3000 – A low-cost version of the GoForce 4000 with limited features
- GoForce 4000 – 3.0 Megapixel camera support and MPEG-4/H.263 codec
- GoForce 4500 – Was used in the Gizmondo, features 3D graphics support with a geometry processor and programmable pixel shaders
- GoForce 4800 – 3.0 Megapixel camera support and a 3D graphics engine
- GoForce 5500 – 10.0 Megapixel camera support, 3D graphics engine version 2, 24-bit audio engine, and H.264 support
Personal computer platforms/chipsets
- NVIDIA nForce
- nForce IGP (AMD Athlon/Duron K7 line)
- nForce2 (AMD Athlon/Duron K7 line, SPP (system platform processor) or IGP (Integrated Graphics Platform) and MCP (Media and Communications Processor), also features SoundStorm)
- nForce3 (AMD Athlon 64/Athlon 64 FX/Opteron, MCP only)
- nForce4 (AMD Athlon 64/Athlon 64 X2/Athlon 64 FX/Opteron, MCP only; Intel Pentium 4/Pentium D, SPP + MCP)
- nForce 500 (AMD Athlon 64 FX/Athlon 64 X2/Athlon 64/Sempron or Intel Core 2 Extreme/Core 2 Duo/Pentium 4/Celeron D/Pentium D)
- nForce 600 (AMD Quad FX or Intel Core 2 Quad/Core 2 Extreme/Core 2 Duo/Pentium 4/Celeron D/Pentium D)
- Xbox GeForce3-class GPU (on an Intel Pentium III/Celeron platform)
- PlayStation 3 (RSX 'Reality Synthesizer')
Market history
Pre-DirectX
NVIDIA's first graphics card, the NV1, was released in 1995. It was based upon quadratic surfaces and included an integrated playback-only sound card and ports for Sega Saturn gamepads. Because the Saturn was also based on forward-rendered quadratics, several Saturn games were ported to PCs with the NV1, such as Panzer Dragoon and Virtua Fighter Remix. However, the NV1 struggled in a marketplace full of competing proprietary standards.
Market interest in the product ended when Microsoft announced the DirectX specifications, which were based upon polygons. Subsequently, NV1 development continued internally as the NV2 project, funded by several million dollars of investment from Sega, which hoped that an integrated sound and graphics chip would cut the manufacturing cost of its next console. However, Sega eventually realized that quadratic surfaces were a flawed approach, and the NV2 was never fully developed.
A fresh start
NVIDIA's CEO Jen-Hsun Huang realized at this point that, after two failed products, something had to change if the company was to survive. He hired David Kirk, Ph.D. as Chief Scientist from software developer Crystal Dynamics, a company renowned for the visual quality of its titles. Kirk turned NVIDIA around by combining the company's 3D hardware experience with an intimate understanding of practical rendering implementations.
As part of the corporate transformation, NVIDIA abandoned proprietary interfaces, sought to fully support DirectX, and dropped multimedia functionality in order to reduce manufacturing costs. It also adopted an internal six-month product cycle, so that the failure of any one product would not threaten the company's survival: a next-generation replacement part would always be nearly ready.
However, since the Sega NV2 contract was secret and employees had been laid off, it appeared to many industry observers that NVIDIA was no longer active in research and development. So when the RIVA 128 was first announced in 1997, its specifications were hard to believe: performance superior to the market leader, 3dfx's Voodoo Graphics, and a full hardware triangle setup engine. The RIVA 128 shipped in volume, and the combination of its low cost and high-performance 2D/3D acceleration made it a popular choice for OEMs.
Ascendancy: RIVA TNT
Having finally developed and shipped a market-leading integrated graphics chipset in volume, NVIDIA set the internal goal of doubling the number of pixel pipelines in its chip, in order to realize a substantial performance gain. The TwiN Texel (RIVA TNT) engine it subsequently developed allowed either two textures to be applied to a single pixel, or two pixels to be processed per clock cycle. The former improved visual quality; the latter doubled the maximum fill rate.
New features included a 24-bit Z-buffer with 8-bit stencil support, anisotropic filtering, and per-pixel MIP mapping. In certain respects, such as transistor count, the TNT had begun to rival Intel's Pentium processors in complexity. However, while it offered an impressive range of quality integrated features, it failed to displace the market-leading Voodoo 2, because its actual clock speed ended up at only 90 MHz, about 35% less than expected.
NVIDIA responded with a refresh part: a die shrink of the TNT architecture from 350 nm to 250 nm. Stock TNT2 parts now ran at 125 MHz, and the Ultra at 150 MHz. Though 3dfx's Voodoo 3 beat it to market, that offering proved disappointing: it was not much faster and lacked features that were becoming standard, such as 32-bit color and support for textures larger than 256×256.
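The clock shortfall translates directly into lost fill rate. The following is a rough, illustrative sketch only: the pipeline count and shipping clocks come from the description above, while the ~140 MHz target clock is an assumption implied by the quoted ~35% shortfall, and real throughput also depends on memory bandwidth.

```python
# Back-of-the-envelope fill-rate arithmetic for the RIVA TNT family.
# Theoretical peaks only; memory bandwidth limited real results.

def pixel_fill_rate(clock_mhz: float, pipelines: int) -> float:
    """Theoretical peak pixel fill rate in megapixels per second."""
    return clock_mhz * pipelines

tnt_shipped = pixel_fill_rate(90, 2)    # 180 Mpixels/s as shipped
tnt_target  = pixel_fill_rate(140, 2)   # 280 Mpixels/s had the clock target held
tnt2        = pixel_fill_rate(125, 2)   # 250 Mpixels/s after the 250 nm shrink
tnt2_ultra  = pixel_fill_rate(150, 2)   # 300 Mpixels/s

print(f"TNT shipped: {tnt_shipped:.0f} vs. ~{tnt_target:.0f} Mpix/s targeted")
print(f"TNT2: {tnt2:.0f} Mpix/s, TNT2 Ultra: {tnt2_ultra:.0f} Mpix/s")
```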
The RIVA TNT2 marked a major turning point for NVIDIA. It had finally delivered a product competitive with the fastest on the market, with a superior feature set and strong 2D functionality, all integrated onto a single die that yielded well and ramped to impressive clock speeds. NVIDIA's six-month refresh cycle took the competition by surprise, giving it the initiative in rolling out new products.
Market leadership: GeForce
The autumn of 1999 saw the release of the GeForce 256 (NV10), most notable for bringing on-board hardware transform and lighting. Running at 120 MHz, it also featured advanced video acceleration, motion compensation, hardware sub-picture alpha blending, and four pixel pipelines. The GeForce outperformed existing products, such as the ATI Rage 128, 3dfx Voodoo 3, Matrox G400 MAX, and RIVA TNT2, by a wide margin.
Due to the success of its products, NVIDIA won the contract to develop the graphics hardware for Microsoft's Xbox game console, which earned it a $200 million advance. However, the project drew on the time of many of NVIDIA's best engineers. In the short term this did not matter, and the GeForce 2 GTS shipped in the summer of 2000.
The GTS benefited from the fact that NVIDIA had by this time acquired extensive manufacturing experience with its highly integrated cores, and as a result was able to optimize the core for clock speed. The volume of chips NVIDIA produced also enabled it to bin-split parts, picking out the highest-quality cores for its premium range. As a result, the GTS shipped at 200 MHz. Its pixel fill rate was nearly double that of the GeForce 256, and its texel fill rate nearly quadruple, because multi-texturing was added to each pixel pipeline. New features included S3TC compression, FSAA, and improved MPEG-2 motion compensation.
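The "nearly double" and "nearly quadruple" figures follow from simple arithmetic on the clocks and pipeline counts cited above. A minimal sketch, assuming four pixel pipelines on both chips and one texture unit per pipeline on the GeForce 256 versus two on the GTS (the per-pipeline texture-unit counts are an assumption implied by "multi-texturing was added to each pixel pipeline"):

```python
# Illustrative arithmetic for the GeForce 256 -> GeForce 2 GTS jump.
# Assumes 4 pixel pipelines on both chips; 1 texture unit per pipeline
# on the GeForce 256 and 2 on the GTS, per the description above.

def rates(clock_mhz: float, pipelines: int, tmus_per_pipe: int):
    pixel = clock_mhz * pipelines      # Mpixels/s
    texel = pixel * tmus_per_pipe      # Mtexels/s
    return pixel, texel

geforce256 = rates(120, 4, 1)   # (480 Mpix/s, 480 Mtex/s)
gts        = rates(200, 4, 2)   # (800 Mpix/s, 1600 Mtex/s)

print(f"pixel rate: {gts[0] / geforce256[0]:.2f}x")  # ~1.67x, "nearly double"
print(f"texel rate: {gts[1] / geforce256[1]:.2f}x")  # ~3.33x, "nearly quadruple"
```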
More significantly, shortly afterwards NVIDIA launched the GeForce 2 MX, intended for the budget/OEM market. It had two fewer pixel pipelines and ran at 175 MHz and, later, 200 MHz. Offering strong performance at a midrange price, the GeForce 2 MX is one of the most successful graphics chipsets of all time. A mobile derivative, the GeForce 2 Go, also shipped at the end of 2000.
NVIDIA's success proved too much for 3dfx to overcome. The long-delayed Voodoo 5, successor to the Voodoo 3, compared unfavourably with the GeForce 2 in both price and performance, and failed to generate the sales needed to keep the company afloat. With 3dfx on the verge of bankruptcy near the end of 2000, NVIDIA purchased the company, primarily for its intellectual property (which was in dispute at the time[1]), but also acquiring anti-aliasing expertise and about 100 engineers.
NVIDIA then developed the GeForce 3, which pioneered DirectX 8 vertex and pixel shaders, and refined it with the GeForce 4 Ti line, while the GeForce 2 was succeeded by the GeForce 4 MX. The GeForce 4 Ti, MX, and Go were all announced in January 2002, one of the largest launches in NVIDIA's history, though the chips within each series differed only in chip and memory clock speeds.
Shortcomings of the FX series
At this point NVIDIA's market position looked unassailable, and industry observers began to refer to the company as the Intel of the graphics industry. However, its major remaining rival, ATI Technologies, stayed competitive with the Radeon, which was mostly on par with the GeForce 2 GTS. Though ATI's answer to the GeForce 3, the Radeon 8500, arrived later and was initially plagued by driver issues, it proved a strong competitor thanks to its lower price and greater potential. NVIDIA countered with the GeForce 4 Ti line, though the Ti 4200's delayed rollout enabled the 8500 to carve out a niche. ATI then opted to work on its next-generation Radeon 9700 rather than a direct competitor to the GeForce 4 Ti.
While the next-generation GeForce FX chips were being developed, many of NVIDIA's best engineers were working on the Xbox contract, including the motherboard solution and the API used as part of the SoundStorm platform. NVIDIA engineers were also contractually obligated to develop newer NV2A chips that were more difficult to hack, which further shortchanged the FX project. The Xbox contract did not account for falling manufacturing costs as process technology improved, and Microsoft sought to renegotiate its terms, withholding the DirectX 9 specifications as leverage. As a result, relations between NVIDIA and Microsoft, which had previously been very good, deteriorated. The two parties later settled the dispute through arbitration, and the terms were not released to the public. The dispute, however, prompted NVIDIA to pass on developing a graphics solution for the succeeding Xbox 360; ATI took on that contract, while NVIDIA decided to work on the PlayStation 3 instead.
Because of the Xbox dispute, NVIDIA was not consulted when the DirectX 9 specification was drawn up, while ATI designed the Radeon 9700 to fit it. Rendering color support was limited to 24-bit floating point, and shader performance was emphasized throughout development, since it was to be the main focus of DirectX 9. The shader compiler was also built using the Radeon 9700 as the base card.
In contrast, NVIDIA's cards offered 16-bit and 32-bit floating-point modes, providing either lower visual quality than the competition or slow performance. The 32-bit support also made them much more expensive to manufacture, as it required a higher transistor count. Shader performance was often half or less the speed of ATI's competing products. Having made its reputation by providing easy-to-manufacture DirectX-compatible parts, NVIDIA had misjudged Microsoft's next standard and paid a heavy price for the error. As more and more games started to rely on DirectX 9 features, the poor shader performance of the GeForce FX series became ever more obvious. With the exception of the FX 5700 series (a late revision), the FX series lacked performance compared to equivalent ATI parts.
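The gap between the precision modes is easy to illustrate. In the sketch below, NumPy's float16 and float32 types stand in for the FX series' FP16 and FP32 shader formats (DirectX 9's FP24 has no common software equivalent, and GPU shader hardware is not literally NumPy); the value chosen is arbitrary:

```python
import numpy as np

# Illustrative only: NumPy float16/float32 as stand-ins for the
# GeForce FX's FP16/FP32 shader precisions.
value = 1.0 / 3.0

fp16 = float(np.float16(value))   # 10-bit mantissa
fp32 = float(np.float32(value))   # 23-bit mantissa

print(f"FP16: {fp16:.10f}  (error {abs(fp16 - value):.2e})")
print(f"FP32: {fp32:.10f}  (error {abs(fp32 - value):.2e})")

# The FP16 rounding error is roughly four orders of magnitude larger.
# When shaders accumulate many such results, banding and similar
# artifacts appear -- the visual-quality trade-off described above.
```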
NVIDIA grew increasingly desperate to hide the shortcomings of the GeForce FX range. A notable "FX only" demo called Dawn was released, but its wrapper was hacked to run on a Radeon 9700, where it ran faster despite the translation overhead. NVIDIA also began to include "optimizations" in its drivers to increase performance. While some users contended that the increased real-world gaming performance was valid, hardware review sites ran articles showing how NVIDIA's drivers auto-detected benchmarks and produced artificially inflated scores that did not relate to real-world performance. Often it was tips from ATI's driver development team that lay behind these articles. As NVIDIA's drivers filled with hacks and "optimizations," their legendary stability and compatibility began to suffer. While NVIDIA partially closed the performance gap with new instruction-reordering capabilities introduced in later drivers, shader performance remained weak and over-sensitive to hardware-specific code compilation. NVIDIA also worked with Microsoft to release an updated DirectX compiler that generated GeForce FX-specific optimized code.
Furthermore, the GeForce FX series ran hot, drawing as much as twice the power of equivalent ATI parts. The GeForce FX 5800 Ultra became notorious for its fan noise, acquiring the nicknames "dustbuster" and "leafblower." Though it was quietly withdrawn and replaced with the quieter 5900, the FX chips still needed large and expensive fans, placing NVIDIA's partners at a manufacturing cost disadvantage compared to ATI's. As a result of Microsoft's actions and the FX series' weaknesses, NVIDIA quite unexpectedly lost its market leadership to ATI.
With the GeForce 6 series, NVIDIA clearly moved beyond the DirectX 9 performance problems that had plagued the previous generation. The GeForce 6 series not only performed competitively where Direct3D shaders were concerned, but also supported DirectX Shader Model 3.0, while ATI's competing X800 series chips supported only the previous 2.0 specification. This proved an insignificant advantage, mainly because games of the period did not employ SM 3.0 extensions, but it demonstrated NVIDIA's intent to design and deliver the newest features on a definite timeframe. What became more apparent during this time was that the two vendors, ATI and NVIDIA, were on a very level playing field in terms of performance. The two companies traded blows in specific titles and on specific criteria (resolution, image quality, anisotropic filtering/anti-aliasing), but the differences were becoming more abstract, and the reigning concern became price-to-performance. The mid-range offerings of the two vendors demonstrated consumers' appetite for affordable, high-performance graphics cards, and it is now this price segment in which much of the vendors' profitability is determined.
The GeForce 7 series was a heavily beefed-up extension of the reliable 6 series. The industry's introduction of the PCI Express bus standard allowed NVIDIA to release SLI, a solution that employs two similar cards to share the rendering workload. While such solutions do not double performance and require more electricity (two cards versus one), they can make a large difference as higher resolutions and settings are enabled and, more importantly, offer more upgrade flexibility. ATI responded with the X1000 series and its own dual-rendering solution, called Crossfire. Sony chose NVIDIA to develop the RSX chip used in the PlayStation 3, a modified version of the 7800 GPU.
NVIDIA released the 8800 series at the end of 2006, making it the first company to support Microsoft's ambitious Direct3D 10 specification.
Current market share
According to a survey[3] conducted by Jon Peddie Research, a leading market-watch firm, on the state of the graphics market in Q2 2006, NVIDIA's share of the overall graphics-chip market remained in third place at 20.30%, while the company was the dominant force in discrete graphics solutions with a market share of about 51.5%.
Windows Vista driver issues
Because the latest NVIDIA drivers were still of beta quality as of the official retail release of Windows Vista, there have been many reports of issues with NVIDIA graphics cards running under the new operating system, including instances of the blue screen of death with the most recent drivers released on the NVIDIA website.[4] Frustrated users have allegedly made rude and derogatory posts in the NVIDIA forums, though these users state they were merely asking for a meaningful response from NVIDIA.[2] A lawsuit against NVIDIA is being contemplated over what some perceive as false advertising; the main complaint concerns NVIDIA's advertising tactics, its home page having advertised "NVIDIA: Essential for the best Windows Vista experience". New WHQL drivers for 32-bit Windows Vista were released on February 20, 2007, but the software still has flaws, and some features, including SLI support for DirectX 10, are not yet available and are announced for mid-to-late Q1 2007.
Documentation and drivers
NVIDIA does not provide the documentation for its hardware that programmers would need in order to write appropriate and effective open-source drivers for its products. Instead, NVIDIA provides its own binary GeForce graphics drivers for X.Org, along with a thin open-source library that interfaces between the Linux, FreeBSD, or Solaris kernel and the proprietary graphics software. NVIDIA's Linux support has promoted its adoption in the entertainment, scientific visualization, defense, and simulation/training industries, traditionally dominated by SGI, Evans & Sutherland, and other relatively costly vendors.
Because of their proprietary nature, NVIDIA's drivers are at the center of an ongoing controversy within the Linux and FreeBSD communities. Many Linux and FreeBSD users insist on using only open-source drivers and regard a binary-only driver as wholly inadequate.[5] Other users, however, are content with the NVIDIA-supported drivers.
The X.Org Foundation and Freedesktop.org have started the Nouveau project, which aims to develop free-software drivers for NVIDIA graphics cards by reverse-engineering NVIDIA's current proprietary drivers for Linux.
Video card manufacturers
NVIDIA does not manufacture video cards, only the GPU chips, though it also sets the chip and memory speeds/configurations for third parties to follow. The cards are assembled by OEMs under one of the following brand names:
- Albatron
- AOpen
- ASUS
- BFG (also under its 3D Fuzion brand)
- BIG
- Biostar
- Chaintech
- Club 3D
- Creative
- ELSA
- eVGA
- Gainward
- Galaxy
- Gigabyte
- Inno3D
- Leadtek
- Mad Dog Multimedia
- Micro-Star International (MSI)
- OCZ
- Palit
- POV
- PNY
- XFX
- Zebronics
- Zogis
See also
- ATI Technologies
- Comparison of ATI Graphics Processing Units
- Comparison of NVIDIA Graphics Processing Units
- Matrox
- NVIDIA Demos
References
- ^ The Register Hardware: Nvidia acquires Hybrid Graphics.
- ^ NVIDIA press release: NVIDIA acquires PortalPlayer, January 5, 2007.
- ^ http://www.xbitlabs.com/news/video/display/20060731234259.html
- ^ http://forums.techarena.in/showthread.php?t=672401
- ^ Linux Weekly News, August 14, 2006: X.org, distributors, and proprietary modules.
External links
- NVIDIA.com – Corporate Site
- NVIDIA Store
- nZone.com - NVIDIA's Gaming Community Site
- SLIZone.com - NVIDIA's SLI Technology Community Site
- NVIDIA.com Driver download area
- Drivers and modified INFs for Notebooks (cross reference from NVIDIA Support staff)
- Tweakguides.com "NVIDIA Forceware Tweak Guide"
- NVIDIA overclocking guide
- Firing Squad: History of NVIDIA pre 1999
- Omega drivers, alternative drivers
- Installing NVIDIA's graphics driver on Debian GNU/Linux