Radeon R100
From Wikipedia, the free encyclopedia
| Radeon R100 | |
|---|---|
| Series | Radeon 7000 Series |
| Codename(s) | Rage 6C |
| Year created | 2000 |
| Entry-level cards | 7000/VE, SDR, LE |
| Mid-range cards | 32 DDR, 7200 |
| High-end cards | 64 DDR VIVO (SE), 7500 |
| Direct3D support | 7.0 |

| Radeon R100-based chipsets | |
|---|---|
| CPUs supported | Mobile Athlon XP (320M IGP); Mobile Duron (320M IGP); Pentium 4-M and mobile Pentium 4 (340M IGP, 7000 IGP) |
| Sockets supported | Socket A, Socket 563 (AMD); Socket 478 (Intel) |
| Desktop/mobile chipsets | |
| Performance segment | 7000 IGP |
| Mainstream segment | 320 IGP, 320M IGP; 340 IGP, 340M IGP |
| Value segment | 320 IGP, 320M IGP (AMD); 340 IGP, 340M IGP (Intel) |
| Miscellaneous | |
| Release date(s) | March 13, 2002 (300/300M IGP); March 13, 2003 (7000 IGP) |
| Successor | Radeon 9000/9100 series IGP |
The Radeon R100 is the first generation of Radeon graphics chips from ATI Technologies. The line features 3D acceleration based upon Direct3D 7.0 and OpenGL 1.x, a major improvement in features and performance compared to the preceding Rage design. The processors also include 2D GUI acceleration, video acceleration, and multiple display outputs. "R100" refers to the development codename of the first GPU of the generation, which served as the basis for a variety of succeeding products.
Development
Architecture
The first-generation Radeon GPU was launched in 2000. Initially code-named Rage 6 (later "R100"), it was developed as the successor to ATI's aging Rage 128, which was unable to compete with the GeForce 256.
The Radeon was comparable in specification to the nVidia GeForce2, but the two differed in their pixel pipeline configuration. The GeForce2 line had a maximum throughput of four pixels written to the frame buffer per clock cycle and could sample two different texture maps per pixel (a so-called "4x2" configuration). The Radeon had a pixel throughput of two pixels per clock (having two pixel pipelines) but could sample from three separate texture maps (having three texture mapping units) in a single clock (a "2x3" configuration). This was initially ATI's definition of a Radeon: a graphics processor that, among other things (DirectX 7 T&L, etc.), had three TMUs. In practice, the third texture unit saw little use in games during the card's lifetime. The Radeon also introduced a new technology called HyperZ, which improved the efficiency of removing obscured pixels from the rendering pipeline and typically added an effective 30% more memory bandwidth.
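As an illustration of what these configurations imply on paper, the theoretical peak pixel and texel rates follow directly from the pipeline counts and core clocks. The sketch below uses assumed, approximate retail clock speeds (actual boards varied):

```cpp
#include <cstdio>

int main() {
    // Assumed, approximate core clocks for illustration only; actual boards varied.
    const double radeonClockMHz   = 183.0;  // Radeon 64 DDR VIVO (assumed)
    const double geforce2ClockMHz = 200.0;  // GeForce2 GTS (assumed)

    // Pipeline configurations described above.
    const int radeonPipes = 2, radeonTmusPerPipe = 3;  // "2x3"
    const int gf2Pipes    = 4, gf2TmusPerPipe    = 2;  // "4x2"

    // Theoretical peaks: Mpixels/s = pipes * MHz; Mtexels/s = pipes * TMUs * MHz.
    printf("Radeon   : %4.0f Mpixel/s, %4.0f Mtexel/s\n",
           radeonPipes * radeonClockMHz,
           radeonPipes * radeonTmusPerPipe * radeonClockMHz);
    printf("GeForce2 : %4.0f Mpixel/s, %4.0f Mtexel/s\n",
           gf2Pipes * geforce2ClockMHz,
           gf2Pipes * gf2TmusPerPipe * geforce2ClockMHz);
    return 0;
}
```

On these figures the GeForce2 leads in raw pixel and texel throughput, consistent with the benchmark results described below; the Radeon's advantage depended on games actually using its third texture unit and on HyperZ reducing memory-bandwidth pressure.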
As is often done in the video card industry, ATI produced a real-time demo for their new card to showcase its new features. The "Radeon's Ark" demo presented a science-fiction environment with heavy use of features such as multiple texture layers for image effects and detail. Amongst the many impressive effects were environment-mapped bump mapping, detail textures, glass reflections, mirrors, realistic water simulation, light maps, texture compression, planar reflective surfaces, and portal-based visibility.[1] The Radeon was capable of these effects courtesy of its programmable pipelines and third texture mapping unit.
In terms of texel and pixel throughput, the Radeon scored lower than the GeForce2 in most benchmarks, even with HyperZ activated. The performance difference was especially noticeable in 16-bit color, where both the GeForce2 and 3dfx's Voodoo 5 5500 were far ahead. However, the Radeon could close the gap and even overtake the GeForce2 in 32-bit color, which became the standard for games from 1999 onward. Although the Radeon was plagued by poor drivers, it was considered a more well-rounded card, with superior DVD playback and environment-mapped bump mapping, features that the gamer-oriented GeForce series lacked.
Aside from the new 3D core, the Radeon introduced per-pixel video deinterlacing to ATI's already leading-edge HDTV-capable MPEG-2 engine. In motion-video applications ranging from AVI to DVD playback, the Radeon was considered by many to be in a class by itself. However, due to the immaturity of device drivers and the DirectX-VA software API, the deinterlacing was used by only one application: Ravisent Cinemaster DVD.
R100's pixel shaders
R100-based GPUs have programmable shading capability in their pipelines; however, the chips are not flexible enough to support the Microsoft Direct3D specification for Pixel Shader 1.1. A forum post by an ATI engineer in 2001 clarified this:
“ ...prior to the final release of DirectX 8.0, Microsoft decided that it was better to expose the RADEON's and GeForce{2}'s extended multitexture capabilities via the extensions to SetTextureStageState() instead of via the pixel shader interface. There are various practical technical reasons for this. Much of the same math that can be done with pixel shaders can be done via SetTextureStageState(), especially with the enhancements to SetTextureStageState() in DirectX 8.0. At the end of the day, this means that DirectX 8.0 exposes 99% of what the RADEON can do in its pixel pipe without adding the complexity of a "0.5" pixel shader interface.

Additionally, you have to understand that the phrase "shader" is an incredibly ambiguous graphics term. Basically, we hardware manufacturers started using the word "shader" a lot once we were able to do per-pixel dot products (i.e. the RADEON / GF generation of chips). Even earlier than that, "ATI_shader_op" was our multitexture OpenGL extension on Rage 128 (which was replaced by the multivendor EXT_texture_env_combine extension). Quake III has ".shader" files it uses to describe how materials are lit. These are just a few examples of the use of the word shader in the game industry (nevermind the movie production industry which uses many different types of shaders, including those used by Pixar's RenderMan). With the final release of DirectX 8.0, the term "shader" has become more crystallized in that it is actually used in the interface that developers use to write their programs rather than just general "industry lingo."

In DirectX 8.0, there are two versions of pixel shaders: 1.0 and 1.1. (Future releases of DirectX will have 2.0 shaders, 3.0 shaders and so on.) Because of what I stated earlier, RADEON doesn't support either of the pixel shader versions in DirectX 8.0. Some of you have tweaked the registry and gotten the driver to export a 1.0 pixel shader version number to 3DMark2001. This causes 3DMark2001 to think it can run certain tests. Surely, we shouldn't crash when you do this, but you are forcing the (leaked and/or unsupported) driver down a path it isn't intended to ever go. The chip doesn't support 1.0 or 1.1 pixel shaders, therefore you won't see correct rendering even if we don't crash. The fact that that registry key exists indicates that we did some experiments in the driver, not that we are half way done implementing pixel shaders on RADEON. DirectX 8.0's 1.0 and 1.1 pixel shaders are not supported by RADEON and never will be. The silicon just can't do what is required to support 1.0 or 1.1 shaders. This is also true of GeForce and GeForce2.”
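As the quote describes, the Radeon's extended multitexturing was exposed through Direct3D's fixed-function texture stage states rather than through a pixel shader interface. The following Direct3D 8 sketch illustrates that approach; the function name, texture arguments, and exact stage layout are illustrative assumptions, not ATI sample code.

```cpp
#include <d3d8.h>

// Illustrative only: a per-pixel dot product (Dot3 bump mapping) configured through
// SetTextureStageState() on the fixed-function pipeline, the interface described in
// the quote, instead of a DirectX 8 pixel shader. The texture choices and stage
// layout here are assumptions for illustration.
void SetupDot3Multitexture(IDirect3DDevice8 *device,
                           IDirect3DTexture8 *normalMap,
                           IDirect3DTexture8 *baseMap)
{
    // Stage 0: dot the normal-map texel with the light vector encoded in the
    // per-vertex diffuse color (both stored as biased RGB vectors).
    device->SetTexture(0, normalMap);
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: modulate the lighting result with the base color texture.
    device->SetTexture(1, baseMap);
    device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    // No further stages.
    device->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
}
```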
Implementations
R100
The first versions of the Radeon (R100) were the Radeon DDR, available in spring 2000 in 32 MiB and 64 MiB configurations; the 64 MiB card had a slightly faster clock speed and added VIVO (video-in/video-out) capability. The chip was initially to be called the Radeon 256, but the name was dropped before release to avoid confusion with the GeForce 256; the new Radeon cards were meant to surpass the GeForce 256, not compete with it. The Radeon 32 DDR and Radeon 64 DDR VIVO were produced until summer 2001, when they were essentially replaced by the Radeon 7500.
The slower and short-lived Radeon SDR (with 32 MiB SDR memory) was added in summer 2000 to compete with the GeForce2 MX.
Around the release of the Radeon SDR in 2000, ATI also offered the OEM-only Radeon LE, a cut-down card made by the third-party company Athlon Micro using Radeon chips that did not meet specifications. It was nonetheless almost a full Radeon (R100) 32 MB DDR, albeit with a reduced clock frequency and with HyperZ disabled at the software level; these handicaps could be overcome by overclocking the card and enabling HyperZ in the Windows registry. Tweaked this way, the Radeon LE could closely match the performance of ATI's Radeon 32 MB DDR, making it a bargain at half the price of the original. The HyperZ unit could sometimes be enabled, but it was often not perfectly functional, which showed up in applications as strange artifacts and distortions (flickering geometry was one symptom). Later drivers did not differentiate the Radeon LE from other Radeon R100 cards and enabled HyperZ by default.
In 2001, after the release of the Radeon 8500, the original Radeon (R100) was renamed the Radeon 7200. The name was first used to market the short-lived 64 MB SDR card (R100) released in early summer 2001, while the original Radeon 32 DDR and Radeon 64 DDR VIVO labels remained unchanged until those cards were discontinued. The Radeon 7200 name was later applied retroactively to all R100 chips regardless of memory configuration.
RV100
Another model of the first-generation Radeon was the cost-reduced Radeon VE (RV100), later known as the Radeon 7000. The VE lacked much of the original Radeon's 3D hardware: one pixel pipeline, hardware T&L, and HyperZ were removed (though some later third-party versions, such as Maddog's, added some of these features back), and the memory bus was narrowed to 64 bits. The VE did, however, add "Hydravision" dual-monitor support and integrated a second RAMDAC into the core for that purpose.
The VE did not fare well against the T&L-capable GeForce2 MX of the same era, although it had superior multi-display support and came out ahead in DirectX feature support and 32-bit color performance. The Radeon VE also served as the basis for the Mobility Radeon, which proved very successful in the laptop market.
RV200
The final model was the Radeon 7500 (RV200), which was built on a 0.15 micrometer (150 nm) manufacturing process (the R(V)100 used a 0.18 micrometer process) and clocked considerably higher than the R100. It was essentially an R100 with a die shrink and higher clocks, running a 290 MHz core and 230 MHz RAM. One of the tweaks to the chip allowed asynchronous clock operation, whereas the original R100 was always clocked synchronously with its RAM. It was also ATI's first T&L-capable chip to include dual-monitor support (Hydravision). Launched in fall 2001 alongside the Radeon 8500, the 7500 was intended to compete with the GeForce2 Ti, in the same way that the 8500 was positioned against the GeForce3 Ti 500, although some presumed the 7500 was instead up against the GeForce3 Ti 200.
When nVidia launched the GeForce4 family in early 2002, the Radeon 7500's performance was inferior to nVidia's similarly priced GeForce4 MX 440, which led ATI to release its successor, the Radeon 9000. However, the Mobility Radeon 7500 was highly successful as a laptop graphics solution, since it easily outperformed the GeForce2 Go and it took a while before the GeForce4 Go (the laptop GeForce4 MX) was released.
Models
Competing chipsets
See also
References
- "ATI Radeon 256 Preview" by Anand Lal Shimpi, Anandtech. Com, April 25, 2000, retrieved January 17, 2006
- "ATI Radeon 32MB SDR" by Anand Lal Shimpi, Anandtech. Com, October 13, 2000, retrieved January 17, 2006
- "ATI Radeon 64MB DDR" by Matthew Witheiler, Anandtech. Com, July 17, 2000, retrieved January 17, 2006
- "Beyond3D 3D Tables" Beyond3D.Com, retrieved January 17, 2006
- Vlachos, Alex. Radeon's Ark demo, 2000.