GDDR4

From Wikipedia, the free encyclopedia

GDDR4 SGRAM (Graphics Double Data Rate type four Synchronous Graphics Random-Access Memory) is a type of graphics card memory specified by the JEDEC Semiconductor Memory Standard.[1][2] It was a competitor to Rambus's XDR DRAM. GDDR4 is based on DDR3 SDRAM technology and was intended to replace the DDR2-based GDDR3, but it ended up being replaced by GDDR5 within a year.

History

  • On October 26, 2005, Samsung announced the development of 256-Mbit GDDR4 memory running at 2.5 Gbit/s per pin. Samsung also revealed plans to sample and mass-produce GDDR4 SDRAM rated at 2.8 Gbit/s per pin.[3]
  • On February 14, 2006, Samsung announced the development of 32-bit 512-Mbit GDDR4 SDRAM capable of transferring 3.2 Gbit/s per pin, or 12.8 GB/s per module.[4]
  • On July 5, 2006, Samsung announced mass-production of 32-bit 512-Mbit GDDR4 SDRAM rated at 2.4 Gbit/s per pin, or 9.6 GB/s per module. Although designed to match the performance of XDR DRAM in high-pin-count designs, it could not match XDR performance in low-pin-count designs.[5]
  • On February 9, 2007, Samsung announced mass-production of 32-bit 512-Mbit GDDR4 SDRAM rated at 2.8 Gbit/s per pin, or 11.2 GB/s per module, intended for AMD's latest graphics cards.[6]
  • On February 23, 2007, Samsung announced 32-bit 512-Mbit GDDR4 SDRAM rated at 4.0 Gbit/s per pin, or 16 GB/s per module, and expected the memory to appear on commercially available graphics cards by the end of 2007.[7]
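The per-pin rates and module figures quoted above are related by simple arithmetic: module bandwidth in GB/s is the 32-bit bus width times the per-pin rate in Gbit/s, divided by 8 bits per byte. A minimal sketch (the function name is illustrative, not from any vendor API):

```python
def module_bandwidth_gbps(bus_width_bits: int, per_pin_gbit_s: float) -> float:
    """Peak module bandwidth in GB/s: every data pin transfers
    per_pin_gbit_s, and 8 bits make one byte."""
    return bus_width_bits * per_pin_gbit_s / 8

# The figures from Samsung's announcements:
print(module_bandwidth_gbps(32, 3.2))  # 12.8 GB/s
print(module_bandwidth_gbps(32, 2.8))  # 11.2 GB/s
print(module_bandwidth_gbps(32, 4.0))  # 16.0 GB/s
```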

Technologies

GDDR4 SDRAM introduced DBI (Data Bus Inversion) and multi-preamble to reduce data transmission delay. The prefetch was increased from 4 to 8 bits, and the maximum number of memory banks was increased to 8. Because each core access now fetches twice as much data per pin, a GDDR4 core can run at half the clock rate of a GDDR3 core while delivering the same raw bandwidth. The core voltage was decreased to 1.5 V.
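The half-clock relationship follows from the prefetch depth: with an n-bit prefetch, each data pin must be fed n bits per core cycle, so the required core clock is the per-pin data rate divided by the prefetch. A back-of-the-envelope sketch (the function name is illustrative, not vendor terminology):

```python
def core_clock_mhz(per_pin_gbit_s: float, prefetch_bits: int) -> float:
    """Core clock (MHz) needed to sustain a given per-pin data rate:
    each core cycle supplies `prefetch_bits` bits to every data pin."""
    return per_pin_gbit_s * 1000 / prefetch_bits

# Same 2.8 Gbit/s pin rate, different prefetch depths:
gddr4_core = core_clock_mhz(2.8, 8)  # GDDR4, 8n prefetch: ~350 MHz core
gddr3_core = core_clock_mhz(2.8, 4)  # GDDR3, 4n prefetch: ~700 MHz core
```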

Data Bus Inversion adds an additional active-low DBI# pin to the address/command bus and to each byte of data. If a data byte contains more than four 0 bits, the byte is inverted and DBI# is driven low; otherwise the byte is sent unchanged with DBI# high. In this way, the number of 0 bits across all 9 pins is limited to at most 4. This reduces power consumption and ground bounce.
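The encoding rule above can be sketched as a small function (illustrative only; real DBI is implemented in the memory interface hardware, not software):

```python
def dbi_encode(byte: int) -> tuple[int, int]:
    """Encode a data byte for transmission over 8 data pins plus an
    active-low DBI# pin. Returns (transmitted_byte, dbi_n), where
    dbi_n == 0 means the byte was inverted."""
    zeros = 8 - bin(byte & 0xFF).count("1")
    if zeros > 4:                  # more than four 0 bits: invert
        return (~byte) & 0xFF, 0   # DBI# driven low
    return byte & 0xFF, 1          # sent unchanged, DBI# high

# Across all 9 pins (8 data + DBI#), at most four 0 bits are ever driven:
for value in range(256):
    tx, dbi_n = dbi_encode(value)
    zeros_on_wire = (8 - bin(tx).count("1")) + (1 - dbi_n)
    assert zeros_on_wire <= 4
```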

On the signaling front, GDDR4 expands the chip I/O buffer to 8 bits per two cycles, allowing greater sustained bandwidth during burst transmission, but at the expense of significantly increased CAS latency (CL), caused mainly by the halved address/command pin count and the half-clocked DRAM core compared to GDDR3. The number of addressing pins was reduced to half that of GDDR3, with the freed pins reassigned to power and ground; commands therefore take longer to transfer, which also increases latency. GDDR4 is, however, more power-efficient: running at 2.4 Gbit/s, it uses 45% less power than GDDR3 chips running at 2.0 Gbit/s.

In Samsung's GDDR4 SDRAM datasheet, the part is referred to as 'GDDR4 SGRAM' ('Graphics Double Data Rate version 4 Synchronous Graphics RAM'). However, it lacks the block-write feature that defines SGRAM, so strictly speaking it is not SGRAM.

Adoption

The memory became available with ATI Technologies' Radeon X1950 XTX and with the Radeon HD 2900 XT and HD 2600 XT video cards. GDDR4 was intended to achieve clock rates as high as 1.4 GHz (2.8 Gbit/s per pin). Samsung, however, had aimed to push GDDR4 to effective clock rates as high as 1.6 GHz (3.2 Gbit/s per pin, at higher voltage) and was rumored to have implemented this improvement in some Radeon HD 2900 XT cards.[citation needed]

Graphics cards incorporating GDDR4 memory became available with clock rates of around 1.0 GHz to 1.1 GHz. Samsung was quoted as saying it would have 1.6 GHz GDDR4 ready for market as early as July 2006. NVIDIA was also rumored to have planned to use the memory in newer revisions of its GeForce 8-series GPUs, but instead used GDDR3 in all GeForce 8 cards. Adoption of GDDR4 was minimal compared to GDDR3, which remained prevalent in mainstream graphics cards; some graphics vendors adopted DDR3 instead of moving to GDDR4 or GDDR5.[citation needed]

The video memory manufacturer Qimonda (formerly Infineon Memory Products division) has stated it will "skip" the development of GDDR4, and move directly to GDDR5.[8]

This article is issued from Wikipedia. The text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply for the media files.