GeForce 8 series
An Nvidia GeForce 8800 Ultra released in 2007, the series' flagship model
Release date: November 8, 2006
Codename: G8x
Architecture: Tesla
Models: GeForce GS, GT, GTS, GTX, and Ultra series
Cards:
  • Entry-level: 8100, 8200, 8300, 8400, 8500
  • Mid-range: 8600 GS/GT/GTS
  • High-end: 8800 GS/GT/GTS
  • Enthusiast: 8800 GTX/Ultra
API support: Direct3D 10.0 (Shader Model 4.0), OpenGL 3.3
Predecessor: GeForce 7 series
Successor: GeForce 9 series
Support status: Unsupported

The GeForce 8 series is the eighth generation of Nvidia's GeForce line of graphics processing units. The third major GPU architecture developed by Nvidia, Tesla represents the company's first unified shader architecture.[1][2]

Overview


All GeForce 8 Series products are based on Tesla.

As with many GPU generations, a higher model number does not guarantee better performance than a lower-numbered card from a previous generation. For example, the entry-level GeForce 8300 and 8400 delivered inferior gaming performance compared with the older GeForce 7200 and 7300, while at the high end the GeForce 8800 GTX far outperformed the GeForce 7800 GTX; number-for-number comparisons across generations are therefore misleading.

Max resolution


Dual dual-link DVI support: Able to drive two flat-panel displays up to 2560×1600 resolution. Available on select GeForce 8800 and 8600 GPUs.

One dual-link DVI support: Able to drive one flat-panel display up to 2560×1600 resolution. Available on select GeForce 8500 GPUs and GeForce 8400 GS cards based on the G98.

One single-link DVI support: Able to drive one flat-panel display up to 1920×1200 resolution. Available on select GeForce 8400 GPUs.[3] GeForce 8400 GS cards based on the G86 only support single-link DVI.

Display capabilities


The GeForce 8 series supports 10-bit per channel display output, up from 8-bit on previous Nvidia cards. This potentially allows higher fidelity color representation and separation on capable displays. The GeForce 8 series, like its recent predecessors, also supports Scalable Link Interface (SLI) for multiple installed cards to act as one via an SLI Bridge, so long as they are of similar architecture.
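The fidelity gain from 10-bit output is straightforward to quantify; a quick illustrative sketch in plain Python (not tied to any particular display API):

```python
# Shades per channel at 8-bit vs 10-bit output depth.
levels_8bit = 2 ** 8    # 256 shades per channel
levels_10bit = 2 ** 10  # 1024 shades per channel

# Total representable colors across the three R, G, B channels.
colors_8bit = levels_8bit ** 3    # 16,777,216 (~16.8 million)
colors_10bit = levels_10bit ** 3  # 1,073,741,824 (~1.07 billion)
```

The extra two bits per channel thus multiply the available color palette by a factor of 64, which is what reduces visible banding in smooth gradients on capable displays.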

NVIDIA's PureVideo HD video rendering technology is an improved version of the original PureVideo introduced with GeForce 6. It now includes GPU-based hardware acceleration for decoding HD movie formats, post-processing of HD video for enhanced images, and optional High-bandwidth Digital Content Protection (HDCP) support at the card level.[4]

GeForce 8300 and 8400 series

NVidia GeForce 8400 GS "Rev 1.0"
NVidia GeForce 8400 GS "Rev 3.0"

In the summer of 2007 Nvidia released the entry-level GeForce 8300 GS and 8400 GS graphics cards, based on the G86 core. The GeForce 8300 was only available in the OEM market, and was also available in integrated motherboard GPU form as the GeForce 8300 mGPU. The GeForce 8300 series was only available in PCI Express, with the GeForce 8400 series using either PCI Express or PCI. The first version of the 8400 GS is sometimes called "GeForce 8400 GS Rev. 1".

Being entry-level cards, they are usually less powerful than mid-range and high-end cards. Their reduced graphics performance makes them unsuitable for intense 3D applications such as fast, high-resolution video games, but they could still play most games at lower resolutions and settings. This made these cards (in particular the 8400 series) popular among casual gamers and HTPC (Media Center) builders, with the PCI variants serving motherboards that lacked a PCI Express or AGP slot.

The GeForce 8300 and 8400 series were originally designed to replace the low-cost GeForce 7200 series and entry-level GeForce 7300 series, however they were not able to do so due to their aforementioned inferior gaming performance.

At the end of 2007 Nvidia released a new GeForce 8400 GS based on the G98 (D8M) chip.[5] It is quite different from the G86 used for the "first" 8400 GS, as the G98 features VC-1 and MPEG2 video decoding completely in hardware, lower power consumption, reduced 3D-performance and a smaller fabrication process. The G98 also features dual-link DVI support and PCI Express 2.0. G86 and G98 cards were both sold as "8400 GS", the difference showing only in the technical specifications. This card is sometimes referred to as "GeForce 8400 GS Rev. 2".

During mid-2010 Nvidia released another revision of the GeForce 8400 GS, based on the GT218 chip.[6] It has more RAM but significantly reduced 3D performance, and supports DirectX 10.1, OpenGL 3.3, and Shader Model 4.1. This card is also known as "GeForce 8400 GS Rev. 3".

GeForce 8500 and 8600 series


On April 17, 2007, Nvidia released the GeForce 8500 GT for the entry-level market, and the GeForce 8600 GT and 8600 GTS for the mid-range market. The GeForce 8600 GS was also available. They are based on the G84 core. This series came in PCI Express configurations, with some cards in PCI.

As mid-range cards, the 8600 series provided more power than entry-level cards such as the 8400 and 8500 series, but less than high-end cards such as the 8800 series. They delivered adequate performance in most games at moderate resolutions and settings but could struggle with some higher-resolution titles.

Nvidia introduced 2nd-generation PureVideo with this series. As the first major update to PureVideo since the GeForce 6's launch, 2nd-gen PureVideo offered much improved hardware-decoding for H.264.

GeForce 8800 series

EVGA GeForce 8800 GTX
Underside

The 8800 series, codenamed G80, was launched on November 8, 2006, with the release of the GeForce 8800 GTX and GTS for the high-end market. A 320 MB GTS was released on February 12 and the Ultra was released on May 2, 2007. The cards are larger than their predecessors, with the 8800 GTX measuring 10.6 in (~26.9 cm) in length and the 8800 GTS measuring 9 in (~23 cm). Both cards have two dual-link DVI connectors and an HDTV/S-Video out connector. The 8800 GTX requires 2 PCIe power inputs to keep within the PCIe standard, while the GTS requires just one.

8800 GS


The 8800 GS is a trimmed-down 8800 GT with 96 stream processors and either 384 or 768 MB of RAM on a 192-bit bus.[7] In May 2008, it was rebranded as the 9600 GSO in an attempt to spur sales.

The early 2008 iMac models featured an 8800 GS GPU that is actually a modified version of the 8800M GTS (a laptop GPU normally found in high-end notebooks) with a slightly higher clock speed, rebranded as an 8800 GS.[8] These updated models were announced by Apple on April 28, 2008.[9] The GPU uses 512 MB of GDDR3 video memory clocked at 800 MHz, 64 unified stream processors, a 500 MHz core clock, a 256-bit memory bus, and a 1250 MHz shader clock; these specifications closely match those of the 8800M GTS on which it is based.[10]

8800 GTX / 8800 Ultra

NVIDIA NVIO-1-A3 RAMDAC

The 8800 GTX is equipped with 768 MB GDDR3 RAM. The 8800 series replaced the GeForce 7900 series as Nvidia's top-performing consumer GPU. GeForce 8800 GTX and GTS use identical GPU cores, but the GTS model disables parts of the GPU and reduces RAM size and bus width to lower production cost.

At the time, the G80 was the largest commercial GPU ever constructed, comprising 681 million transistors on a 480 mm2 die built on a 90 nm process. The complete G80 design actually totals about 686 million transistors, but because of process limitations and yield concerns at 90 nm, Nvidia split the design into two chips: the main shader core with 681 million transistors and the NVIO display I/O chip with roughly 5 million.

A minor manufacturing defect related to a resistor of improper value caused a recall of the 8800 GTX models just two days before the product launch, though the launch itself was unaffected.[11]

The GeForce 8800 GTX was by far the fastest GPU when first released, and 13 months after its initial debut it still remained one of the fastest. The GTX has 128 stream processors clocked at 1.35 GHz, a core clock of 575 MHz, and 768 MB of 384-bit GDDR3 memory at 1.8 GHz, giving it a memory bandwidth of 86.4 GB/s. The card performs faster than a single Radeon HD 2900 XT, and faster than two Radeon X1950 XTXs in CrossFire or two GeForce 7900 GTXs in SLI[citation needed]. The 8800 GTX also supports HDCP, but one major flaw is its older Nvidia PureVideo processor that uses more CPU resources. Originally retailing for around US$600, prices came down to under US$400 before it was discontinued. The 8800 GTX was also very power hungry for its time, demanding up to 155 watts and requiring two 6-pin PCIe power connectors to operate.[12] The 8800 GTX also has two SLI connector ports, allowing it to support Nvidia 3-way SLI for users who run demanding games at extreme resolutions such as 2560×1600.
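The 86.4 GB/s figure follows directly from the bus width and the effective (double data rate) memory clock; a minimal Python sketch of the arithmetic:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak bandwidth = bytes moved per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8                  # 384-bit bus -> 48 bytes
    return bytes_per_transfer * effective_clock_mhz / 1000   # MB/s -> GB/s

# 8800 GTX: 384-bit bus, 900 MHz GDDR3 (1800 MT/s effective)
print(memory_bandwidth_gb_s(384, 1800))  # 86.4
# 8800 Ultra: 384-bit bus, 1080 MHz GDDR3 (2160 MT/s effective)
print(memory_bandwidth_gb_s(384, 2160))  # 103.68, quoted as 103.7
```

The same formula reproduces the bandwidth figures given elsewhere in the article, e.g. 57.6 GB/s for the 256-bit, 1.8 GHz effective 8800 GT.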

3-way GeForce 8800 Ultra in an SLI using a rigid bridging connector

The 8800 Ultra, retailing at a higher price than the 8800 GTX, is identical to the GTX architecturally, but features higher clocked shaders, core and memory. Nvidia told the media in May 2007 that the 8800 Ultra was a new stepping,[clarification needed] creating less heat[clarification needed] therefore clocking higher. Originally retailing from $829,[13] most users[who?] thought the card to be a poor value, offering only 10% more performance than the GTX but costing hundreds of dollars more. Prices dropped to as low as $200 before being discontinued on January 23, 2008. The core clock of the Ultra runs at 612 MHz, the shaders at 1.5 GHz, and finally the memory at 2.16 GHz, giving the Ultra a theoretical memory bandwidth of 103.7 GB/s. It has 2 SLI connector ports, allowing it to support Nvidia 3-way SLI. An updated dual slot cooler was also implemented, allowing for quieter and cooler operation at higher clock speeds.[14]

8800 GT


The 8800 GT, codenamed G92, was released on October 29, 2007. This card is the first to transition to the 65 nm process, and supports PCI-Express 2.0.[15] It has a single-slot cooler as opposed to the dual-slot cooler on the 8800 GTS and GTX, and uses less power than GTS and GTX due to its aforementioned 65 nm process. While its core processing power is comparable to that of the GTX, the 256-bit memory interface and the 512 MB of GDDR3 memory often hinders its performance at very high resolutions and graphics settings. The 8800 GT, unlike other 8800 cards, is equipped with the PureVideo HD VP2 engine for GPU assisted decoding of the H.264 and VC-1 codecs.

The release of this card presented an odd dynamic to the graphics processing industry. With an initial projected street price of around $300, it outperformed ATI's flagship HD 2900 XT in most situations, and even Nvidia's own 8800 GTS 640 MB (previously priced at an MSRP of $400). While only marginally slower in synthetic and gaming benchmarks than the 8800 GTX, the card also took much of the value away from Nvidia's own high-end offering.

Performance benchmarks at stock speeds place it above the 8800 GTS (640 MB and 320 MB versions) and slightly below the 8800 GTX. A 256 MB version of the 8800 GT with lower stock memory speeds (1.4 GHz as opposed to 1.8 GHz) but the same core is also available. Benchmarks have shown that the 256 MB version has a considerable performance disadvantage compared to its 512 MB counterpart, especially in newer games such as Crysis. Some manufacturers also made models with 1 GB of memory, which show a significant advantage in benchmarks at large resolutions and with big textures. These models are more likely to occupy two expansion slots because they use dual-slot coolers rather than the single-slot cooler of other models.

The performance (at the time) and popularity of this card is demonstrated by the fact that even as late as 2014, the 8800 GT was often listed as the minimum requirement for modern games developed for much more powerful hardware.

8800 GTS

PNY 8800GTS 640MB

The first releases of the 8800 GTS line, in November 2006, came in 640 MB and 320 MB configurations of GDDR3 RAM and utilized Nvidia's G80 GPU.[16] While the 8800 GTX has 128 stream processors and a 384-bit memory bus, these versions of 8800 GTS feature 96 stream processors and a 320-bit bus. With respect to features, however, they are identical because they use the same GPU.[17]

Around the same release date as the 8800 GT, Nvidia released a new 640 MB version of the 8800 GTS. While still based on the 90 nm G80 core, this version has 7 out of the 8 clusters of 16 stream processors enabled (as opposed to 6 out of 8 on the older GTSs), giving it a total of 112 stream processors instead of 96. Most other aspects of the card remain unchanged. However, because the only 2 add-in partners producing this card (BFG and EVGA) decided to overclock it, this version of the 8800 GTS actually ran slightly faster than a stock GTX in most scenarios, especially at higher resolutions, due to the increased clock speeds.[18]

Nvidia released a new 8800 GTS 512 MB based on the 65 nm G92 GPU on December 10, 2007.[19] This 8800 GTS has 128 stream processors, compared to the 96 processors of the original GTS models. It is equipped with 512 MB GDDR3 on a 256-bit bus. Combined with a 650 MHz core clock and architectural enhancements, this gives the card raw GPU performance exceeding that of 8800 GTX, but it is constrained by the narrower 256-bit memory bus. Its performance can match the 8800 GTX in some situations, and it outperforms the older GTS cards in all situations.

Compatibility issues with PCI Express 1.0a on GeForce 8800 GT/8800 GTS 512 MB cards


Shortly after their release, an incompatibility issue with older PCI Express 1.0a motherboards surfaced. When using the PCI Express 2.0 compliant 8800 GT or 8800 GTS 512 in some motherboards with PCI Express 1.0a slots, the card would not produce any display image, but the computer would often boot (with the fan on the video card spinning at a constant 100%). The incompatibility has been confirmed on motherboards with VIA PT880Pro/Ultra,[20] Intel 925[21] and Intel 5000P[22] PCI Express 1.0a chipsets.

Some graphics cards had a workaround: re-flashing the graphics card's BIOS with an older Gen 1 BIOS. However, this effectively turned it into a PCI Express 1.0 card, unable to use PCI Express 2.0 functions. Since the card could not saturate even a regular PCI Express 1.0 slot, there was no noticeable reduction in performance. Flashing the video card's BIOS, however, usually voided the warranty of most (if not all) video card manufacturers, making it a less-than-optimal way of getting the card to work properly. A proper workaround was to flash the motherboard's BIOS to the latest version, which, depending on the manufacturer, might contain a fix.
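For context, the per-direction bandwidth of the two slot generations can be estimated from the signaling rate and the 8b/10b line encoding both generations use (an illustrative sketch; real-world throughput is lower due to protocol overhead):

```python
def pcie_bandwidth_gb_s(lanes: int, gt_per_s: float,
                        encoding_efficiency: float = 0.8) -> float:
    """Per-direction bandwidth in GB/s. PCIe 1.x/2.0 use 8b/10b encoding (80% efficient)."""
    return lanes * gt_per_s * encoding_efficiency / 8  # bits -> bytes

print(pcie_bandwidth_gb_s(16, 2.5))  # PCIe 1.0 x16: 4.0 GB/s per direction
print(pcie_bandwidth_gb_s(16, 5.0))  # PCIe 2.0 x16: 8.0 GB/s per direction
```

Even 4 GB/s was more host bandwidth than these cards could exploit, which is why the forced fallback to Gen 1 signaling cost no measurable performance.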

In relation to this, the high numbers of cards reported as dead on arrival (as much as 13–15%) were believed to be inaccurate. When it was revealed that the G92-based 8800 GT and 8800 GTS 512 MB would use PCI Express 2.0 connections, Nvidia claimed that all cards would be fully backward compatible, but failed to mention that this was only true for PCI Express 1.1 motherboards. The BIOS-flash workaround came not from Nvidia or any of its partners but from ASRock, a motherboard manufacturer, which mentioned the fix in one of its motherboard FAQs. ASUSTeK, which sells the 8800 GT under its own brand, posted a newer 8800 GT BIOS on its website but did not mention that it fixed this issue. EVGA also posted a new BIOS to fix the issue.[23]

Technical summary

Model | Launch | Codename | Fab (nm) | Transistors (million) | Die size (mm²) | Bus interface | Core config | Core clock (MHz) | Shader clock (MHz) | Memory clock (MHz) | Pixel fillrate (GP/s) | Texture fillrate (GT/s) | Memory size (MB) | Bandwidth (GB/s) | Memory type | Bus width (bit) | Processing power (GFLOPS) | TDP (W) | Comments
GeForce 8100 mGPU[26] | 2008 | MCP78 | 80 | Unknown | Unknown | PCIe 2.0 ×16 | 8:8:4 | 500 | 1200 | 400 (system RAM) | 2 | 4 | up to 512 (system RAM) | 6.4/12.8 | DDR2 | 64/128 | 28.8 | Unknown | PureVideo HD video decoding block disabled
GeForce 8200 mGPU[26] | 2008 | MCP78 | 80 | Unknown | Unknown | PCIe 2.0 ×16 | 8:8:4 | 500 | 1200 | 400 (system RAM) | 2 | 4 | up to 512 (system RAM) | 6.4/12.8 | DDR2 | 64/128 | 28.8 | Unknown | PureVideo 3 with VP3
GeForce 8300 mGPU[26] | 2008 | MCP78 | 80 | Unknown | Unknown | PCIe 2.0 ×16 | 8:8:4 | 500 | 1500 | 400 (system RAM) | 2 | 4 | up to 512 (system RAM) | 6.4/12.8 | DDR2 | 64/128 | 36 | Unknown | PureVideo 3 with VP3
GeForce 8300 GS[27] | July 2007 | G86 | 80 | 210 | 127 | PCIe 1.0 ×16 | 8:8:4 | 450 | 900 | 400 | 1.8 | 3.6 | 128/512 | 6.4 | DDR2 | 64 | 14.4 | 40 | OEM only
GeForce 8400 GS | June 15, 2007 | G86 | 80 | 210 | 127 | PCIe 1.0 ×16, PCI | 16:8:4 | 450 | 900 | 400 | 1.8 | 3.6 | 128/256/512 | 6.4 | DDR2 | 64 | 28.8 | 40 |
GeForce 8400 GS rev. 2 | December 10, 2007 | G98 | 65 | 210 | 86 | PCIe 2.0 ×16, PCIe ×1, PCI | 8:8:4 | 567 | 1400 | 400 | 2.268 | 4.536 | 128/256/512 | 6.4 | DDR2 | 64 | 22.4 | 25 |
GeForce 8400 GS rev. 3 | April 26, 2009 | GT218 | 40 | 260 | 57 | PCIe 2.0 ×16 | 8:4:4 | 520/589 | 1230 | 600 | 2.08/2.356 | 2.08/2.356 | 512/1024 | 4.8/9.6 | DDR3 | 32/64 | 19.7 | 25 |
GeForce 8500 GT | April 17, 2007 | G86 | 80 | 210 | 127 | PCIe 1.0 ×16, PCI | 16:8:4 | 450 | 900 | 400 | 1.8 | 3.6 | 256/512/1024 | 12.8 | DDR2 | 128 | 28.8 | 45 |
GeForce 8600 GS | April 2007 | G84 | 80 | 289 | 169 | PCIe 1.0 ×16 | 16:8:8 | 540 | 1180 | 400 | 4.32 | 4.32 | 256/512 | 12.8 | DDR2 | 128 | 75.5 | 47 | OEM only
GeForce 8600 GT | April 17, 2007 | G84 | 80 | 289 | 169 | PCIe 1.0 ×16, PCI | 32:16:8 | 540 | 1188 | 400/700 | 4.32 | 8.64 | 256/512/1024 | 12.8/22.4 | DDR2/GDDR3 | 128 | 76 | 47 |
GeForce 8600 GTS | April 17, 2007 | G84 | 80 | 289 | 169 | PCIe 1.0 ×16 | 32:16:8 | 675 | 1450 | 1000 | 5.4 | 10.8 | 256/512 | 32 | GDDR3 | 128 | 92.8 | 71 |
GeForce 8800 GS | January 2008 | G92 | 65 | 754 | 324 | PCIe 2.0 ×16 | 96:48:12 | 550 | 1375 | 800 | 6.6 | 26.4 | 384/768 | 38.4 | GDDR3 | 192 | 264 | 105 |
GeForce 8800 GTS (G80) | November 8, 2006 (640); February 12, 2007 (320) | G80 | 90 | 681 | 484 | PCIe 1.0 ×16 | 96:24:20 | 513 | 1188 | 800 | 10.3 | 24.6 | 320/640 | 64 | GDDR3 | 320 | 228 | 146 |
GeForce 8800 GTS 112 (G80) | November 19, 2007 | G80 | 90 | 681 | 484 | PCIe 1.0 ×16 | 112:28:20 | 500 | 1200 | 800 | 10 | 24 | 640 | 64 | GDDR3 | 320 | 268.8 | 150 | XFX, EVGA and BFG models only; very short-lived[28]
GeForce 8800 GT | October 29, 2007 (512); December 11, 2007 (256, 1024) | G92 | 65 | 754 | 324 | PCIe 2.0 ×16 | 112:56:16 | 600 | 1500 | 700 (256) / 900 (512, 1024) | 9.6 | 33.6 | 256/512/1024 | 57.6 | GDDR3 | 256 | 336 | 125 |
GeForce 8800 GTS (G92) | December 11, 2007 | G92 | 65 | 754 | 324 | PCIe 2.0 ×16 | 128:64:16 | 650 | 1625 | 970 | 10.4 | 41.6 | 512 | 62.1 | GDDR3 | 256 | 416 | 135 |
GeForce 8800 GTX | November 8, 2006 | G80 | 90 | 681 | 484 | PCIe 1.0 ×16 | 128:32:24 | 575 | 1350 | 900 | 13.8 | 36.8 | 768 | 86.4 | GDDR3 | 384 | 345.6 | 145 |
GeForce 8800 Ultra | May 2, 2007 | G80 | 90 | 681 | 484 | PCIe 1.0 ×16 | 128:32:24 | 612 | 1500 | 1080 | 14.7 | 39.2 | 768 | 103.7 | GDDR3 | 384 | 384 | 175 |
Features

  • Compute Capability 1.1: has support for Atomic functions, which are used to write thread-safe programs.
  • Compute Capability 1.2: for details see CUDA
Model | SLI | 3-way SLI | PureVideo HD (VP1) | PureVideo 2 (VP2, BSP engine, AES128 engine) | PureVideo 3 (VP3, BSP engine, AES128 engine) | PureVideo 4 (VP4) | Compute capability
GeForce 8300 GS (G86) | No | No | No | Yes | No | No | 1.1
GeForce 8400 GS Rev. 2 (G98) | No | No | No | No | Yes | No | 1.1
GeForce 8400 GS Rev. 3 (GT218) | No | No | No | No | No | Yes | 1.2
GeForce 8500 GT | Yes | No | No | Yes | No | No | 1.1
GeForce 8600 GT | Yes | No | No | Yes | No | No | 1.1
GeForce 8600 GTS | Yes | No | No | Yes | No | No | 1.1
GeForce 8800 GS (G92) | Yes | No | No | Yes | No | No | 1.1
GeForce 8800 GTS (G80) | Yes | No | Yes | No | No | No | 1.0
GeForce 8800 GTS Rev. 2 (G80) | Yes | No | Yes | No | No | No | 1.0
GeForce 8800 GT (G92) | Yes | No | No | Yes | No | No | 1.1
GeForce 8800 GTS (G92) | Yes | No | No | Yes | No | No | 1.1
GeForce 8800 GTX | Yes | Yes | Yes | No | No | No | 1.0
GeForce 8800 Ultra | Yes | Yes | Yes | No | No | No | 1.0

GeForce 8M series


On May 10, 2007, Nvidia announced the availability of their GeForce 8 notebook GPUs through select OEMs. The lineup consists of the 8200M, 8400M, 8600M, 8700M and 8800M series chips.[29]

Nvidia announced that some of its graphics chips had a higher than expected failure rate due to overheating when used in particular notebook configurations. Some major laptop manufacturers issued fan-setting adjustments and firmware updates to help delay the occurrence of any potential GPU failure. In late July 2008, Dell released a set of BIOS updates that made the laptop fans spin more frequently.[30] As of mid-August 2008, Nvidia had not published any further details publicly, though it was heavily rumored that most, if not all, 8400 and 8600 chips had this issue.[31]

GeForce 8200M series


The GeForce 8200M is an entry-level series of GeForce 8M GPUs. It can be found in some entry-level to mid-range laptops as an alternative to integrated graphics. The GeForce 8200M G is the only GPU in this series.

Its GPU core was based on the GeForce 9200M/9300M GS GPUs. This series was not designed for gaming but rather for viewing high-definition video content. It can still play older games well but may struggle with then-current games even at low settings.[32]

Some HP Pavilion, Compaq Presario, and Asus laptops have GeForce 8200M G GPUs.

GeForce 8400M series


The GeForce 8400M is the entry-level series for the GeForce 8M chipset. Normally found on mid-range laptops as an alternative solution to integrated graphics, the 8400M was designed for watching high-definition video content rather than gaming.

Versions include the 8400M G, 8400M GS, and 8400M GT. Although the series was intended for non-gaming tasks such as high-definition video playback, the GDDR3-equipped 8400M GT could handle most games of its time at medium settings and was suitable for occasional gaming.[33] The rest of the 8400M series handled older games quite well but could only run then-current games at low settings.

Some ASUS and Acer laptops featured 8400M G GPUs. Some Acer Aspire models, some HP Pavilion dv2000, dv6000, dv9000 models, some Dell Vostro 1500 and 1700 models, the Dell XPS M1330, and some Sony VAIO models featured 8400M GS GPUs. Various Acer Aspire and Sony VAIO laptop models featured 8400M GT GPUs.

GeForce 8600M series


The GeForce 8600M was offered in mid-range laptops as a mid-range performance solution for enthusiasts who want to watch high-definition content such as Blu-ray Disc and HD DVD movies and play then-current and some future games with decent settings.

Versions include the 8600M GS and 8600M GT (with the GT being the more powerful one). They provided decent gaming performance (due to the implementation of GDDR3 memory in the higher-end 8600M models) for then-current games.

It is available on the Dell XPS M1530 portable, some Dell Inspiron 1720 models, HP Pavilion dv9000 models, Asus G1S, Sony VAIO VGN-FZ21Z, select Lenovo IdeaPad models, some models of the Acer Aspire 5920, Acer Aspire 9920G and BenQ Joybook S41, the Mid 2007 to Late 2008 MacBook Pro, and some models of Fujitsu Siemens.

The common failure of this chip in, amongst others, MacBook Pros purchased between May 2007 and September 2008 was part of a class-action suit against Nvidia, which resulted in Apple providing an extended four-year warranty related to the issue[34] after confirming that it was caused by the Nvidia chips themselves.[35][36] This warranty replacement program was expected to cost Nvidia around $150 to $200 million,[37] and the affair knocked over $3 billion off the company's market capitalisation after its own shareholders sued it for attempting to cover the issue up.[38]

GeForce 8700M series


The GeForce 8700M was developed for the mid-range market. The 8700M GT is the only GPU in this series.

This chipset is available on high-end laptops such as the Dell XPS M1730, Sager NP5793, and Toshiba Satellite X205.

While most in the field consider it a decent mid-range card, the 8700M GT is hard to classify as high-end due to its 128-bit memory bus; it is essentially an overclocked GDDR3 8600M GT.[39] However, it shows strong performance in a dual-card SLI configuration, and provides decent gaming performance in a single-card configuration.[40]

GeForce 8800M series


The GeForce 8800M was developed to succeed the 8700M in the high-end market, and can be found in high-end gaming notebook computers.

Versions include the 8800M GTS and 8800M GTX. These were released as the first truly high-end mobile GeForce 8 Series GPUs, each with a 256-bit memory bus and a standard 512 megabytes of GDDR3 memory, and provide high-end gaming performance equivalent to many desktop GPUs. In SLI, these can produce 3DMark06 results in the high thousands.[40]

Laptop models which included the 8800M GPUs are: Sager NP5793, Sager NP9262, Alienware m15x and m17x, HP HDX9000, and Dell XPS M1730. Clevo also manufactured similar laptop models for CyberPower, Rock, and Sager (among others), all with the 8800M GTX, while the 8800M GTS was included in the Gateway P-6831 FX and P-6860 FX models.

The 8800M GTS was used in modified form as the GeForce 8800 GS in the early 2008 iMac models.

Technical summary

Model | Release date | Codename | Fab (nm) | Core clock max (MHz) | Pixel fillrate (GP/s) | Bilinear texel fillrate (GT/s) | Bilinear FP16 texel fillrate (GT/s) | Stream processors | Shader clock (MHz) | Bandwidth max (GB/s) | DRAM type | Bus width (bit) | Memory (MB) | Effective DDR clock (MHz) | Power (W) | Transistors (million) | Shader processing rate (GFLOPS)
GeForce 8200M G | June 2008 | MCP77MV / MCP79MV | 80 | 350/500 | 3 | ? | ? | 8 | 1200 | ? | DDR2 | 64 | 256 | ? | ? | ? | 19
GeForce 8400M G | May 10, 2007 | G86M | 80 | 400 | 3.2 | 3.2 | 1.6 | 8 | 800 | 6.4 | GDDR3 | 64 | 128/256 | 1200 | 15 | 210 | 19.2
GeForce 8400M GS | May 10, 2007 | G86M | 80 | 400 | 3.2 | 3.2 | 1.6 | 16 | 800 | 6.4 | GDDR2/GDDR3 | 64 | 64/128/256 | 1200 | 15 | 210 | 38.4
GeForce 8400M GT | May 10, 2007 | G86M | 80 | 450 | 3.6 | 3.6 | 1.8 | 16 | 900 | 19.2 | GDDR3 | 128 | 128/256/512 | 1200 | 17 | 210 | 43.2
GeForce 8600M GS | May 10, 2007 | G84M | 80 | 600 | 4.8 | 4.8 | 2.4 | 16 | 1200 | 12.8/22.4 | DDR2/GDDR3 | 128 | 128/256/512 | 800/1400 | 19 | 210 | 57.6
GeForce 8600M GT | May 10, 2007 | G84M | 80 | 475 | 3.8 | 7.6 | 3.8 | 32 | 950 | 12.8/22.4 | DDR2/GDDR3 | 128 | 128/256/512 | 800/1400 | 22 | 289 | 91.2
GeForce 8700M GT | June 12, 2007 | G84M | 80 | 625 | 5.0 | 10.0 | 5.0 | 32 | 1250 | 25.6 | GDDR3 | 128 | 256/512 | 1600 | 29 | 289 | 120.0
GeForce 8800M GTS[41] | November 19, 2007 | G92M | 65 | 500 | 8.0 | 16.0 | 8.0 | 64 | 1250 | 51.2 | GDDR3 | 256 | 512 | 1600 | 35 | 754 | 240.0
GeForce 8800M GTX[42] | November 19, 2007 | G92M | 65 | 500 | 12.0 | 24.0 | 12.0 | 96 | 1250 | 51.2 | GDDR3 | 256 | 512 | 1600 | 37 | 754 | 360.0
  • The series was succeeded by the GeForce 9 series (which in turn was succeeded by the GeForce 200 series). The GeForce 8400M GS was the only exception, as it was not rebranded into either the GeForce 9 or GeForce 200 series.

Problems


Some chips of the GeForce 8 series (specifically those from the G84 [for example, G84-600-A2] and G86 series) suffer from an overheating problem. Nvidia states this issue should not affect many chips,[43] whereas others assert that all of the chips in these series are potentially affected.[43] Nvidia CEO Jen-Hsun Huang and CFO Marvin Burkett were named in a lawsuit filed on September 9, 2008, alleging their knowledge of the flaw and their intent to hide it.[44]

Support


Nvidia ceased Windows driver support for the GeForce 8 series on April 1, 2016.[45] The final driver versions are:

  • Windows XP 32-bit & Media Center Edition: version 340.52, released July 29, 2014
  • Windows XP 64-bit: version 340.52, released July 29, 2014
  • Windows Vista, 7, 8, 8.1 32-bit: version 342.01 (WHQL), released December 14, 2016
  • Windows Vista, 7, 8, 8.1 64-bit: version 342.01 (WHQL), released December 14, 2016
  • Windows 10, 32-bit: version 342.01 (WHQL), released December 14, 2016
  • Windows 10, 64-bit: version 342.01 (WHQL), released December 14, 2016

from Grokipedia
The GeForce 8 series is the eighth generation of NVIDIA's GeForce graphics processing units (GPUs), based on the Tesla microarchitecture and launched on November 8, 2006, with the high-end GeForce 8800 GTX as its flagship model. This series marked a pivotal shift in GPU design by introducing the industry's first unified shader architecture, which combined vertex, pixel, and later compute shaders into a single, flexible processing model to optimize workload distribution and boost performance in complex rendering scenarios. Built on a 90 nm manufacturing process using the G80 graphics processor, the GeForce 8 series was NVIDIA's initial implementation of DirectX 10 support, enabling advanced features like geometry shaders and improved tessellation for more realistic 3D graphics in games and applications. Key models in the desktop lineup included the premium GeForce 8800 GTX and 8800 Ultra with 768 MB of GDDR3 memory and 384-bit memory interfaces, mid-range options like the GeForce 8800 GTS (320 MB or 640 MB variants) and 8800 GT (512 MB), and value-oriented cards such as the GeForce 8600 GT/GTS, 8500 GT, and 8400 GS. The series also extended to mobile GPUs under the GeForce 8M series, featuring models like the 8600M GT and 8400M GS for laptops, which adapted the unified architecture for power-efficient performance in portable devices. Notable technological advancements included NVIDIA's HD technology for hardware-accelerated decoding with and de-interlacing, as well as support for (SLI) multi-GPU configurations to enhance frame rates in demanding titles. The GeForce 8 series played a crucial role in the transition to next-generation gaming, powering early 10 titles and laying the groundwork for GPU-accelerated computing through , which debuted alongside it to enable general-purpose processing beyond graphics. 
Despite its high power consumption—a TDP of 155 W for the 8800 GTX requiring auxiliary power connectors—it set performance benchmarks that influenced subsequent architectures and solidified NVIDIA's leadership in the discrete GPU market until the succeeded it in 2008.

Introduction

Development and Release

The GeForce 8 series marked NVIDIA's transition from the , driven by the need to support and compete with AMD's forthcoming R600 GPU, while preparing for the launch of in January 2007. This shift emphasized a new unified to handle the advanced rendering requirements of DirectX 10, positioning NVIDIA to lead in high-performance gaming ahead of AMD's entry into the market. NVIDIA announced the G80-based GeForce 8 series at CES 2006, highlighting its 10 capabilities through early demonstrations. The initial release came on November 8, 2006, with the high-end GeForce 8800 GTX, followed shortly by the GeForce 8800 GTS in December 2006. Subsequent models expanded the lineup, including the mid-range GeForce 8600 and 8500 series in April 2007, and the entry-level GeForce 8400 and 8300 series in April 2007. These cards were positioned across market segments for desktop gaming: the 8800 series as premium high-end options for enthusiasts, the 8600 and 8500 series for mainstream users, and the 8400 and 8300 series for entry-level builds. Production of the GeForce 8 series wound down around 2008, with the final models like the 8800 GS released in early 2008, giving the overall series a lifespan from 2006 to 2008.

Architectural Overview

The GeForce 8 series introduced NVIDIA's Tesla microarchitecture, a major redesign that unified the graphics processing pipeline by replacing the separate dedicated vertex and pixel shader units of prior generations with a single, programmable shader type capable of handling vertex, geometry, and pixel operations interchangeably. This unified shader model allows for dynamic load balancing, where processing resources are allocated based on the demands of the current rendering stage, significantly improving utilization and enabling support for emerging standards like geometry shaders. At the heart of the architecture is the streaming multiprocessor (SM), a parallel processing unit; the flagship G80 core, powering high-end models like the GeForce 8800 GTX, incorporates 16 SMs with 8 shader processors each, yielding 128 unified shaders in total. Fabricated initially on TSMC's 90 nm process, the Tesla dies (G80 for high-end, G84 for mid-range, and G86 for entry-level) exemplify a highly scalable design that permits cost-effective variants through partial disabling of processing elements. The G80 die spans 484 mm² with 681 million transistors, while the smaller G84 (169 mm², 289 million transistors) and G86 (127 mm², 210 million transistors) followed on an 80 nm shrink to enhance yields and efficiency without altering the core architectural principles. This scalability allowed the lineup to span performance tiers while maintaining architectural consistency across the series. Connectivity is handled via PCI Express 1.1 with up to 16 lanes, providing up to 8 GB/s of bidirectional bandwidth for data transfer between the GPU and system memory; the interface is backward compatible with PCI Express 1.0a, though some later models had compatibility issues with certain older motherboards.
The rendering pipeline integrates dedicated hardware for key stages, including a transform engine for vertex processing and primitive management, texture units for sampling and filtering, and render output processors (ROPs) for blending and antialiasing. In the G80, for instance, this includes 32 texture units and 24 ROPs to support high-resolution rendering and antialiasing. The architecture also achieves full Direct3D 10 compliance, facilitating advanced feature sets like instanced geometry and higher shader precision.
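As a quick check on the shader counts above, the G80's 128 unified shaders follow directly from its SM layout: 16 SMs times 8 shader processors per SM. A minimal sketch (the helper name is illustrative, not an NVIDIA API):

```python
def total_unified_shaders(sm_count: int, shader_processors_per_sm: int) -> int:
    """Total unified shaders on a Tesla-architecture die."""
    return sm_count * shader_processors_per_sm

# G80 (GeForce 8800 GTX): 16 SMs x 8 shader processors each
assert total_unified_shaders(16, 8) == 128
```

The same arithmetic scales down for the smaller dies, e.g. 4 SMs give the G84's 32 shaders.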

Desktop Graphics Cards

GeForce 8300 and 8400 Series

The GeForce 8300 and 8400 series comprised NVIDIA's entry-level desktop graphics processing units within the GeForce 8 lineup, optimized for low power consumption and basic multimedia tasks such as home theater PC (HTPC) setups and light office productivity. Built on the 80 nm G86 graphics core, these cards emphasized affordability and compatibility with single-slot motherboard designs, drawing under 50 watts to enable fanless or passive cooling configurations in compact systems. The 8300 GS, released in 2007, utilized the G86 core with 128-256 MB of DDR2 memory across a 64-bit bus, targeting users needing reliable video decoding and simple 2D/3D acceleration without demanding gaming capabilities. Its single-slot form factor and low thermal output made it ideal for HTPCs and office environments where space and noise were concerns. In comparison, the 8400 GS shared the same G86 core but offered enhanced variants with up to 512 MB of GDDR3 memory for improved bandwidth in video playback scenarios. It included options for silent operation and support for NVIDIA's Hybrid SLI technology, which allowed pairing with compatible integrated motherboard GPUs to boost overall graphics performance in multi-GPU configurations. Desktop-focused models like the standard 8400 GS were common for similar light-duty applications. These cards provided modest performance gains over the prior generation's entry-level parts, delivering approximately 20-30% better frame rates in DirectX 9-based games at low resolutions (e.g., 1024x768) compared to equivalents like the GeForce 7300 GS, based on aggregate benchmark data. The unified architecture also enabled basic DirectX 10 compatibility for future-proofing entry-level setups. At launch, street prices ranged from $50 to $80 USD, positioning them as accessible upgrades for integrated graphics users.

GeForce 8500 and 8600 Series

The GeForce 8500 GT and GeForce 8600 series represented NVIDIA's mid-range offerings in the GeForce 8 lineup, targeting mainstream gamers seeking DirectX 10 support and improved multimedia capabilities without the premium cost of flagship models. Launched on April 17, 2007, these GPUs balanced performance for resolutions up to 1024x768 in gaming scenarios, leveraging the Tesla architecture's unified shaders for enhanced efficiency in both graphics and compute tasks. The GeForce 8500 GT utilized the G86 graphics core, fabricated on an 80 nm process, featuring 256 MB of GDDR3 memory on a 128-bit bus and 16 unified shaders. With a core clock of 450 MHz and a memory clock of 400 MHz (effective 800 MHz), it was designed primarily for entry-to-mid-level gaming at 1024x768 resolution, delivering playable frame rates in contemporary titles like World of Warcraft and The Elder Scrolls IV: Oblivion. In contrast, the 8600 GS and GT models employed the more capable G84 core, also on 80 nm, supporting up to 512 MB of GDDR3 memory on a 128-bit interface and up to 32 unified shaders for superior rasterization and texturing performance. The 8600 GT variant, clocked at 540 MHz core and 700 MHz memory (1400 MHz effective), included SLI support for dual-card configurations, enabling enthusiasts to scale performance in compatible games and applications. The 8600 GS, a lower-shader-count variant with 16 unified shaders, offered a budget-friendly alternative with similar architecture but reduced capabilities. These cards typically featured dual-slot cooling solutions on GT models to manage thermal loads during extended sessions. A standout feature across the 8500 and 8600 series was the PureVideo HD VP2 video processor, which provided full hardware acceleration for H.264 decoding, offloading virtually the entire decode workload from the CPU for smooth playback of high-definition content like Blu-ray and HD DVD.
The VP2 also supported VC-1 and MPEG-2 decoding with advanced de-interlacing, making these GPUs ideal for media center PCs. In benchmarks, the 8600 GT demonstrated competitive performance against AMD's Radeon X1950 series in DirectX 9 titles, outperforming the X1950 Pro by an average of 28% across aggregate tests at 1024x768. However, in early DirectX 10 previews, the 8600 GT lagged behind higher-end cards due to its mid-range shader count and memory bandwidth, achieving 20-30% lower frame rates compared to the 8800 series. At launch, pricing positioned these GPUs as accessible mid-range options, with the GeForce 8500 GT at $89-129 USD, the 8600 GT at $149-159 USD, and the 8600 GS slightly below the GT, appealing to value-conscious consumers in 2007.
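The "effective" memory clocks quoted above follow the double-data-rate convention: GDDR3 (and DDR2) transfer data on both clock edges, so the effective rate is twice the base memory clock. A small sketch of that convention:

```python
def effective_ddr_clock_mhz(base_clock_mhz: float) -> float:
    """GDDR3/DDR2 move data on both clock edges (double data rate)."""
    return base_clock_mhz * 2

assert effective_ddr_clock_mhz(400) == 800    # GeForce 8500 GT memory
assert effective_ddr_clock_mhz(700) == 1400   # GeForce 8600 GT memory
```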

GeForce 8800 Series

The GeForce 8800 series represented NVIDIA's flagship desktop graphics processing units within the GeForce 8 lineup, targeting high-end gaming and professional applications with support for DirectX 10 and advanced multi-GPU configurations. Introduced as the pinnacle of the series, these GPUs utilized the G80 and later G92 cores, emphasizing unified shaders for enhanced performance in complex rendering tasks. The lineup began with the high-performance 8800 GTX and expanded to include more accessible variants, all featuring SLI connectivity to enable dual-GPU setups capable of driving resolutions up to 2560x1600 for immersive visuals. The 8800 GTX, based on the G80 core fabricated on a 90 nm process, launched on November 8, 2006, at a manufacturer-suggested retail price of $599 USD, marking it as NVIDIA's first DirectX 10-compliant desktop GPU. It featured 128 unified stream processors, a 575 MHz core clock, and 768 MB of GDDR3 memory on a 384-bit interface, delivering 86.4 GB/s of bandwidth for demanding workloads like high-definition video playback and shader-intensive games. This model set performance benchmarks for its era, with SLI configurations providing scalable power for enthusiasts seeking maximum frame rates at elevated settings. Following the GTX, the 8800 GTS variants offered cut-down configurations of the G80 core to broaden market appeal. The 640 MB model, with 96 stream processors and 20 ROPs on a 320-bit bus, arrived alongside the GTX at around $449 USD, while a 320 MB version followed on February 12, 2007, at roughly $299 USD. Later, the 512 MB 8800 GTS, shifting to the 65 nm G92 core with 128 stream processors and 16 ROPs, debuted on December 11, 2007, priced at $349 USD, providing a cost-effective refresh with improved efficiency over the original G80-based designs. All GTS models supported SLI for enhanced frame rates and high-resolution gaming.
The 8800 GT, utilizing the 65 nm G92 core, served as a mid-cycle refresh bridging the 8800 series to the subsequent GeForce 9 lineup, launching on October 29, 2007, with the 512 MB model priced around $249 USD, though launch demand often pushed street prices higher. Equipped with 112 unified stream processors, a 600 MHz core clock, and 512 MB of GDDR3 memory on a 256-bit bus, it delivered strong value for gaming while maintaining full SLI compatibility. Low-end extensions included the 8800 GS, a G92 variant with 96 stream processors and 384 MB of GDDR3, released on January 31, 2008, aimed at budget upgrades. Additionally, the short-lived 8800 Ultra, an overclocked iteration of the GTX with a 612 MHz core and identical 768 MB GDDR3 setup, launched on May 2, 2007, at $829 USD, targeting extreme enthusiasts before being quickly overshadowed by newer architectures. Early reviews also noted that G92-based cards could run into compatibility and bandwidth limitations on some older PCI Express motherboards, particularly in SLI modes.
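The 86.4 GB/s figure quoted for the 8800 GTX can be reproduced from its 384-bit interface and GDDR3 data rate of 1.8 Gbps per pin (1800 MT/s); a sketch of the arithmetic:

```python
def peak_memory_bandwidth_gbs(bus_width_bits: int, transfers_mts: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer x million transfers/s."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * transfers_mts / 1000

# GeForce 8800 GTX: 384-bit interface, 1800 MT/s GDDR3
assert peak_memory_bandwidth_gbs(384, 1800) == 86.4
```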

Mobile Graphics Processors

GeForce 8200M to 8600M Series

The GeForce 8200M and 8200M G were entry-level mobile graphics processors introduced as part of NVIDIA's GeForce 8M series, targeting ultraportable laptops and netbooks with performance a step above integrated graphics. Based on the MCP79MVL chipset integrated into motherboards, these GPUs featured shared system memory up to 256 MB of DDR2 or DDR3, a 400 MHz core clock, and support for DirectX 10 with Shader Model 4.0, all fabricated on an 80 nm process. Released on June 3, 2008, they emphasized passive cooling and low power consumption around 12 W TDP, making them suitable for thin-and-light designs without dedicated fans. The GeForce 8400M series, including the GS and GT variants, represented a step up in the low-end mobile segment, utilizing the G86 core adapted for laptops. Launched on May 9, 2007, these GPUs supported up to 512 MB of GDDR3 memory on a 128-bit bus (though commonly configured with 128-256 MB), with core clocks of 400 MHz for the GS and 450 MHz for the GT, alongside TurboCache technology to extend effective memory capacity using system RAM. Aimed at business and multimedia laptops, they delivered DirectX 10 compatibility for light gaming and video acceleration, with TDPs of 14-20 W to balance performance and battery life; the GS model prioritized efficiency for extended runtime. Benchmarks from the era showed playable frame rates in older titles like Doom 3 and F.E.A.R. at 1024x768 resolution with reduced settings, outperforming integrated graphics of the time while maintaining portability. Building on this, the 8600M GS and GT provided mid-range capabilities within the power-constrained mobile environment, employing the G84 core with 16 unified shaders for the GS and 32 for the GT. Introduced on May 1, 2007, they offered up to 512 MB of GDDR3 memory on a 128-bit interface, core clocks up to 475 MHz for the GT, and enhanced texture units for smoother DirectX 10 rendering in games.
With TDPs of 20-25 W, the GS variant focused on power efficiency for longer battery sessions, while the GT targeted higher frame rates in gaming-oriented laptops such as the Dell XPS M1530. These GPUs were commonly integrated into systems from OEMs like HP, Dell, and Apple, enabling playable performance at 1024x768 in contemporary titles on low settings, though limited by thermal envelopes compared to desktop counterparts. Support for mobile Hybrid SLI allowed pairing with integrated graphics for modest multi-GPU boosts in select configurations.

GeForce 8700M and 8800M Series

The GeForce 8700M GT, based on a mobile variant of the G84 core, served as a high-end mobile GPU targeted at gaming laptops, featuring 32 unified shaders, a core clock up to 625 MHz, and a shader clock of 1250 MHz. It supported 256 MB to 512 MB of GDDR3 memory on a 128-bit bus with speeds up to 800 MHz, delivering 25.6 GB/s of bandwidth, and operated at a thermal design power (TDP) ranging from 25 W to 35 W. Released in June 2007 and integrated into systems from manufacturers like Alienware and MSI, it emphasized power efficiency through NVIDIA's PowerMizer technology while enabling DirectX 10 gaming. The 8800M series represented NVIDIA's flagship mobile adaptation of the GeForce 8 architecture, with the 8800M GTX and GTS models launched in November 2007 as MXM-upgradable modules for premium laptops. The 8800M GTX utilized 96 unified shaders, a 256-bit GDDR3 memory interface supporting up to 1 GB at 800 MHz (51.2 GB/s bandwidth), and a TDP of 65 W, while the 8800M GTS offered a scaled-down configuration with 64 shaders, 512 MB of memory, and a 50 W TDP. These GPUs powered high-resolution gaming, including support for SLI configurations in select notebooks from vendors like Alienware, Clevo, and Sager, allowing dual-GPU setups for resolutions up to 1920x1200. In performance evaluations, the 8800M GTX delivered frame rates comparable to the desktop GeForce 8600 GT in DirectX 10 titles, achieving around 35 fps at medium settings and 1024x768 resolution in demanding games, though it required lowered details in heavy scenes to maintain playability. SLI variants further boosted output, enabling high-detail DirectX 10 gaming in notebooks with adequate cooling. Production of the 8700M and 8800M series concluded by early 2008, coinciding with the introduction of the GeForce 9M lineup, which succeeded these chips in the mobile high-end segment.

Key Features and Technologies

Shader Architecture and DirectX Support

The GeForce 8 series introduced NVIDIA's Tesla microarchitecture, featuring a unified shader model that consolidated vertex, geometry, and pixel processing into a single programmable pipeline. This design utilized streaming multiprocessors (SMs), each equipped with 8 shader processors (also referred to as ALUs) capable of handling diverse shader tasks dynamically. The architecture supported concurrent execution of warps (groups of 32 threads), enabling efficient parallel processing across shader types without dedicated hardware silos, which improved resource utilization in varied workloads. A key aspect of this unified approach was its full support for DirectX 10, including geometry shaders that allowed developers to generate or modify primitives on the GPU, stream output for routing shader-generated data to memory buffers, and Shader Model 4.0 for enhanced programmability with features like integer operations and increased instruction limits (up to 64,000 per shader). DirectX 10 functionality required Windows Vista or later, as it relied on the operating system's updated graphics stack. The series also pioneered CUDA 1.0, NVIDIA's Compute Unified Device Architecture, which extended the unified shaders to general-purpose GPU (GPGPU) computing beyond graphics rendering. This enabled developers to write parallel programs for non-graphics tasks, such as scientific simulations, leveraging the same shader hardware with 16 KB of shared memory per SM for fast thread communication within thread blocks. Compared to DirectX 9-era architectures, the unified model delivered up to 11× scaling in specific shader operations on the GeForce 8800 versus the prior GeForce 7900 GTX, attributed to better load balancing, while retaining compatibility with legacy DirectX 9 and earlier APIs. Complementing the shader advancements, later GeForce 8 series chips incorporated the third-generation PureVideo HD (VP3) engine, a dedicated unit for hardware-accelerated video decoding and post-processing.
VP3 supported advanced spatial-temporal de-interlacing to convert interlaced HD content (in formats such as H.264, VC-1, MPEG-2, and WMV9) into progressive frames for smoother playback on modern displays, alongside features like noise reduction, edge enhancement, and inverse telecine for 2:2/3:2 pull-down correction. This integration offloaded video tasks from the CPU, enhancing efficiency for high-definition media consumption.
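The warp-based execution model described above has a simple consequence for occupancy: a thread block of N threads occupies ceil(N/32) warps, with the last warp partially filled whenever N is not a multiple of 32. A minimal sketch of that arithmetic:

```python
import math

def warps_per_block(threads: int, warp_size: int = 32) -> int:
    """Number of warps a thread block occupies on Tesla-class hardware."""
    return math.ceil(threads / warp_size)

assert warps_per_block(128) == 4
assert warps_per_block(100) == 4   # last warp runs partially filled
```

Partially filled warps still consume a full warp's issue slots, which is why block sizes that are multiples of 32 were (and remain) the usual recommendation for CUDA kernels.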

Memory and Interface Innovations

The GeForce 8 series marked a significant advancement in memory subsystems by adopting GDDR3 SDRAM across much of its lineup, offering higher bandwidth than the GDDR2 and DDR2 memory types of prior generations. The flagship GeForce 8800 GTX utilized 768 MB of GDDR3 memory on a wide 384-bit interface, with effective data rates reaching 1.8 Gbps per pin, resulting in a peak bandwidth of 86.4 GB/s that supported the demands of unified shader processing for enhanced texture and pixel operations. Lower-end models, such as those in the 8400 series, incorporated NVIDIA's TurboCache technology to supplement limited onboard memory (typically 256 MB of DDR2) with shared system RAM, effectively expanding available graphics memory depending on the host system's configuration, which proved beneficial for multimedia and light gaming tasks. A key innovation in multi-GPU configurations was the introduction of Hybrid SLI, which enabled dynamic collaboration between a discrete GeForce 8 series GPU and an integrated GPU on compatible motherboards, allowing seamless performance scaling for graphics-intensive applications. This technology included HybridPower functionality, which automatically switched to the lower-power integrated GPU for non-demanding tasks on systems featuring 8400 or 8600 series cards, thereby optimizing resource allocation without manual intervention. Display connectivity saw improvements with support for dual dual-link DVI outputs capable of resolutions up to 2560×1600, facilitating high-fidelity visuals on large flat-panel monitors across select 8800 and 8600 models. Additionally, the series offered HDMI and DVI outputs with HDCP (High-bandwidth Digital Content Protection) for secure playback of high-definition content, including Blu-ray and HD DVD, while HDTV output was enabled via DVI adapters for component connections, broadening compatibility with home theater setups.
The GeForce 8 series connected via the PCI Express 1.1 x16 interface, providing up to 8 GB/s of bidirectional bandwidth in full configuration to handle data transfers between the GPU and system memory. For mobile implementations, the GeForce 8800M series adopted the MXM (Mobile PCI Express Module) II form factor, a standardized upgradeable slot that allowed users to replace the GPU module in compatible notebooks, such as swapping an 8800M GTS for the higher-performance 8800M GTX without requiring a full system overhaul. This modularity, limited to systems with MXM II support, represented an early step toward user-serviceable mobile graphics upgrades.
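The 8 GB/s bidirectional figure above can be derived from the PCIe 1.x link parameters: each lane signals at 2.5 GT/s with 8b/10b encoding (8 data bits per 10 bits transferred), giving 250 MB/s per lane per direction. A sketch of that derivation:

```python
def pcie1_bidirectional_gbs(lanes: int) -> float:
    """Aggregate bidirectional bandwidth of a PCIe 1.x link in GB/s."""
    per_lane_gbs = 2.5 * (8 / 10) / 8   # 2.5 GT/s, 8b/10b -> 0.25 GB/s one way
    return lanes * per_lane_gbs * 2      # both directions

assert pcie1_bidirectional_gbs(16) == 8.0   # x16 link, as quoted above
```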

Performance and Compatibility Issues

Hardware Limitations and Bugs

The GeForce 8800 GT and GTS 512 MB models could run into PCIe interface problems on some older PCIe 1.0a motherboards, in certain cases negotiating bandwidth equivalent to x8 lanes in an x16 slot and losing roughly 10-15% of performance in bandwidth-bound gaming scenarios. Early production runs of the G80-based GeForce 8800 GTX experienced significant overheating problems, causing visual artifacts such as colored lines or glitches during intensive use, which NVIDIA addressed through widespread RMA replacements and vBIOS updates to improve thermal management. SLI configurations with the 8800 series required NVIDIA-certified motherboards to ensure proper PCIe lane allocation and stability, as non-certified boards often failed to support dual-GPU setups adequately, leading to frame pacing inconsistencies in DirectX 10 applications. In mobile implementations, the 8800M GTX was prone to thermal throttling in slim laptops due to constrained cooling solutions, sometimes necessitating modifications to adjust power limits and fan curves for sustained performance. Additionally, the 320 MB variant of the GeForce 8800 GTS exhibited VRAM overheating under prolonged load, contributing to instability, while early drivers for the series lacked reliable multi-monitor spanning support, preventing seamless desktop extension across displays without configuration workarounds.

Driver Support and End-of-Life

The GeForce 8 series received its initial official driver support through NVIDIA ForceWare version 97.02, released on November 8, 2006, which introduced DirectX 10 support for the newly launched GeForce 8800 GTX and 8800 GTS GPUs. These drivers marked the beginning of the software ecosystem tailored to the series' unified architecture and advanced features like SLI support. Over the subsequent years, the driver lineage evolved from the 97.xx branch through the 100.xx and 200.xx series, reaching the 300.xx branch by mid-2010, incorporating optimizations for emerging games and Windows operating systems while maintaining compatibility with GeForce 8 hardware. NVIDIA's support for the GeForce 8 series transitioned to legacy status with the R340 driver branch, culminating in the final official release of version 342.01 on December 14, 2016, which provided security updates but no new features. Earlier in 2016, specifically on April 1, NVIDIA ceased active development and bug fixes for the series across all platforms, with full support ending alongside the R340 updates. While Microsoft extended OS-level security patches for Windows 7 until January 2020, no additional driver updates were issued after 2016, leaving the hardware reliant on the final R340 release for stability and without new performance enhancements after approximately 2013. In modern operating systems like Windows 10 and Windows 11, GeForce 8 series GPUs operate in legacy mode or via Microsoft's basic display adapter fallback, limited to DirectX 9-era functionality due to the absence of official drivers meeting DirectX 12 or later requirements. The series lacks native support for Vulkan, which requires Kepler-generation hardware or newer, and is capped at OpenGL 3.3, preventing compatibility with applications demanding OpenGL 4.x or higher. Community-driven modifications, such as modified INF files or repackaged legacy drivers, have been developed to enable basic compatibility tweaks, but these are unofficial, unverified for security, and not endorsed by NVIDIA.

Technical Specifications

Core Configurations

The GeForce 8 series utilized several GPU cores based on NVIDIA's Tesla architecture, with configurations varying by performance tier and form factor. Desktop variants primarily employed the G80, G92, G84, and G86 chips, while mobile implementations adapted these for power efficiency, often with reduced clocks and pipelines to fit thermal constraints. These cores featured unified shader processors, texture mapping units (TMUs), and render output units (ROPs), enabling scalable performance across models from high-end to entry-level. Key differences included process nodes, transistor counts, and die sizes, which influenced power draw and cost. The flagship G80, built on a 90 nm process, contained 681 million transistors across a 484 mm² die, supporting up to 128 unified shaders. In contrast, the later G92 shifted to a 65 nm process for better efficiency, packing 754 million transistors into a 324 mm² die with 112 shaders in its primary configuration. Mid-range G84 and entry-level G86 cores used an 80 nm process, with 289 million transistors on a 169 mm² die for the G84 and 210 million on a 127 mm² die for the G86, featuring fewer shaders (32 and 16, respectively).
| Core | Process Node | Die Size (mm²) | Transistors (millions) | Example Model | Core Clock (MHz) | Shader Clock (MHz) | Shaders | TMUs | ROPs |
|------|--------------|----------------|------------------------|---------------|------------------|--------------------|---------|------|------|
| G80  | 90 nm        | 484            | 681                    | 8800 GTX      | 575              | 1350               | 128     | 32   | 24   |
| G92  | 65 nm        | 324            | 754                    | 8800 GT       | 600              | 1500               | 112     | 56   | 16   |
| G84  | 80 nm        | 169            | 289                    | 8600 GT       | 540              | 1188               | 32      | 16   | 8    |
| G86  | 80 nm        | 127            | 210                    | 8400 GS       | 520              | 1300               | 16      | 8    | 4    |
Pipeline configurations scaled with the core's capabilities, directly impacting rendering throughput. For instance, the 8800 GTX's 24 ROPs enabled a pixel fill rate of 13.8 Gpixels/s, calculated as the core clock multiplied by the number of ROPs (575 MHz × 24). Lower-tier models like the 8400 GS, with 4 ROPs, achieved proportionally reduced rates around 2.08 Gpixels/s at similar clocks, prioritizing efficiency over peak performance. These setups allowed the series to support DirectX 10 while balancing complexity and yield. Mobile cores mirrored desktop designs but operated at lower clocks and power envelopes to suit laptop integration, with the G87 and G84M variants emphasizing thermal management. The G87, used in high-end models like the 8800M GTX, employed a 65 nm process with approximately 410 million transistors on a 250 mm² die, featuring 96 shaders at a 500 MHz core clock and a 65 W TDP. G84M implementations, such as in the 8600M GT, featured 32 shaders at a 475 MHz core clock, with variants ranging up to approximately 600 MHz in optimized systems, and a TDP around 25-40 W.
| Core | Process Node | Example Model | Core Clock Range (MHz) | Shader Clock (MHz) | Shaders | TMUs | ROPs | TDP (W) |
|------|--------------|---------------|------------------------|--------------------|---------|------|------|---------|
| G87  | 65 nm        | 8800M GTX     | 500                    | 1250               | 96      | 48   | 16   | 65      |
| G84M | 80 nm        | 8600M GT      | 400-600                | 950                | 32      | 16   | 8    | 25-40   |
These mobile configurations retained the desktop feature set but reduced pipeline counts and clocks to achieve fill rates suitable for portable use, such as 8 Gpixels/s for the 8800M GTX (500 MHz × 16 ROPs), ensuring compatibility with desktop software while minimizing heat output.
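The fill-rate arithmetic used above is the same for desktop and mobile parts: pixel fill rate in Gpixels/s is the core clock in MHz multiplied by the ROP count, divided by 1000. A minimal sketch:

```python
def pixel_fill_gpixels(core_clock_mhz: float, rops: int) -> float:
    """Theoretical pixel fill rate in Gpixels/s."""
    return core_clock_mhz * rops / 1000

assert pixel_fill_gpixels(575, 24) == 13.8   # GeForce 8800 GTX (desktop)
assert pixel_fill_gpixels(500, 16) == 8.0    # GeForce 8800M GTX (mobile)
```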

Power and Thermal Characteristics

The GeForce 8 series GPUs varied significantly in power consumption, with high-end desktop models demanding substantial electrical input compared to mid-range and entry-level variants. The flagship GeForce 8800 GTX carried a thermal design power (TDP) of 155 W, requiring a minimum 450 W power supply unit and two 6-pin PCIe connectors to supplement the 75 W provided by the PCIe slot. In comparison, the GeForce 8600 GT operated at a more modest TDP of 47 W, typically relying solely on PCIe slot power without additional connectors. Entry-level models like the GeForce 8400 GS further reduced demands, with TDPs around 40 W and no need for external power. For mobile implementations, the high-end GeForce 8800M GTX targeted a TDP of 65 W, though certain configurations could approach 95 W under maximum load due to integration constraints. High-end models generally used 6-pin or 6+2-pin connectors where applicable, while low-end variants omitted them entirely to simplify integration. Efficiency metrics for the series highlighted the trade-offs of the 90 nm G80 architecture, with the GeForce 8800 GTX delivering approximately 2.23 GFLOPS per watt based on its 345.6 GFLOPS peak floating-point performance and 155 W TDP. Subsequent process shrinks to 65 nm in the G92-based models, such as the GeForce 8800 GT, improved this to around 3.20 GFLOPS per watt (336 GFLOPS at 105 W TDP), reflecting better power utilization through reduced transistor leakage and optimized clocking. These gains were incremental but notable for the era, enabling sustained performance without proportional increases in heat output. Cooling solutions were tailored to each model's power profile, with NVIDIA's reference designs emphasizing reliability under load. The 8800 series employed a dual-slot active cooler on high-end cards like the 8800 GTX to dissipate up to 155 W effectively, maintaining core temperatures below 80°C in stock operation.
Lower-power options, such as the 8400 GS, often utilized passive heatsinks for silent operation, leveraging ambient case airflow for TDPs under 50 W. Mobile high-end variants in the 8800M series incorporated vapor chamber technology in premium laptops to evenly distribute heat across compact chassis, preventing hotspots and supporting sustained 65 W operation without excessive throttling. Overclocking potential was constrained by thermal limits, particularly on the power-hungry G80 core. The GeForce 8800 GTX could reach core clocks around 700 MHz with aftermarket cooling solutions, a roughly 22% increase over the stock 575 MHz, but this often pushed temperatures toward 110°C under prolonged loads without enhanced airflow or undervolting. Such extremes risked thermal throttling or a reduced lifespan, underscoring the need for robust cooling upgrades on high-TDP models.
| Model             | TDP (W)           | Power Connectors              | Recommended PSU (W) |
|-------------------|-------------------|-------------------------------|---------------------|
| GeForce 8800 GTX  | 155               | 2x 6-pin                      | 450                 |
| GeForce 8600 GT   | 47                | None                          | 300                 |
| GeForce 8800M GTX | 65 (up to 95 max) | Integrated (laptop-dependent) | N/A                 |
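The efficiency figures quoted above (GFLOPS per watt) follow directly from peak single-precision throughput divided by TDP; a sketch of the calculation:

```python
def gflops_per_watt(peak_gflops: float, tdp_watts: float) -> float:
    """Rough efficiency metric: peak GFLOPS divided by TDP."""
    return round(peak_gflops / tdp_watts, 2)

assert gflops_per_watt(345.6, 155) == 2.23   # 8800 GTX (90 nm G80)
assert gflops_per_watt(336.0, 105) == 3.2    # 8800 GT (65 nm G92)
```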
