GeForce FX series
Top: Logo of the series. Bottom: A GeForce FX 5950 Ultra released in 2003, the series' flagship model.

| Release date | January 27, 2003 |
|---|---|
| Codename | NV30, NV31, NV34, NV35, NV36, NV38 |
| Architecture | Rankine |
| Models | GeForce FX series |
| Entry-level cards | FX 5100, FX 5200, FX 5200 LE, FX 5300, FX 5500 |
| Mid-range cards | FX 5600, FX 5700, PCX 5750 |
| High-end cards | FX 5800, FX 5900, PCX 5950 |
| Enthusiast cards | FX 5800 Ultra, FX 5900 Ultra, FX 5950 Ultra |
| Direct3D support | Direct3D 9.0a, Shader Model 2.0a |
| OpenGL support | OpenGL 2.1 |
| Predecessor | GeForce 4 series |
| Successor | GeForce 6 series |
| Support status | Unsupported |
The GeForce FX or "GeForce 5" series (codenamed NV30) is a line of graphics processing units from the manufacturer Nvidia.
Overview
Nvidia's GeForce FX series is the fifth generation of the GeForce line. With GeForce 3, the company introduced programmable shader functionality into their 3D architecture, in line with the release of Microsoft's DirectX 8.0. The GeForce 4 Ti was an enhancement of the GeForce 3 technology. With real-time 3D graphics technology continually advancing, the release of DirectX 9.0 brought further refinement of programmable pipeline technology with the arrival of Shader Model 2.0. The GeForce FX series is Nvidia's first generation Direct3D 9-compliant hardware.
The series was manufactured on TSMC's 130 nm fabrication process.[1] It is compliant with Shader Model 2.0/2.0A, allowing more flexibility in complex shader/fragment programs and much higher arithmetic precision. It supports a number of new memory technologies, including DDR2, GDDR2 and GDDR3 and saw Nvidia's first implementation of a memory data bus wider than 128 bits.[2] The anisotropic filtering implementation has potentially higher quality than previous Nvidia designs.[1] Anti-aliasing methods have been enhanced and additional modes are available compared to GeForce 4.[1] Memory bandwidth and fill-rate optimization mechanisms have been improved.[1] Some members of the series offer double fill-rate in z-buffer/stencil-only passes.[2]
The series also brought improvements to the company's video processing hardware, in the form of the Video Processing Engine (VPE), which was first deployed in the GeForce 4 MX.[3] The primary addition, compared to previous Nvidia GPUs, was per-pixel video-deinterlacing.[3]
The initial version of the GeForce FX (the 5800) was one of the first cards to come equipped with a large dual-slot cooler. Called "Flow FX", the cooler was very large in comparison to ATI's small, single-slot cooler on the 9700 series.[4] It was jokingly referred to as the "Dustbuster", due to a high level of fan noise.[5]
The advertising campaign for the GeForce FX featured Dawn, a demo created by several veterans of the computer-animated film Final Fantasy: The Spirits Within.[6] Nvidia touted it as "The Dawn of Cinematic Computing".[7]
Nvidia debuted a new campaign to motivate developers to optimize their titles for Nvidia hardware at the Game Developers Conference (GDC) in 2002. In exchange for prominently displaying the Nvidia logo on the outside of the game packaging, the company offered free access to a state-of-the-art test lab in Eastern Europe that tested against 500 different PC configurations for compatibility. Developers also had extensive access to Nvidia engineers, who helped produce code optimized for the company's products.[8]
Hardware based on the NV30 project didn't launch until near the end of 2002, several months after ATI had released their competing DirectX 9 architecture.[9]
Overall performance
GeForce FX is an architecture designed with DirectX 7, 8 and 9 software in mind. Its performance for DirectX 7 and 8 was generally equal to ATI's competing products with the mainstream versions of the chips, and somewhat faster in the case of the 5900 and 5950 models, but it is much less competitive across the entire range for software that primarily uses DirectX 9 features.[10]
Its weak performance in processing Shader Model 2 programs is caused by several factors. The NV3x design has less overall parallelism and calculation throughput than its competitors.[11] It is more difficult, compared to the GeForce 6 and ATI Radeon R300 series, to achieve high efficiency with the architecture, due to architectural weaknesses and a resulting heavy reliance on optimized pixel shader code. While the architecture was compliant overall with the DirectX 9 specification, it was optimized for performance with 16-bit floating-point shader code, which is below the 24-bit minimum precision that the standard requires. When 32-bit shader code is used, the architecture's performance is severely hampered.[11] Proper instruction ordering and instruction composition of shader code is critical for making the most of the available computational resources.[11]
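The precision trade-off can be illustrated numerically. The following is a minimal numpy sketch, an illustration only and not an emulation of the NV3x pipeline, in which `float16` and `float32` stand in for the FP16 and FP32 shader precisions; it shows the kind of rounding error that made FP16 the fast but lower-quality path:

```python
# Illustration only: numpy float16/float32 stand in for FP16/FP32 precision.
import numpy as np

def normalize(vec, dtype):
    """Normalize a vector at the given precision, as a fragment shader might."""
    vec = vec.astype(dtype)
    return vec / np.sqrt(np.sum(vec * vec))

v = np.array([0.051, 0.049, 0.9], dtype=np.float32)

full = normalize(v, np.float32)  # full precision (the slow path on NV3x)
half = normalize(v, np.float16)  # partial precision (the fast path)

print("FP32:", full)
print("FP16:", half.astype(np.float32))
print("max abs error:", np.max(np.abs(full - half.astype(np.float32))))
```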
Hardware refreshes and diversification
Nvidia's initial release, the GeForce FX 5800, was intended as a high-end part. At the time, there were no GeForce FX products for the other segments of the market. The GeForce 4 MX continued in its role as the budget video card and the older GeForce 4 Ti cards filled in the mid-range.
In April 2003, the company introduced the GeForce FX 5600 and the GeForce FX 5200 to address the other market segments. Each had an "Ultra" variant and a slower, budget-oriented variant, and all used conventional single-slot cooling solutions. The 5600 Ultra had respectable performance overall, but it was slower than the Radeon 9600 Pro and sometimes slower than the GeForce 4 Ti series.[12] The FX 5200 did not perform as well as the DirectX 7.0 generation GeForce 4 MX440 or Radeon 9000 Pro in some benchmarks.[13]
In May 2003, Nvidia launched the GeForce FX 5900 Ultra, a new high-end product to replace the low-volume and disappointing FX 5800. Based upon a revised GPU called NV35, which fixed some of the DirectX 9 shortcomings of the discontinued NV30, this product was more competitive with the Radeon 9700 and 9800.[14] In addition to redesigning parts of the GPU, the company moved to a 256-bit memory data bus, allowing for significantly higher memory bandwidth than the 5800 even when utilizing more common DDR SDRAM instead of DDR2.[14] The 5900 Ultra performed somewhat better than the Radeon 9800 Pro in games not heavily using shader model 2, and had a quieter cooling system than the 5800.[14]
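The bandwidth gain from the wider bus is straightforward to verify. Below is a minimal Python sketch of the peak-bandwidth arithmetic (the function name is illustrative), using the clock and bus-width figures from the specification table later in this article:

```python
# Peak memory bandwidth = bus width (in bytes) x transfers per second.
def bandwidth_gbps(bus_width_bits, mem_clock_mhz, transfers_per_clock=2):
    """Double-data-rate memories transfer twice per clock."""
    return (bus_width_bits / 8) * (mem_clock_mhz * 1e6 * transfers_per_clock) / 1e9

print(bandwidth_gbps(128, 500))  # FX 5800 Ultra, 128-bit GDDR2 at 500 MHz -> 16.0
print(bandwidth_gbps(256, 425))  # FX 5900 Ultra, 256-bit DDR at 425 MHz  -> 27.2
```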
In October 2003, Nvidia released the GeForce FX 5700 and GeForce FX 5950. The 5700 was a mid-range card using the NV36 GPU with technology from NV35 while the 5950 was a high-end card again using the NV35 GPU but with additional clock speed. The 5950 also featured a redesigned version of the 5800's FlowFX cooler, this time using a larger, slower fan and running much quieter as a result. The 5700 provided strong competition for the Radeon 9600 XT in games limited to light use of shader model 2.[15] The 5950 was competitive with the Radeon 9800 XT, again as long as pixel shaders were lightly used.[16]
In December 2003, the company launched the GeForce FX 5900 XT, a graphics card intended for the mid-range segment. It was similar to the 5900 Ultra, but clocked slower and used slower memory. It competed more directly with the Radeon 9600 XT, but was still behind in a few shader-intense scenarios.[17]
The GeForce FX line moved to PCI Express in early 2004 with a number of models, including the PCX 5300, PCX 5750, PCX 5900, and PCX 5950. These cards were largely the same as their AGP predecessors with similar model numbers. To operate on the PCIe bus, an AGP-to-PCIe "HSI bridge" chip on the video card converted the PCIe signals into AGP signals for the GPU.[18]
Also in 2004, the NV34-based GeForce FX 5200/5300 series gained a new member, the FX 5500.[19]
GeForce FX model information
- All models support OpenGL 1.5 in hardware (OpenGL 2.1 in software with the latest drivers)
- The GeForce FX series runs vertex shaders in an array
| Model | Launch | Code name | Fab | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOperations/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | Processing power (GFLOPS) | TDP (Watts) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce FX 5100 | Mar 2003 | NV34 | TSMC 150 nm | 45[21] | 124 | AGP 8x | 200 | 166 | 4:2:4:4 | 800 | 800 | 800 | 100.0 | 64, 128 | 2.6 | DDR | 64 | 12.0 | ? |
| GeForce FX 5200 LE | Mar 2003 | NV34 | TSMC 150 nm | 45 | 124 | AGP 8x | 250 | 166 | 4:2:4:4 | 1,000 | 1,000 | 1,000 | 125.0 | 64, 128, 256 | 2.6, 5.3 | DDR | 64, 128 | 15.0 | ? |
| GeForce FX 5200 | Mar 2003 | NV34 | TSMC 150 nm | 45 | 124 | AGP 8x, PCI | 250 | 200 | 4:2:4:4 | 1,000 | 1,000 | 1,000 | 125.0 | 64, 128, 256 | 3.2, 6.4 | DDR | 64, 128 | 15.0 | 21 |
| GeForce FX 5200 Ultra | 6 Mar 2003 | NV34 | TSMC 150 nm | 45 | 124 | AGP 8x | 325 | 325 | 4:2:4:4 | 1,300 | 1,300 | 1,300 | 162.5 | 64, 128, 256 | 10.4 | DDR | 128 | 19.5 | 32 |
| GeForce PCX 5300 | 17 Mar 2004 | NV34 | TSMC 150 nm | 45 | 124 | PCIe x16 | 250 | 166 | 4:2:4:4 | 1,000 | 1,000 | 1,000 | 125.0 | 128, 256 | 2.6 | DDR | 64 | 15.0 | 21 |
| GeForce FX 5500 | Mar 2004 | NV34B | TSMC 150 nm | 45[22] | 91 | AGP 8x, AGP 4x, PCI | 270 | 166, 200 | 4:2:4:4 | 1,080 | 1,080 | 1,080 | 135.0 | 64, 128, 256 | 5.3, 6.4 | DDR | 128 | 16.2 | ? |
| GeForce FX 5600 XT | Oct 2003 | NV31 | TSMC 130 nm | 80[23] | 121 | AGP 8x | 235 | 200 | 4:2:4:4 | 940 | 940 | 940 | 117.5 | 64, 128 | 3.2, 6.4 | DDR | 64, 128 | 14.1 | ? |
| GeForce FX 5600 | Mar 2003 | NV31 | TSMC 130 nm | 80 | 121 | AGP 8x, PCI | 325 | 275 | 4:2:4:4 | 1,300 | 1,300 | 1,300 | 162.5 | 64, 128, 256[24] | 8.8 | DDR | 128 | 19.5 | 25 |
| GeForce FX 5600 Ultra | 6 Mar 2003 | NV31 | TSMC 130 nm | 80 | 121 | AGP 8x | 350 | 350 | 4:2:4:4 | 1,400 | 1,400 | 1,400 | 175.0 | 64, 128 | 11.2 | DDR | 128 | 21.0 | 27 |
| GeForce FX 5600 Ultra Rev.2 | 6 Mar 2003 | NV31 | TSMC 130 nm | 80 | 121 | AGP 8x | 400 | 400 | 4:2:4:4 | 1,600 | 1,600 | 1,600 | 200.0 | 64, 128 | 12.8 | DDR | 128 | 24.0 | 31 |
| GeForce FX 5700 VE | Sep 2004 | NV36 | TSMC 130 nm | 82[25] | 133 | AGP 8x | 250 | 200 | 4:3:4:4 | 1,000 | 1,000 | 1,000 | 187.5 | 128, 256 | 3.2, 6.4 | DDR | 64, 128 | 17.5 | 20 |
| GeForce FX 5700 LE | Mar 2004 | NV36 | TSMC 130 nm | 82 | 133 | AGP 8x, PCI | 250 | 200 | 4:3:4:4 | 1,000 | 1,000 | 1,000 | 187.5 | 128, 256 | 3.2, 6.4 | DDR | 64, 128 | 17.5 | 21 |
| GeForce FX 5700 | 2003 | NV36 | TSMC 130 nm | 82 | 133 | AGP 8x | 425 | 250 | 4:3:4:4 | 1,700 | 1,700 | 1,700 | 318.7 | 128, 256 | 8.0 | DDR | 128 | 29.7 | 20 |
| GeForce PCX 5750 | 17 Mar 2004 | NV36 | TSMC 130 nm | 82 | 133 | PCIe x16 | 425 | 250 | 4:3:4:4 | 1,700 | 1,700 | 1,700 | 318.7 | 128 | 8.0 | DDR | 128 | 29.7 | 25 |
| GeForce FX 5700 Ultra | 23 Oct 2003 | NV36 | TSMC 130 nm | 82 | 133 | AGP 8x | 475 | 453 | 4:3:4:4 | 1,900 | 1,900 | 1,900 | 356.2 | 128, 256 | 14.4 | GDDR2 | 128 | 33.2 | 43 |
| GeForce FX 5700 Ultra GDDR3 | 15 Mar 2004 | NV36 | TSMC 130 nm | 82 | 133 | AGP 8x | 475 | 475 | 4:3:4:4 | 1,900 | 1,900 | 1,900 | 356.2 | 128, 256 | 15.2 | GDDR3 | 128 | 33.2 | 38 |
| GeForce FX 5800 | 27 Jan 2003 | NV30 | TSMC 130 nm | 125[26] | 199 | AGP 8x | 400 | 400 | 4:2:8:4 | 1,600 | 1,600 | 3,200 | 300.0 | 128 | 12.8 | GDDR2 | 128 | 24.0 | 55 |
| GeForce FX 5800 Ultra | 27 Jan 2003 | NV30 | TSMC 130 nm | 125 | 199 | AGP 8x | 500 | 500 | 4:2:8:4 | 2,000 | 2,000 | 4,000 | 375.0 | 128 | 16.0 | GDDR2 | 128 | 30.0 | 66 |
| GeForce FX 5900 ZT | 15 Dec 2003 | NV35 | TSMC 130 nm | 135[27] | 207 | AGP 8x | 325 | 350 | 4:3:8:4 | 1,300 | 1,300 | 2,600 | 243.7 | 128 | 22.4 | DDR | 256 | 22.7 | ? |
| GeForce FX 5900 XT | 15 Dec 2003[28] | NV35 | TSMC 130 nm | 135 | 207 | AGP 8x | 390 | 350 | 4:3:8:4 | 1,600 | 1,600 | 3,200 | 300.0 | 128 | 22.4 | DDR | 256 | 27.3 | 48 |
| GeForce FX 5900 | May 2003 | NV35 | TSMC 130 nm | 135 | 207 | AGP 8x | 400 | 425 | 4:3:8:4 | 1,600 | 1,600 | 3,200 | 300.0 | 128 | 27.2 | DDR | 256 | 28.0 | 55 |
| GeForce FX 5900 Ultra | 12 May 2003 | NV35 | TSMC 130 nm | 135 | 207 | AGP 8x | 450 | 425 | 4:3:8:4 | 1,800 | 1,800 | 3,600 | 337.5 | 128, 256 | 27.2 | DDR | 256 | 31.5 | 65 |
| GeForce PCX 5900 | 17 Mar 2004 | NV35 | TSMC 130 nm | 135 | 207 | PCIe x16 | 350 | 275 | 4:3:8:4 | 1,400 | 1,400 | 2,800 | 262.5 | 128, 256 | 17.6 | DDR | 256 | 24.5 | 49 |
| GeForce FX 5950 Ultra | 23 Oct 2003 | NV38 | TSMC 130 nm | 135[29] | 207 | AGP 8x | 475 | 475 | 4:3:8:4 | 1,900 | 1,900 | 3,800 | 356.2 | 256 | 30.4 | DDR | 256 | 33.2 | 83 |
| GeForce PCX 5950 | 17 Feb 2004 | NV38 | TSMC 130 nm | 135 | 207 | PCIe x16 | 475 | 425 | 4:3:8:4 | 1,900 | 1,900 | 3,800 | 356.2 | 256 | 27.2 | GDDR3 | 256 | 33.2 | 83 |

- [a] Core config: pixel shaders : vertex shaders : texture mapping units : render output units
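The fillrate columns above follow directly from the core configuration and the core clock: pixel fillrate is pixel pipelines × clock and texel fillrate is TMUs × clock. A minimal Python sketch of that arithmetic, with an illustrative function name and figures taken from the table:

```python
# Fillrate arithmetic for the table above: units x core clock (MHz -> M/s).
def fillrates_mps(core_clock_mhz, pixel_pipes, tmus):
    """Return (MPixels/s, MTexels/s) for a given core config and clock."""
    return pixel_pipes * core_clock_mhz, tmus * core_clock_mhz

print(fillrates_mps(500, 4, 8))  # FX 5800 Ultra (4:2:8:4) -> (2000, 4000)
print(fillrates_mps(325, 4, 4))  # FX 5200 Ultra (4:2:4:4) -> (1300, 1300)
```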
GeForce FX Go 5 (Go 5xxx) series
The GeForce FX Go 5 series is the mobile implementation of the GeForce FX architecture for notebooks.
- 1 Core config: pixel shaders : vertex shaders : texture mapping units : render output units
- * The GeForce FX series runs vertex shaders in an array
- ** The GeForce FX series has limited OpenGL 2.1 support (with 175.19, the last Windows XP driver released for it).
| Model | Launch | Code name | Fab (nm) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config1 | Pixel fillrate (GP/s) | Texture fillrate (GT/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | Direct3D | OpenGL (hardware) | OpenGL (drivers) | TDP (Watts) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce FX Go 5100* | Mar 2003 | NV34M | 150 | AGP 8x | 200 | 400 | 4:2:4:4 | 0.8 | 0.8 | 64 | 3.2 | DDR | 64 | 9.0 | 1.5 | 2.1** | Unknown |
| GeForce FX Go 5500* | Mar 2003 | NV34M | 150 | AGP 8x | 300 | 600 | 4:2:4:4 | 1.2 | 1.2 | 32, 64 | 9.6 | DDR | 128 | 9.0 | 1.5 | 2.1** | Unknown |
| GeForce FX Go 5600* | Mar 2003 | NV31M | 130 | AGP 8x | 350 | 600 | 4:2:4:4 | 1.4 | 1.4 | 32 | 9.6 | DDR | 128 | 9.0 | 1.5 | 2.1** | Unknown |
| GeForce FX Go 5650* | Mar 2003 | NV31M | 130 | AGP 8x | 350 | 600 | 4:2:4:4 | 1.4 | 1.4 | 32 | 9.6 | DDR | 128 | 9.0 | 1.5 | 2.1** | Unknown |
| GeForce FX Go 5700* | 1 Feb 2005 | NV36M | 130 | AGP 8x | 450 | 550 | 4:3:4:4 | 1.8 | 1.8 | 32 | 8.8 | DDR | 128 | 9.0 | 1.5 | 2.1** | Unknown |
Support
Nvidia has ceased driver support for the GeForce FX series.
Final drivers
- Windows 9x & Windows Me: 81.98, released on December 21, 2005.
  - Product Support List Windows 95/98/Me – 81.98.
  - Driver version 81.98 was the last driver Nvidia ever released for Windows 9x/Me; no official releases were made for these systems afterwards.
- Windows 2000, 32-bit Windows XP & Media Center Edition: 175.19, released on June 23, 2008.
  - Note that the 175.19 driver is known to break Windows Remote Desktop (RDP).[30] The last version before the problem is 174.74. This was apparently fixed in 177.83; however, that version is not available for the GeForce FX series of graphics cards.[31] Note also that 163.75 is the last known good driver that correctly handles the adjustment of video overlay color properties for the GeForce FX series; subsequent WHQL drivers either do not handle the whole range of possible video overlay adjustments (169.21) or have no effect on them (175.xx).
- Windows XP (32-bit): 175.40, released on August 1, 2008.
- Windows Vista (32-bit): 96.85, released on October 17, 2006.
- Windows Vista (64-bit): 97.34, released on November 21, 2006.
- Linux/BSD/Solaris: 169.12, released on February 26, 2008.
- Also available: 177.67 (beta), released on August 19, 2008.
The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.
- Windows 95/98/Me Driver Archive
- Windows XP/2000 Driver Archive
- Unix Driver Archive
References
- ^ a b c d Lal Shimpi, Anand (November 18, 2002). "Nvidia Introduces GeForce FX (NV30)". Anandtech. Retrieved August 25, 2010.
- ^ a b Barkovoi, Aleksei; Vorobiev, Andrey (2003). "Nvidia GeForce FX 5900 Ultra 256MB Video Card Review". X-bit labs. Retrieved August 25, 2010.
- ^ a b "Video Processing Engine". NVIDIA. Retrieved August 25, 2010.
- ^ Wasson, Scott (April 7, 2003). "NVIDIA's GeForce FX 5800 Ultra GPU". Tech Report. Retrieved June 14, 2008.
- ^ From Voodoo to GeForce: The Awesome History of 3D Graphics
- ^ "Dawn Demo". NVIDIA. Retrieved August 25, 2010.
- ^ "Cinematic Computing For Every User" (PDF). NVIDIA. Archived from the original (PDF) on July 14, 2011. Retrieved August 25, 2010.
- ^ Ferret, Wily (May 4, 2007). "Post-Nvidia man writes in". The Inquirer. Archived from the original on September 22, 2007. Retrieved June 14, 2008.
- ^ Lal Shimpi, Anand (January 27, 2003). "Nvidia GeForce FX 5800 Ultra: It's Here, but is it Good?". Anandtech. Retrieved August 25, 2010.
- ^ Cross, Jason. Benchmarking Half-Life 2: ATI vs. NVIDIA Archived December 14, 2005, at the Wayback Machine, ExtremeTech, November 29, 2004.
- ^ a b c Demirug. CineFX (NV30) Inside, 3DCenter, August 31, 2003.
- ^ Gasior, Geoff (May 6, 2003). "Nvidia's GeForce FX 5600 GPU". Tech Report. Archived from the original on May 29, 2008. Retrieved June 14, 2008.
- ^ Gasior, Geoff (April 29, 2003). "Nvidia's GeForce FX 5200 GPU". Tech Report. Retrieved June 14, 2008.
- ^ a b c Bell, Brandon (June 20, 2003). "eVGA e-GeForce FX 5900 Ultra Review". FiringSquad. Archived from the original on September 29, 2007. Retrieved June 14, 2008.
- ^ Gasior, Geoff (October 23, 2003). "Nvidia's GeForce FX 5700 Ultra GPU". Tech Report. Archived from the original on June 15, 2008. Retrieved June 14, 2008.
- ^ Hagedoorn, Hilbert (October 23, 2003). "GeForce FX 5700 Ultra & 5950 Ultra Review". Guru3D. Archived from the original on August 20, 2007. Retrieved June 14, 2008.
- ^ Gasior, Geoff (December 15, 2003). "NVIDIA's GeForce FX 5900 XT GPU". Tech Report. Retrieved June 14, 2008.
- ^ Timofeeva, Anna (April 8, 2004). "Gigabyte GeForce PCX 5900 Video Card Review". Digital-Daily. Retrieved August 25, 2010.
- ^ Hagedoorn, Hilbert (March 9, 2004). "PoV GeForce FX 5500 Review". Digital-Daily. Archived from the original on August 9, 2010. Retrieved August 25, 2010.
- ^ a b "3D accelerator database". Vintage 3D. Archived from the original on October 23, 2018. Retrieved August 30, 2024.
- ^ "NVIDIA NV34 GPU Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce FX 5500 PCI Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce FX 5600 XT PCI Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "GeForce FX 5600". Nvidia.com. Archived from the original on September 25, 2015. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce FX 5700 LE Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce FX 5800 Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce FX 5900 Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA and Activision Launch "Call of Duty" Bundle with GeForce FX 5900 Graphics Cards". Archived from the original on April 1, 2017. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce FX 5950 Ultra Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ User forum complaints about v175.19 driver breaking RDP
- ^ AnandTech forum post regarding RDP issue
External links
- Nvidia: Cinematic Computing for Every User
- ForceWare 81.98 drivers, Final Windows 9x/ME driver release
- Geforce 175.19 drivers, Final Windows XP driver release
- Museum of Interesting Tech article: picture and specifications for the FX 5800
- Driver Downloads
- laptopvideo2go.com: an archive of drivers and modified .INF files for the GeForce FX series
- techPowerUp! GPU Database
Introduction and Development
Background and Announcement
The GeForce FX series represented NVIDIA's strategic response to ATI Technologies' Radeon 9700 series, which had launched in August 2002 as the first graphics card to fully support Microsoft DirectX 9's programmable shader model, moving away from fixed-function pipelines toward more flexible, developer-controlled rendering.[7] To regain market leadership in high-end graphics, NVIDIA developed the NV3x architecture to deliver comparable programmable vertex and pixel shaders, enabling cinematic effects like advanced lighting and procedural textures that were previously limited by hardware constraints. This shift aimed to empower game developers with tools for more realistic visuals, directly challenging ATI's early dominance in the DirectX 9 era.[8]

NVIDIA officially announced the GeForce FX series on November 18, 2002, at the Comdex trade show in Las Vegas, positioning it as a breakthrough for "cinematic computing" in PC gaming.[8] The event garnered support from major game studios including Electronic Arts, id Software, and Epic Games, who praised the series' potential to elevate game visuals to Hollywood standards. NVIDIA executive Bill Rehbock emphasized that the GeForce FX would "unleash no-holds-barred creative talent" with its CineFX engine and Cg shading language, targeting enthusiast gamers seeking immersive, high-fidelity experiences.[9]

Development of the GeForce FX, codenamed NV30, faced significant challenges during the transition from the NV2x architecture of the GeForce4 series, including issues with TSMC's 0.13-micron manufacturing process and the integration of low-k dielectric materials that were later abandoned for a standard process. Rumors of the chip circulated throughout 2002, with NVIDIA initially promising a fall launch to counter ATI's momentum, but persistent yield problems and architectural refinements delayed availability to early 2003. These setbacks stemmed from the ambitious goal of balancing high transistor counts (125 million in the flagship NV30) with power efficiency and thermal management in a competitive landscape.[8]

The initial pricing strategy focused on the premium segment, with the high-end GeForce FX 5800 targeted at around $400 to appeal to performance-oriented enthusiasts willing to invest in cutting-edge technology.[8] This positioned the series as an aspirational upgrade for gamers prioritizing future-proof features over immediate value, aligning with NVIDIA's emphasis on innovation for the core PC gaming community.

Launch Timeline and Initial Market Position
The GeForce FX series marked NVIDIA's entry into the DirectX 9 era, with the flagship GeForce FX 5800 Ultra launching on March 6, 2003, at a recommended price of $499. This initial high-end model was followed by the GeForce FX 5900 series in May 2003, priced at around $399 for the base variant, and lower-tier options such as the FX 5200 (March 2003) and FX 5600 (April 2003) to broaden market accessibility. These releases came nearly a year after ATI's Radeon 9700 (R300) debut in August 2002, positioning NVIDIA as a late challenger in the high-performance discrete GPU segment.[10][11][12]

Upon launch, the FX series faced intense competition from ATI's maturing R300 lineup, including the Radeon 9800 PRO released in March 2003, which often outperformed the FX 5800 Ultra in benchmarks while consuming less power. NVIDIA aimed to reclaim market leadership by emphasizing the FX's programmable shader capabilities for emerging DirectX 9 titles, but ATI captured a slight majority of the discrete GPU market share by Q4 2003. Meanwhile, Intel's push into integrated graphics via chipsets like the 865 series with Extreme Graphics captured 67% of the desktop integrated segment by late 2003, pressuring discrete vendors at the low end by bundling basic visuals into affordable OEM systems.[13][14]

Early reception was marred by controversies, including reports of excessive heat generation and fan noise on the FX 5800 Ultra, which drew around 85 W and required aggressive cooling that reviewers described as "jet engine"-like. Driver instability further compounded issues, with initial ForceWare releases exhibiting precision errors in pixel shaders, leading to suboptimal DirectX 9 performance and artifacts in games. NVIDIA responded by quickly revising the reference cooler and issuing driver updates, but these problems contributed to the FX 5800's short shelf life, with production halted by mid-2003 in favor of the improved FX 5900.[15][16]

Sales for the FX series in 2003-2004 reflected a mixed initial position, with NVIDIA reporting total revenue of $1.82 billion in fiscal 2004 (ending January 2004), driven partly by FX adoption but tempered by competitive losses. The series solidified NVIDIA's role as a flagship provider for DirectX 9 gaming, notably supporting id Software's Doom 3 upon its 2004 release, where FX cards delivered playable frame rates with advanced shading effects that highlighted their pixel shader strengths. Despite early hurdles, the lineup helped NVIDIA maintain a strong presence in enthusiast segments, shipping millions of units amid the transition to shader-heavy titles.[17]

Technical Architecture
Core Design and Specifications
The GeForce FX series utilizes NVIDIA's NV3x graphics processing unit family, with the NV30 serving as the foundational core fabricated on TSMC's 0.13-micron (130 nm) process technology. This chip integrates 125 million transistors across a die area of 199 mm², enabling a balance of performance and efficiency for DirectX 9-compliant rendering. Subsequent variants like the NV35 refined this design while maintaining core architectural principles, with later cores such as NV36 (for mid-range) and NV38 (for high-end with 256-bit memory) featuring similar transistor counts around 130 million.[2][18][19]

Core clock speeds across the NV3x lineup range from approximately 250 MHz in entry-level implementations to 500 MHz in high-end configurations, such as the GeForce FX 5800 Ultra, allowing for scalable performance tailored to different market segments. Memory interfaces vary from 64-bit to 256-bit across models, paired with DDR or GDDR2 (DDR-II) SDRAM, supporting capacities up to 256 MB and delivering bandwidth ranging from about 3.2 GB/s to 30.4 GB/s based on memory clock rates up to 1000 MHz effective.[20][19][21]

The pipeline architecture consists of 4 pixel pipelines equipped with 8 texture mapping units and 4 render output units, capable of processing up to 8 pixels and 8 texture samples per clock cycle under optimal conditions, though complex operations like multi-texturing may limit effective throughput to 4 textures per cycle. Integrated shader hardware includes 4 pixel shader units and 3 vertex shader units, providing foundational support for programmable shading effects.[22][23]

High-end NV3x cores demand significant power, with thermal design power around 74 W in top models, driving the adoption of advanced thermal management. NVIDIA's FlowFX cooling innovations, featuring copper-based heatsinks, integrated heat pipes, and ducted fan assemblies, address these demands by enhancing airflow and heat dissipation across dual-slot cooler designs.[19][23]
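The pipeline throughput described above can be sanity-checked with the same per-clock arithmetic as the desktop table, including the doubled z/stencil-only fill rate noted in the overview earlier in this article. A minimal Python sketch (the function name is illustrative, not an NVIDIA tool):

```python
# Peak pixel output per second: pipelines x clock, doubled when only
# z/stencil is written (no color), as on some GeForce FX parts.
def pixel_rates_mpixels(core_clock_mhz, pixel_pipes):
    color_pass = pixel_pipes * core_clock_mhz       # color + Z rendering
    z_only_pass = 2 * pixel_pipes * core_clock_mhz  # Z/stencil-only passes
    return color_pass, z_only_pass

print(pixel_rates_mpixels(500, 4))  # NV30 at 500 MHz -> (2000, 4000)
```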
Shader Pipeline and Rendering Features

The GeForce FX series introduced NVIDIA's CinéFX 2.0 engine, which provided comprehensive support for DirectX 9's vertex shader 2.0 and pixel shader 2.0 models, enabling developers to create more sophisticated programmable shading effects. This architecture allowed for up to 65,536 instructions in vertex shaders and 1,024 instructions in pixel shaders, facilitating complex transformations and per-pixel lighting calculations that were previously limited in earlier generations.[24] A key innovation was the native support for floating-point precision, including FP16 for high-speed operations and FP32 for higher accuracy, which enabled cinematic effects such as high dynamic range (HDR) rendering and realistic material simulations without precision artifacts in most scenarios.

The shader pipeline in the GeForce FX series incorporated dynamic branching capabilities, particularly in vertex shaders, allowing conditional execution and loops based on per-vertex data to optimize processing for varied geometry. Pixel shaders supported conditional execution through predicates, though full dynamic branching was more constrained compared to subsequent architectures.[24] These features extended to advanced texturing techniques, including support for bump mapping via dot-product 3 (DOT3) operations and higher-order methods in programmable shaders, enhancing surface detail without additional polygon geometry. Additionally, the series fully supported Microsoft's High-Level Shading Language (HLSL) under DirectX 9, permitting developers to write shaders in a C-like syntax that compiled to the underlying assembly, streamlining development for effects like procedural textures and dynamic lighting.

Rendering enhancements in the GeForce FX series included IntelliSample technology, NVIDIA's proprietary approach to antialiasing and texture filtering, which improved image quality by reducing jagged edges and aliasing in transparent elements.[25] This implementation supported multisample antialiasing modes up to 8x, along with transparency antialiasing for alpha-blended objects like foliage and particles, delivering smoother visuals in real-time applications.

While effective for its era, the GeForce FX's handling of complex pixel shaders proved inefficient relative to later architectures, as its scalar pipeline often processed instructions at full FP32 precision even when FP16 sufficed, leading to bottlenecks in shader-intensive workloads.[26] This limitation stemmed from the design's emphasis on cinematic precision over optimized throughput for branching-heavy or long-instruction shaders.

Product Models
Desktop Variants
The GeForce FX series for desktop systems encompassed a range of graphics processing units (GPUs) targeted at high-end, mid-range, and entry-level markets, all built on derivatives of the NV30 architecture introduced in 2003.[5] These models utilized advanced features like pixel shader 2.0 support and the CineFX engine for DirectX 9 compatibility, with variations in core clock speeds, memory configurations, and manufacturing processes to address different performance segments.[27]

High-End Models
The flagship desktop variant was the GeForce FX 5950 Ultra, released in October 2003 and based on the NV38 core fabricated at 130 nm. It featured a core clock of 475 MHz, 256 MB of DDR memory clocked at 475 MHz (effective 950 MHz) across a 256-bit interface, 8 pixel shaders, 3 vertex shaders, 8 texture mapping units (TMUs), and 4 render output units (ROPs), with a transistor count of 135 million.[19][28] This model was positioned as NVIDIA's top-tier offering for enthusiasts, emphasizing improved yields and thermal efficiency over earlier high-end FX cards.[29]

Earlier high-end models included the GeForce FX 5900 Ultra and FX 5900 XT, both using the 130 nm NV35 core, a revision of the original NV30 design. The FX 5900 Ultra, launched in May 2003, operated at a 450 MHz core clock with 128 MB or 256 MB DDR memory at 425 MHz (850 MHz effective) on a 256-bit bus, incorporating 8 pixel shaders, 3 vertex shaders, 8 TMUs, and 4 ROPs with 135 million transistors.[30][31] The FX 5900 XT, introduced in January 2004, ran at a slightly lower 400 MHz core clock with 128 MB DDR at 350 MHz (700 MHz effective) on the same 256-bit bus, incorporating 8 pixel shaders, 3 vertex shaders, 8 TMUs, and 4 ROPs, targeting value-oriented high-end users while maintaining full FX feature parity.[32][33] These refreshes improved production efficiency and reduced power draw compared to the original NV30-based FX 5800 Ultra.[34]

Mid-Range Models
The mid-range lineup centered on the GeForce FX 5700 series, powered by the NV36 core at 130 nm and released starting in August 2003. The FX 5700 Ultra variant featured a 475 MHz core clock, 128 MB GDDR2 memory at 450 MHz (900 MHz effective) via a 128-bit interface, 4 pixel shaders, 3 vertex shaders, 4 TMUs, and 4 ROPs, with 82 million transistors.[35][36] The GeForce FX 5600, launched in March 2003 on the NV31 core at 130 nm, offered a 350 MHz core clock with 128 or 256 MB DDR at 500 MHz (1000 MHz effective) on a 128-bit bus, 4 pixel shaders, 1 vertex shader, 4 TMUs, and 4 ROPs, with 80 million transistors, serving as an upper entry-level option.

Lower-tier options like the FX 5700 and FX 5700 LE adjusted clocks downward for cost efficiency (the standard FX 5700 at around 400 MHz core and the LE at 250-300 MHz core with DDR memory options up to 256 MB) while retaining the core's shader capabilities for mainstream gaming.[37] This series balanced performance and affordability, often bundled with 128 MB or 256 MB configurations to support resolutions up to 2048x1536.[38]

Entry-Level Models
Entry-level desktop variants included the GeForce FX 5200 and FX 5500, designed for budget systems and basic multimedia tasks. The FX 5200, launched in March 2003 on the NV34 core at 150 nm, had a 250 MHz core clock, 64 MB DDR memory at 200 MHz (400 MHz effective) on a 128-bit bus (with some 64-bit variants), 4 pixel shaders, 1 vertex shader, 4 TMUs, and 4 ROPs, totaling 45 million transistors.[21] The FX 5500, introduced in 2004 using the NV34 core also at 150 nm, offered a modest upgrade with a 270 MHz core clock, 64-128 MB DDR at 200 MHz on a 128-bit bus, and the same shader/TMU/ROP configuration with the same 45 million transistors, delivering slightly better efficiency in low-end applications.[39][21] These models prioritized compatibility with AGP interfaces and supported DirectX 9 features without the power demands of higher tiers.[40]

| Model | Core | Process | Core Clock | Memory | Bus Width | Release Date |
|---|---|---|---|---|---|---|
| FX 5950 Ultra | NV38 | 130 nm | 475 MHz | 256 MB DDR (950 MHz eff.) | 256-bit | Oct 2003 |
| FX 5900 Ultra | NV35 | 130 nm | 450 MHz | 128/256 MB DDR (850 MHz eff.) | 256-bit | May 2003 |
| FX 5900 XT | NV35 | 130 nm | 400 MHz | 128 MB DDR (700 MHz eff.) | 256-bit | Jan 2004 |
| FX 5700 Ultra | NV36 | 130 nm | 475 MHz | 128 MB GDDR2 (900 MHz eff.) | 128-bit | Aug 2003 |
| FX 5700 LE | NV36 | 130 nm | 250-300 MHz | 128/256 MB DDR | 128-bit | 2004 |
| FX 5200 | NV34 | 150 nm | 250 MHz | 64 MB DDR (400 MHz eff.) | 128-bit | Mar 2003 |
| FX 5500 | NV34 | 150 nm | 270 MHz | 64-128 MB DDR (400 MHz eff.) | 128-bit | 2004 |