GeForce 4 series
| Release date | February 6, 2002 |
|---|---|
| Codename | NV17, NV18, NV19, NV25, NV28 |
| Architecture | Kelvin |
| Cards | |
| Entry-level | MX |
| Mid-range | Ti 4200, Ti 4400, Ti 4800 SE |
| High-end | Ti 4600, Ti 4800 |
| API support | |
| Direct3D | 7.0 (NV1x); 8.0a (NV2x) with Vertex Shader 1.1 and Pixel Shader 1.3 |
| OpenGL | 1.3 |
| History | |
| Predecessor | GeForce 3 series |
| Successor | GeForce FX series |
| Support status | Unsupported |
The GeForce 4 series (codenames below) refers to the fourth generation of Nvidia's GeForce line of graphics processing units (GPUs). There are two different GeForce4 families: the high-performance Ti family (NV25) and the budget MX family (NV17). The MX family spawned a mostly identical GeForce4 Go (NV17M) family for the laptop market. All three families were announced in early 2002; members within each family were differentiated by core and memory clock speeds. In late 2002, Nvidia attempted to form a fourth family, also for the laptop market; its only member was the GeForce4 4200 Go (NV28M), which was derived from the Ti line.
GeForce4 Ti
Architecture

The GeForce4 Ti (NV25) was launched in February 2002[1] and was a revision of the GeForce 3 (NV20). It was very similar to its predecessor; the main differences were higher core and memory clock rates, a revised memory controller (known as Lightspeed Memory Architecture II/LMA II), Direct3D 8.1 support with up to Pixel Shader 1.3,[2][3][4] an additional vertex shader (the vertex and pixel shaders were now known as nFinite FX Engine II), hardware anti-aliasing (Accuview AA), and DVD playback.[1] Legacy Direct3D 7-class fixed-function T&L was now implemented as vertex shaders.[3] Proper dual-monitor support (TwinView) was also brought over from the GeForce 2 MX.[5] The GeForce 4 Ti was superior to the GeForce 4 MX in virtually every aspect save for higher production cost, although the MX had the Nvidia VPE (video processing engine) which the Ti lacked.
Lineup
The initial two models were the Ti4400 (US$299) and the top-of-the-range Ti4600 (US$399). At the time of their introduction, Nvidia's main products were the entry-level GeForce 2 MX, the midrange GeForce4 MX models (released at the same time as the Ti4400 and Ti4600), and the older but still high-performance GeForce 3 (demoted to the upper mid-range or performance niche).[1] However, ATI's Radeon 8500LE (9100) (a slower-clocked version of the flagship Radeon 8500) was somewhat cheaper than the Ti4400, and outperformed its price competitors, the GeForce 3 Ti200 and GeForce4 MX 460. The GeForce 3 Ti500 (US$299 prior to the GeForce4 release) was quickly rendered obsolete, as it could not be produced cheaply enough to justify a further price drop, though it filled the performance gap between the GeForce 3 Ti200 and the GeForce4 Ti4400.[6]
In consequence, Nvidia rolled out a slightly cheaper model: the Ti4200. Although the 4200 was initially supposed to be part of the launch of the GeForce4 line, Nvidia had delayed its release to sell off the soon-to-be discontinued GeForce 3 Ti500 chips.[7] In an attempt to prevent the Ti4200 from damaging the Ti4400's sales, Nvidia set the Ti4200's memory speed at 222 MHz on the models with a 128 MiB frame buffer, a full 53 MHz slower than the Ti4400 (all of which had 128 MiB frame buffers). Models with a 64 MiB frame buffer were set to 250 MHz memory speed. This tactic did not work, however, for two reasons. Firstly, the Ti4400 was perceived as good enough neither for those who wanted top performance (who preferred the Ti4600) nor for those who wanted good value for money (who typically chose the Ti4200), causing the Ti4400 to fade into obscurity.[8] Secondly, some graphics card makers simply ignored Nvidia's guidelines for the Ti4200 and set the memory speed at 250 MHz on the 128 MiB models anyway.[9]
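Those reference memory clocks map directly onto the peak-bandwidth figures quoted in the model table below. As a quick illustrative check, here is a minimal sketch of the arithmetic in Python (the helper name is ours, not from any Nvidia tool):

```python
def ddr_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak DDR bandwidth: two transfers per clock, bus_width_bits / 8 bytes each."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits / 8) / 1e9

# Nvidia's reference Ti4200 clocks on its 128-bit bus:
print(ddr_bandwidth_gb_s(222, 128))  # ~7.104 GB/s, the 128 MiB Ti4200 figure
print(ddr_bandwidth_gb_s(250, 128))  # ~8.0 GB/s, the 64 MiB Ti4200 figure
```

The 28 MHz clock difference thus cost the 128 MiB boards roughly 0.9 GB/s of theoretical bandwidth, which is why some board makers ignored the guideline.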
Then in late 2002, the NV25 core was replaced by the NV28 core, which differed only by the addition of AGP-8X support. The Ti4200 with AGP-8X support was based on this chip and sold as the Ti4200-8X. When the AGP-8X NV28 core was introduced on the remaining models, the Ti4800SE replaced the Ti4400 and the Ti4800 replaced the Ti4600.[10][11]
The only mobile derivative of the Ti series was the GeForce4 4200 Go (NV28M), launched in late 2002.[12] It offered the same feature set as, and similar performance to, the NV28-based Ti4200, although the mobile variant was clocked lower. It outperformed the Mobility Radeon 9000 by a large margin, and was Nvidia's first DirectX 8 laptop graphics solution. However, because the GPU was not designed for the mobile space, its thermal output was similar to that of the desktop part. The 4200 Go also lacked the power-saving circuitry of the MX-based GeForce4 4x0 Go series and the RV250-based Mobility Radeon 9000. This caused problems for notebook manufacturers, especially with regard to battery life.[13]
Performance
The GeForce4 Ti outperformed the older GeForce 3 by a significant margin.[1] The competing ATI Radeon 8500 was generally faster than the GeForce 3 line, but was overshadowed by the GeForce4 Ti in every area other than price and more advanced pixel shader (1.4) support.[1] Nvidia, however, missed a chance to dominate the upper-range/performance segment by delaying the release of the Ti4200 and by not rolling out 128 MiB models quickly enough; otherwise the Ti4200 was cheaper and faster than the previous top-line GeForce 3 Ti500 and Radeon 8500.[14] Besides the late introduction of the Ti4200, the limited-release 128 MiB models of the GeForce 3 Ti200 proved unimpressive, letting the Radeon 8500LE, and even the full 8500, dominate the upper-range segment for a while.[15] The Matrox Parhelia, despite having several DirectX 9.0 capabilities and other innovative features, was at most competitive with the GeForce 3 and GeForce4 Ti 4200, yet it was priced the same as the Ti 4600 at US$399.
The GeForce 4 Ti4200 enjoyed considerable longevity compared to its higher-clocked peers. Debuting at half the cost of the 4600 (US$199 versus US$399), the 4200 remained the best balance between price and performance until the launch of the short-lived DirectX 9 ATI Radeon 9500 Pro at the end of 2002.[16] The Ti4200 still managed to hold its own against several next generation DirectX 9 compliant GPUs released in late 2003, outperforming the GeForce FX 5200 and the midrange FX 5600, and performing similarly to the mid-range Radeon 9600 Pro (ATI's permanent successor to the Radeon 9500 Pro) in some situations.[17][18][19]
GeForce4 MX
Architecture
Though its lineage lay in the previous-generation GeForce2 (NV11 and NV15), the GeForce4 MX (NV17) did incorporate bandwidth- and fillrate-saving techniques, dual-monitor support, and a multi-sampling anti-aliasing unit from the Ti series (NV25); the improved 128-bit DDR memory controller was crucial to solving the bandwidth limitations that plagued the GeForce 256 (NV10) and GeForce2 lines. It also owed some of its design heritage to Nvidia's high-end CAD products, and in performance-critical non-game applications it was remarkably effective. The most notable example is AutoCAD, in which the GeForce4 MX returned results within a single-digit percentage of GeForce4 Ti cards costing several times as much.
Many criticized the GeForce4 MX name as a misleading marketing ploy, since the chip was less advanced than the preceding GeForce3 (NV20). The features-comparison chart between the Ti and MX lines showed that the only "feature" missing on the MX was the nFiniteFX II engine, the DirectX 8 programmable vertex and pixel shaders.[20] However, the GeForce4 MX was not simply a GeForce4 Ti with the shader hardware removed: even in games that did not use shaders, the MX's performance was considerably behind the Ti's.
In motion-video applications, the GeForce4 MX offered new functionality. It (and not the GeForce4 Ti) was the first GeForce member to feature the Nvidia VPE (video processing engine). It was also the first GeForce to offer hardware-iDCT and VLC (variable length code) decoding, making VPE a major upgrade from Nvidia's previous HDVP. In the application of MPEG-2 playback, VPE could finally compete head-to-head with ATI's video engine.
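The hardware-iDCT stage offloads the 8x8 inverse discrete cosine transform that an MPEG-2 decoder must otherwise run on the CPU for every coded block of every frame. As an illustrative sketch of that transform (using SciPy's orthonormal iDCT as a software stand-in for the fixed-function hardware; the function name is ours):

```python
import numpy as np
from scipy.fftpack import idct

def idct_8x8(coeffs: np.ndarray) -> np.ndarray:
    """2-D inverse DCT of one 8x8 MPEG-2 coefficient block: column pass, then row pass."""
    return idct(idct(coeffs, axis=0, norm='ortho'), axis=1, norm='ortho')

block = np.zeros((8, 8))
block[0, 0] = 8.0            # a DC-only coefficient block...
print(idct_8x8(block))       # ...decodes to a flat 8x8 patch of 1.0s
```

A software decoder repeats this per block, thousands of times per frame; moving it (along with VLC decoding and motion compensation) into fixed-function hardware is what let VPE cut CPU usage during playback.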
Lineup
There were three initial models: the MX420, the MX440 and the MX460. The MX420 had only Single Data Rate (SDR) memory and was designed for very low-end PCs, replacing the GeForce2 MX100 and MX200. The GeForce4 MX440 was a mass-market OEM solution, replacing the GeForce2 MX/MX400 and GeForce2 Ti. The GeForce4 MX460 was initially meant to slot in between the MX440 and the Ti4400, while the release of the Ti4200 was held back.
In terms of 3D performance, the MX420 performed only slightly better than the GeForce2 MX400 and below the GeForce2 GTS. However, this was never really much of a problem, considering its target audience. The nearest thing to a direct competitor the MX420 had was ATI's Radeon 7000. In practice its main competitors were chipset-integrated graphics solutions, such as Intel's 845G and Nvidia's own nForce 2, but its main advantage over those was multiple-monitor support; Intel's solutions did not have this at all, and the nForce 2's multi-monitor support was much inferior to what the MX series offered.

The MX440 performed reasonably well for its intended audience, outperforming its closest competitor, the ATI Radeon 7500, as well as the discontinued GeForce2 Ti. Despite harsh criticism from gaming enthusiasts, the MX440 was successful in the PC OEM market as a replacement for the GeForce2 MX. Priced about 30% above the GeForce2 MX, it provided better performance and the ability to play a number of popular games that the GeForce2 could not run well; above all, to the average non-specialist it sounded as if it were a "real" GeForce4, i.e., a GeForce4 Ti. While John Carmack initially warned gamers not to purchase the GeForce4 MX440, its somewhat widespread adoption compelled id Software to make it the only DirectX 7.0 GPU supported by Doom 3.[21][22] When ATI launched its Radeon 9000 Pro in September 2002, it performed about the same as the MX440 but had crucial advantages in better single-texturing performance and proper support for DirectX 8 shaders. However, the 9000 was unable to break the MX440's entrenched hold on the OEM market. Nvidia's eventual answer to the Radeon 9000 was the GeForce FX 5200, but despite the 5200's DirectX 9 features it offered no significant performance increase over the MX440, even in DirectX 7.0 games. This kept the MX440 in production while the 5200 was discontinued.
One of the fastest DirectX 7.0-compliant GPUs, the MX460 performed similarly to the discontinued GeForce 2 Ultra and the existing GeForce3 Ti200 (the remaining available member of the GeForce 3 family). However, ATI released the Radeon 8500LE, which outperformed its price competitors, the GeForce 3 Ti200 and GeForce4 MX 460. ATI's move in turn compelled Nvidia to roll out the Ti4200 earlier than planned, also at a similar price to the MX 460, and to discontinue the Ti200 soon afterwards. The Ti200, 8500LE, and Ti4200 were all DirectX 8.0 compliant and priced similarly to the MX460, and the 8500LE and Ti4200 also provided significantly better performance; this prevented the MX460 from ever becoming popular compared to the other GeForce 4 MX releases.
The GeForce4 Go was derived from the MX line and was announced along with the rest of the GeForce4 Ti and MX lineup in early 2002. The lineup consisted of the 420 Go, 440 Go, and 460 Go. However, ATI had beaten Nvidia to market with the similarly performing Mobility Radeon 7500, and later the DirectX 8.0-compliant Mobility Radeon 9000. (Despite its name, the short-lived 4200 Go (NV28M) is not part of this lineup; it was instead derived from the Ti line.)
Like the Ti series, the MX was also updated in late 2002 to support AGP-8X with the NV18 core. The two new models were the MX440-8X, which was clocked slightly faster than the original MX440, and the MX440SE, which had a narrower memory bus, and was intended as a replacement of sorts for the MX420. The MX460, which had been discontinued by this point, was never replaced. Another variant followed in late 2003—the MX 4000, which was a GeForce4 MX440SE with a slightly higher memory clock.
The GeForce4 MX line received a third and final update in 2004 with the PCX 4300, which was functionally equivalent to the MX 4000 but with support for PCI Express. In spite of its new codename (NV19), the PCX 4300 was in fact simply an NV18 core with a BR02 chip bridging the NV18's native AGP interface to the PCI Express bus.
GeForce4 model information
| Model | Launch | Code name | Transistors (million) | Die size (mm²) | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOps/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | Direct3D | OpenGL | Processing power (GFLOPS) | TDP (W) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce4 MX IGP + nForce2 | October 1, 2002 | NV1F | ? | ? | FSB | 250 | 133, 200 | 2:0:4:2 | 500 | 500 | 1,000 | 125 | Up to 128 (system RAM) | 2.128, 6.4 | DDR | 64, 128 | 7.0 | 1.2 | 1.000 | ? |
| GeForce4 MX420 | February 6, 2002 | NV17 | 29[23] | 65 | AGP 4x, PCI | 250 | 166 | 2:0:4:2 | 500 | 500 | 1,000 | 125 | 64 | 2.656 | SDR, DDR | 128 (SDR), 64 (DDR) | 7.0 | 1.2 | 1.000 | 14 |
| GeForce4 MX440 SE | 2002 | NV17 | 29 | 65 | AGP 4x, PCI | 250 | 133, 166[24] | 2:0:4:2 | 500 | 500[24] | 1,000 | 125 | 64, 128 | 2.128, 5.312[24] | DDR | 64, 128[24] | 7.0 | 1.2 | 1.000 | 13 |
| GeForce MX4000 | December 14, 2003 | NV18B | 29 | 65 | AGP 8x, PCI | 250 | 166 | 2:0:4:2 | 500 | 500 | 1,000 | 125 | 64, 128 | 2.656 | DDR | 64 | 7.0 | 1.2 | 1.000 | 14 |
| GeForce PCX4300 | February 19, 2004 | NV19 | 29 | 65 | PCIe x16 | 250 | 166 | 2:0:4:2 | 500 | 500 | 1,000 | 125 | 128 | 2.656 | DDR | 64 | 7.0 | 1.2 | 1.000 | 16 |
| GeForce4 MX440 | February 6, 2002 | NV17 | 29 | 65 | AGP 4x, PCI | 275 | 200 | 2:0:4:2 | 550 | 550 | 1,100 | 137.5 | 64, 128 | 6.4 | DDR | 128 | 7.0 | 1.2 | 1.100 | 18 |
| GeForce4 MX440 8x | September 25, 2002 | NV18 | 29[25] | 65 | AGP 8x, PCI | 275 | 166, 250 | 2:0:4:2 | 550 | 550 | 1,100 | 137.5 | 64, 128 | 2.656[26], 8.0 | DDR | 64, 128 | 7.0 | 1.2 | 1.100 | 19 |
| GeForce4 MX460 | February 6, 2002 | NV17 | 29 | 65 | AGP 4x, PCI | 300 | 275 | 2:0:4:2 | 600 | 600 | 1,200 | 150 | 64, 128 | 8.8 | DDR | 128 | 7.0 | 1.2 | 1.200 | 22 |
| GeForce4 Ti4200 | April 16, 2002 | NV25 | 63[27] | 142 | AGP 4x | 250 | 222 (128 MB), 250 (64 MB) | 4:2:8:4 | 1,000 | 1,000 | 2,000 | 125 | 64, 128 | 7.104 (128 MB), 8.0 (64 MB) | DDR | 128 | 8.1 | 1.3 | 15.00 | 33 |
| GeForce4 Ti4200 8x | September 25, 2002 | NV28 | 63[28] | 101[29] | AGP 8x | 250 | 250 | 4:2:8:4 | 1,000 | 1,000 | 2,000 | 125 | 64, 128 | 8.0 | DDR | 128 | 8.1 | 1.3 | 15.00 | 34 |
| GeForce4 Ti4400 | February 6, 2002 | NV25 | 63 | 142 | AGP 4x | 275 | 275 | 4:2:8:4 | 1,100 | 1,100 | 2,200 | 137.5 | 128 | 8.8 | DDR | 128 | 8.1 | 1.3 | 16.50 | 37 |
| GeForce4 Ti4400 8x (Ti4800SE[b]) | January 20, 2003 | NV28 | 63 | 101 | AGP 8x | 275 | 275 | 4:2:8:4 | 1,100 | 1,100 | 2,200 | 137.5 | 128 | 8.8 | DDR | 128 | 8.1 | 1.3 | 16.50 | 38 |
| GeForce4 Ti4600 | February 6, 2002 | NV25 | 63 | 142 | AGP 4x | 300 | 325 | 4:2:8:4 | 1,200 | 1,200 | 2,400 | 150 | 128 | 10.4 | DDR | 128 | 8.1 | 1.3 | 18.00 | ? |
| GeForce4 Ti4600 8x (Ti4800[c]) | January 20, 2003 | NV28 | 63 | 101 | AGP 8x | 300 | 325 | 4:2:8:4 | 1,200 | 1,200 | 2,400 | 150 | 128 | 10.4 | DDR | 128 | 8.1 | 1.3 | 18.00 | 43 |
- ^ Pixel shaders: vertex shaders: texture mapping units: render output units
- ^ GeForce4 Ti4400 8x: Card manufacturers utilizing this chip labeled the card as a Ti4800SE. The surface of the chip has "Ti-8x" printed on it.
- ^ GeForce4 Ti4600 8x: Card manufacturers utilizing this chip labeled the card as a Ti4600, and in some cases as a Ti4800. The surface of the chip has "Ti-8x" printed on it, as well as "4800" printed at the bottom.
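The fill-rate columns follow mechanically from the core clock and the core config in note [a] (one pixel per ROP and one texel per TMU per clock). A short sketch of that arithmetic in Python (function and parameter names are ours, for illustration only):

```python
def fill_rates(core_mhz: float, rops: int, tmus: int) -> tuple[float, float]:
    """Theoretical fill rates in (MPixels/s, MTexels/s)."""
    return core_mhz * rops, core_mhz * tmus

# Core config is pixel shaders : vertex shaders : TMUs : ROPs (note [a])
print(fill_rates(300, rops=4, tmus=8))  # Ti4600, config 4:2:8:4 -> (1200.0, 2400.0)
print(fill_rates(275, rops=2, tmus=4))  # MX440, config 2:0:4:2 -> (550.0, 1100.0)
```

The vertex-rate column likewise scales with the core clock (both tables list half a vertex per clock, e.g. 250 MHz giving 125 MVertices/s).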
| Model | nFiniteFX II Engine | Video Processing Engine (VPE) |
|---|---|---|
| GeForce4 MX420 | No | Yes |
| GeForce4 MX440 SE | No | Yes |
| GeForce4 MX4000 | No | Yes |
| GeForce4 PCX4300 | No | Yes |
| GeForce4 MX440 | No | Yes |
| GeForce4 MX440 8X | No | Yes |
| GeForce4 MX460 | No | Yes |
| GeForce4 Ti4200 | Yes | No |
| GeForce4 Ti4200 8x | Yes | No |
| GeForce4 Ti4400 | Yes | No |
| GeForce4 Ti4400 8x | Yes | No |
| GeForce4 Ti4600 | Yes | No |
| GeForce4 Ti4600 8x | Yes | No |
GeForce4 Go mobile GPU series
- Mobile GPUs are either soldered to the mainboard or mounted on a Mobile PCI Express Module (MXM).
- All models are made on a 150 nm fabrication process.
| Model | Launch | Code name | Bus interface | Core clock (MHz) | Memory clock (MHz) | Core config[a] | Fillrate (MOps/s) | Fillrate (MPixels/s) | Fillrate (MTexels/s) | Fillrate (MVertices/s) | Memory size (MB) | Memory bandwidth (GB/s) | Memory bus type | Memory bus width (bit) | Direct3D | OpenGL |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce4 Go 410 | February 6, 2002 | NV17M | AGP 8x | 200 | 200 | 2:0:4:2 | 400 | 400 | 800 | 0 | 16 | 1.6 | SDR | 64 | 7.0 | 1.2 |
| GeForce4 Go 420 | February 6, 2002 | NV17M | AGP 8x | 200 | 400 | 2:0:4:2 | 400 | 400 | 800 | 0 | 32 | 3.2 | DDR | 64 | 7.0 | 1.2 |
| GeForce4 Go 440 | February 6, 2002 | NV17M | AGP 8x | 220 | 440 | 2:0:4:2 | 440 | 440 | 880 | 0 | 64 | 7.04 | DDR | 128 | 7.0 | 1.2 |
| GeForce4 Go 460 | October 14, 2002 | NV17M | AGP 8x | 250 | 500 | 2:0:4:2 | 500 | 500 | 1,000 | 0 | 64 | 8.0 | DDR | 128 | 7.0 | 1.2 |
| GeForce4 Go 488 | October 14, 2002 | NV18M | AGP 8x | 300 | 550 | 2:0:4:2 | 600 | 600 | 1,200 | 0 | 64 | 8.8 | DDR | 128 | 7.0 | 1.2 |
| GeForce4 Go 4200 | November 14, 2002 | NV28M | AGP 8x | 200 | 400 | 4:2:8:4 | 800 | 800 | 1,600 | 100 | 64 | 6.4 | DDR | 128 | 8.1 | 1.3 |
GeForce4 Go driver support
This family is a derivative of the GeForce4 MX family, produced for the laptop market. Performance-wise, the GeForce4 Go family can be considered comparable to the MX line.
One possible solution to the lack of driver support for the Go family is the third-party Omega Drivers. Using third-party drivers can, among other things, invalidate warranties. The Omega Drivers are not supported by laptop manufacturers, laptop ODMs, or Nvidia. Nvidia attempted legal action against a version of Omega Drivers that included the Nvidia logo.[30]
Support
Nvidia has ceased driver support for the GeForce 4 series.
Final drivers
- Windows 9x & Windows Me: 81.98, released on December 21, 2005.
- Product Support List Windows 95/98/Me – 81.98.
- Driver version 81.98 was the last driver Nvidia ever released for Windows 9x/Me; no official releases were made for these systems afterwards.
- Windows 2000, 32-bit Windows XP, 64-bit Windows XP & Media Center Edition: 93.71, released on November 2, 2006.
- Windows 2000, 32-bit Windows XP, 64-bit Windows XP & Media Center Edition: 93.81 (beta), released on November 28, 2006.

The drivers for Windows 2000/XP can also be installed on later versions of Windows such as Windows Vista and 7; however, they do not support desktop compositing or the Aero effects of these operating systems.

Note: Despite claims in the documentation that 94.24 (released on May 17, 2006) supports the GeForce 4 series, it does not (94.24 actually supports only the GeForce 6 and GeForce 7 series).[31]

- Linux/BSD/Solaris: 96.43.23, released on September 14, 2012.
Notes and references

- ^ a b c d e Lal Shimpi, Anand (February 6, 2002). "Nvidia GeForce4 - NV17 and NV25 Come to Life". AnandTech. Archived from the original on May 3, 2010. Retrieved June 14, 2008.
- ^ "NVIDIA GeForce4 Ti 4400 and GeForce4 Ti 4600 (NV25) Review". iXBT Labs. Retrieved June 3, 2025.
- ^ a b "A look at NVIDIA's GeForce4 chips". The Tech Report. February 6, 2002. Archived from the original on March 29, 2019. Retrieved February 19, 2017.
- ^ "ActiveWin.Com: NVIDIA GeForce 4 Ti 4600 - Review". www.activewin.com.
- ^ Worobyew, Andrew; Medvedev, Alexander. "Nvidia GeForce4 Ti 4400 and GeForce4 Ti 4600 (NV25) Review". Pricenfees. Archived from the original on October 12, 2018. Retrieved May 15, 2007.
- ^ Thomas Pabst (February 6, 2002). "PC Graphics Beyond XBOX - NVIDIA Introduces GeForce4". Tom's Hardware. Retrieved February 13, 2025.
- ^ "Gaming Laptops 2017 in India". www.lastlaptop.com. Archived from the original on January 3, 2017. Retrieved January 2, 2017.
- ^ Shimpi, Anand Lal. "NVIDIA GeForce4 Ti 4400/4600 Roundup - April 2002". www.anandtech.com. Archived from the original on July 4, 2010. Retrieved February 13, 2025.
- ^ Shimpi, Anand Lal. "NVIDIA GeForce4 Ti 4200 Roundup - June 2002". www.anandtech.com. Archived from the original on June 3, 2024. Retrieved February 13, 2025.
- ^ Connolly, Chris. The GeForce4’s Last Gasp : MSI’s GeForce4 Ti4800 / Ti4600-8X, GamePC, January 20, 2003.
- ^ R., Jason. MSI GeForce4 Ti4800SE 8X VIVO Video Card, Extreme Overclocking, March 30, 2003.
- ^ "GeForce4 Go". Nvidia.com. Retrieved May 15, 2007.
- ^ Witheiler, Matthew (November 14, 2002). "Nvidia GeForce4 4200 Go: Bringing mobile gaming to new heights". AnandTech. Archived from the original on July 8, 2004. Retrieved June 14, 2008.
- ^ Shimpi, Anand Lal. "Sub-$200 Video Card Roundup - April 2002". www.anandtech.com. Archived from the original on July 2, 2017. Retrieved February 13, 2025.
- ^ Freeman, Vince. Nvidia GeForce4 Ti 4200 Review Archived June 24, 2003, at the Wayback Machine, Sharky Extreme, April 26, 2002.
- ^ Wasson, Scott. ATI's Radeon 9500 Pro graphics card: DirectX 9 goes mainstream Archived July 6, 2007, at the Wayback Machine, Tech Report, November 27, 2002.
- ^ Gasior, Geoff. ATI's Radeon 9600 Pro GPU: One step forward, two steps back? Archived November 27, 2006, at the Wayback Machine, Tech Report, April 16, 2003.
- ^ Gasior, Geoff. Nvidia's GeForce FX 5200 GPU: Between capability and competence Archived August 9, 2007, at the Wayback Machine, Tech Report, April 29, 2003.
- ^ "GeForce FX 5200 Reviewed - Slashdot". April 30, 2003.
- ^ "GeForce4 MX". Nvidia.com. Retrieved June 14, 2008.
- ^ "John Carmack: Don't Buy a GeForce4-MX for Doom 3 – OSnews". Retrieved February 13, 2025.
- ^ https://gamespot.com/gamespot/stories/news/0,10870,2847063,00.html
- ^ "NVIDIA GeForce4 MX 420 PCI Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ a b c d Archived copy Archived 2023-07-11 at the Wayback Machine
- ^ "NVIDIA GeForce4 MX 440-8x Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "mx4408x-64bit166-Mhz". ImgBB. Archived from the original on December 25, 2022. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce4 Ti 4200 Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "NVIDIA GeForce4 Ti 4200-8X Specs". TechPowerUp. Retrieved August 30, 2024.
- ^ "GeForce 4 Ti 4200-8X gpu-z screen (Triplex Millennium)". VC Collection (in Russian). Retrieved October 6, 2025.
- ^ "Der Fall Omega vs. Nvidia". WCM - Das österreichische Computer Magazin (in German). WCM. July 24, 2003. Retrieved April 12, 2007.
"The case Omega vs. Nvidia (English translation)". WCM - Austrian Computers the Magazine. WCM. July 24, 2003. Retrieved April 12, 2007. - ^ "Driver Details". NVIDIA.