GeForce GTX 16 series

GeForce 16 series
Image caption: Top, logo of the series; bottom, an Nvidia GeForce GTX 1660 ASUS TUF Gaming X3 released in 2019, a higher-end model of the series.
Release date: February 22, 2019
Discontinued: March 5, 2024[1]
Manufactured by: TSMC
Designed by: Nvidia
Marketed by: Nvidia
Codename: TU11x
Architecture: Turing
Models: GeForce GTX series
Transistors:
  • 4.7B (TU117)
  • 6.6B (TU116)
  • 10.8B (TU106)
Fabrication process: TSMC 12 nm (FinFET)
Cards
Entry-level:
  • GeForce GTX 1630
  • GeForce GTX 1650
  • GeForce GTX 1650 Super
  • GeForce GTX 1650 Ti (laptop only)
Mid-range:
  • GeForce GTX 1660
  • GeForce GTX 1660 Super
  • GeForce GTX 1660 Ti
API support
Direct3D: Direct3D 12.0 (feature level 12_1), Shader Model 6.7
OpenCL: OpenCL 3.0[2][a]
OpenGL: OpenGL 4.6[3]
Vulkan
History
Predecessor: GeForce 10 series
Variant: GeForce RTX 20 series
Successor: GeForce RTX 30 series
Support status: Supported

The GeForce GTX 16 series is a series of graphics processing units (GPUs) developed by Nvidia, based on the Turing microarchitecture and announced in February 2019.[5] Sold alongside the GeForce RTX 20 series, the GTX 16 series covers the entry-level and mid-range segments of the market that the RTX 20 series does not address. As a result, the media have mainly compared it with AMD's Radeon RX 500 series of GPUs.

The GeForce GTX 16 series comprises the GTX 1650, 1650 Super, 1660, 1660 Super, 1660 Ti, and the lower-end GTX 1630, which was released later. The GTX 1650 is available in both GDDR5 and GDDR6 versions.

Like the GeForce RTX 20 series, the GeForce GTX 16 series was followed by the GeForce RTX 30 series. The 16 series was the last generation of GeForce GPUs without support for hardware-accelerated real-time ray tracing, and it was therefore marketed under the GTX prefix rather than RTX.

Architecture


The GeForce GTX 16 series is based on the same Turing architecture used in the GeForce RTX 20 series, but omits the Tensor (AI) and RT (ray tracing) cores exclusive to the 20 series. The 16 series does, however, retain the dedicated integer cores used for concurrent execution of integer and floating-point operations.[6] On March 18, 2019, Nvidia announced that a driver update in April 2019 would enable DirectX Raytracing on 16 series cards starting with the GTX 1660, together with certain cards of the 10 series, a feature reserved for the RTX series up to that point.[7]
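
Whether a given card and driver combination exposes this capability can be checked from an application by querying the Direct3D 12 raytracing tier. The following C++ sketch uses the standard D3D12 CheckFeatureSupport call (link with d3d12.lib); it is a minimal, generic illustration of that query rather than Nvidia-specific code, and adapter selection and error handling are simplified.

    #include <cstdio>
    #include <d3d12.h>
    #include <wrl/client.h>

    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a Direct3D 12 device on the default adapter.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device)))) {
            std::puts("No Direct3D 12 device available.");
            return 1;
        }

        // OPTIONS5 reports the DXR raytracing tier exposed by the installed driver.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &options5, sizeof(options5))) &&
            options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
            // On GTX 16 series cards this tier is reported once the April 2019
            // driver is installed; ray traversal then runs on the shader cores
            // rather than on dedicated RT cores.
            std::puts("DirectX Raytracing is exposed by this device/driver.");
        } else {
            std::puts("DirectX Raytracing is not exposed by this device/driver.");
        }
        return 0;
    }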

Products


The GeForce GTX 16 series launched on February 22, 2019, with the announcement of the GeForce GTX 1660 Ti.[8] The cards connect to the CPU over a PCIe 3.0 x16 interface and are produced on TSMC's 12 nm FinFET process. On April 22, 2019, coinciding with the announcement of the GTX 1650, Nvidia announced laptops equipped with built-in GTX 1650 GPUs.[9] The TU117 die does not support Nvidia Optical Flow,[10] which is useful for motion-interpolation software. All TU117 GPUs use Volta's NVENC encoder (a marginal improvement over Pascal's) instead of Turing's.[11]

Desktop

GeForce GTX 1630[12][13]
  Launch: Jun 28, 2022; launch price: ?; code name: TU117-150; transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 512:32:16, 8 SM; L2 cache: 1 MB; core clock[c]: 1740 (1785) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 27.84 (28.56) Gpx/s pixel, 55.68 (57.12) Gtex/s texture
  Memory: 4 GB GDDR6, 64-bit bus, 96 GB/s
  Processing power (GFLOPS, half/single/double precision): 3564 (3656) / 1782 (1828) / 55.68 (57.12)
  TDP: 75 W

GeForce GTX 1650 (GDDR5)[11][14][15][16]
  Launch: Apr 23, 2019; launch price: US$149[17]; code name: TU117-300; transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 896:56:32, 14 SM; L2 cache: 1 MB; core clock[c]: 1485 (1665) MHz; memory clock: 8.0 GT/s
  Fillrate[d][e]: 47.52 (53.28) Gpx/s pixel, 83.16 (93.24) Gtex/s texture
  Memory: 4 GB GDDR5, 128-bit bus, 128 GB/s
  Processing power (GFLOPS, half/single/double precision): 5322 (5967) / 2661 (2984) / 83.16 (93.24)
  TDP: 75 W

GeForce GTX 1650 (GDDR6)[11][14][15][16]
  Launch: Apr 3, 2020[18] (TU117-300); Jul 1, 2020[19] (TU116-150[20]); Jun 29, 2020[21] (TU106-125[22])
  Launch price: US$149 (TU106-125 variant: ?)
  Transistors / die size: 4.7 billion / 200 mm² (TU117-300); 6.6 billion / 284 mm² (TU116-150); 10.8 billion / 445 mm² (TU106-125)
  Core config[b]: 896:56:32, 14 SM; L2 cache: 1 MB; core clock[c]: 1410 (1590) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 45.12 (50.88) Gpx/s pixel, 78.96 (89.04) Gtex/s texture
  Memory: 4 GB GDDR6, 128-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 5053 (5699) / 2527 (2849) / 78.96 (89.04)
  TDP: 75 W (TU117-300); 80 W (TU116-150); 90 W (TU106-125)

GeForce GTX 1650 Super[23][24]
  Launch: Nov 22, 2019; launch price: US$159; code name: TU116-250; transistors: 6.6 billion; die size: 284 mm²
  Core config[b]: 1280:80:32, 20 SM; L2 cache: 1 MB; core clock[c]: 1530 (1725) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 48.96 (55.20) Gpx/s pixel, 122.4 (138.0) Gtex/s texture
  Memory: 4 GB GDDR6, 128-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 7834 (8832) / 3917 (4416) / 122.4 (138.0)
  TDP: 100 W

GeForce GTX 1660[8][25]
  Launch: Mar 14, 2019; launch price: US$219; code name: TU116-300; transistors: 6.6 billion; die size: 284 mm²
  Core config[b]: 1408:88:48, 22 SM; L2 cache: 1.5 MB; core clock[c]: 1530 (1785) MHz; memory clock: 8.0 GT/s
  Fillrate[d][e]: 73.44 (85.68) Gpx/s pixel, 134.6 (157.1) Gtex/s texture
  Memory: 6 GB GDDR5, 192-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 8617 (10053) / 4309 (5027) / 134.6 (157.1)
  TDP: 120 W

GeForce GTX 1660 Super[26][27]
  Launch: Oct 29, 2019; launch price: US$229; code name: TU116-300; transistors: 6.6 billion; die size: 284 mm²
  Core config[b]: 1408:88:48, 22 SM; L2 cache: 1.5 MB; core clock[c]: 1530 (1785) MHz; memory clock: 14.0 GT/s
  Fillrate[d][e]: 73.44 (85.68) Gpx/s pixel, 134.6 (157.1) Gtex/s texture
  Memory: 6 GB GDDR6, 192-bit bus, 336 GB/s
  Processing power (GFLOPS, half/single/double precision): 8617 (10053) / 4309 (5027) / 134.6 (157.1)
  TDP: 125 W

GeForce GTX 1660 Ti[8][28]
  Launch: Feb 22, 2019; launch price: US$279; code name: TU116-400; transistors: 6.6 billion; die size: 284 mm²
  Core config[b]: 1536:96:48, 24 SM; L2 cache: 1.5 MB; core clock[c]: 1500 (1770) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 72.00 (84.96) Gpx/s pixel, 144.0 (169.9) Gtex/s texture
  Memory: 6 GB GDDR6, 192-bit bus, 288 GB/s
  Processing power (GFLOPS, half/single/double precision): 9216 (10875) / 4608 (5437) / 144.0 (169.9)
  TDP: 120 W

Laptop

GeForce GTX 1650 Max-Q (Laptop, GDDR5)[29][30][31][32]
  Launch: Apr 23, 2019; code name: TU117-750[33] (N18P-G0-MP); transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 1024:64:32, 16 SM; L2 cache: 1 MB; core clock[c]: 1020 (1245) MHz; memory clock: 8.0 GT/s
  Fillrate[d][e]: 32.64 (39.84) Gpx/s pixel, 65.28 (79.68) Gtex/s texture
  Memory: 4 GB GDDR5, 128-bit bus, 128 GB/s
  Processing power (GFLOPS, half/single/double precision): 4178 (5100) / 2089 (2550) / 65.28 (79.68)
  TDP: 35 W

GeForce GTX 1650 Max-Q (Laptop, GDDR6)[29][30][31][32]
  Launch: Apr 15, 2020[34]; code name: TU117-753[35] (N18P-G61-MP2); transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 1024:64:32, 16 SM; L2 cache: 1 MB; core clock[c]: 930 (1125) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 29.76 (36.00) Gpx/s pixel, 59.52 (72.00) Gtex/s texture
  Memory: 4 GB GDDR6, 128-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 3809 (4608) / 1905 (2304) / 59.52 (72.00)
  TDP: 30 W

GeForce GTX 1650 (Laptop, GDDR5)[29][30][36][37]
  Launch: Apr 23, 2019; code name: TU117-750[33] (N18P-G0-MP); transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 1024:64:32, 16 SM; L2 cache: 1 MB; core clock[c]: 1395 (1560) MHz; memory clock: 8.0 GT/s
  Fillrate[d][e]: 44.64 (49.92) Gpx/s pixel, 89.28 (99.84) Gtex/s texture
  Memory: 4 GB GDDR5, 128-bit bus, 128 GB/s
  Processing power (GFLOPS, half/single/double precision): 5714 (6390) / 2857 (3195) / 89.28 (99.84)
  TDP: 50 W

GeForce GTX 1650 (Laptop, GDDR6)[29][30][36][37]
  Launch: Apr 15, 2020[38]; code name: TU117-753[35] (N18P-G61-MP2); transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 1024:64:32, 16 SM; L2 cache: 1 MB; core clock[c]: 1380 (1515) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 44.16 (48.48) Gpx/s pixel, 88.32 (96.96) Gtex/s texture
  Memory: 4 GB GDDR6, 128-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 5653 (6205) / 2826 (3103) / 88.32 (96.96)
  TDP: 50 W

GeForce GTX 1650 Ti Max-Q (Laptop)[30][39][40]
  Launch: Apr 2, 2020; code name: TU117-775[35] (N18P-G62); transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 1024:64:32, 16 SM; L2 cache: 1 MB; core clock[c]: 1035 (1200) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 33.12 (38.40) Gpx/s pixel, 66.24 (76.80) Gtex/s texture
  Memory: 4 GB GDDR6, 128-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 4239 (4915) / 2120 (2458) / 66.24 (76.80)
  TDP: 35 W

GeForce GTX 1650 Ti (Laptop)[30][41][42]
  Launch: Apr 2, 2020; code name: TU117-775[35] (N18P-G62); transistors: 4.7 billion; die size: 200 mm²
  Core config[b]: 1024:64:32, 16 SM; L2 cache: 1 MB; core clock[c]: 1350 (1485) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 43.20 (47.52) Gpx/s pixel, 86.40 (95.04) Gtex/s texture
  Memory: 4 GB GDDR6, 128-bit bus, 192 GB/s
  Processing power (GFLOPS, half/single/double precision): 5530 (6083) / 2765 (3041) / 86.40 (95.04)
  TDP: 55 W

GeForce GTX 1660 Ti Max-Q (Laptop)[30][43][44]
  Launch: Apr 23, 2019; code name: TU116-750[33] (N18E-G0); transistors: 6.6 billion; die size: 284 mm²
  Core config[b]: 1536:96:48, 24 SM; L2 cache: 1.5 MB; core clock[c]: 1140 (1335) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 54.72 (64.08) Gpx/s pixel, 109.4 (128.2) Gtex/s texture
  Memory: 6 GB GDDR6, 192-bit bus, 288 GB/s
  Processing power (GFLOPS, half/single/double precision): 7004 (8202) / 3502 (4101) / 109.4 (128.2)
  TDP: 60 W

GeForce GTX 1660 Ti (Laptop)[30][45][46]
  Launch: Apr 23, 2019; code name: TU116-750[33] (N18E-G0); transistors: 6.6 billion; die size: 284 mm²
  Core config[b]: 1536:96:48, 24 SM; L2 cache: 1.5 MB; core clock[c]: 1455 (1590) MHz; memory clock: 12.0 GT/s
  Fillrate[d][e]: 69.84 (76.32) Gpx/s pixel, 139.7 (152.6) Gtex/s texture
  Memory: 6 GB GDDR6, 192-bit bus, 288 GB/s
  Processing power (GFLOPS, half/single/double precision): 8940 (9769) / 4470 (4884) / 139.7 (152.6)
  TDP: 80 W
Notes
  a. In OpenCL 3.0, OpenCL 1.2 functionality has become a mandatory baseline, while all OpenCL 2.x and OpenCL 3.0 features were made optional.
  b. Shader processors : texture mapping units : render output units, plus the number of streaming multiprocessors (SM).
  c. Core boost values (where available) are stated in brackets after the base value.
  d. Pixel fillrate is calculated as the number of ROPs multiplied by the base (or boost) core clock speed.
  e. Texture fillrate is calculated as the number of TMUs multiplied by the base (or boost) core clock speed.
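
As a worked example of notes d and e, take the desktop GeForce GTX 1660 Ti from the table above (48 ROPs, 96 TMUs, 1500 MHz base clock, 1770 MHz boost clock):

  Pixel fillrate:   48 ROPs × 1.500 GHz = 72.00 Gpx/s  (boost: 48 × 1.770 GHz = 84.96 Gpx/s)
  Texture fillrate: 96 TMUs × 1.500 GHz = 144.0 Gtex/s (boost: 96 × 1.770 GHz ≈ 169.9 Gtex/s)

Both figures match the values listed for that card.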

Reception


GTX 1630


Following its June 2022 release, the GTX 1630 received negative reviews. A two-star review by Jarred Walton in Tom's Hardware criticized its price, especially considering that it was released three years after most of the 16 series while only performing better than the GTX 1050. Walton wrote that "the only real contender for the GTX 1630 is AMD's recently launched Radeon RX 6400, but this new Nvidia card actually makes the lackluster 6400 look good". He concluded that "no one should give the GTX 1630 the time of day" and instead recommended the GTX 1650 and 1650 Super for their dramatically improved specs and performance at a similar price.[47] TechSpot characterized the GTX 1630's specs as a "cutdown version of the GTX 1650" from April 2019, which is "3-year-old silicon that to put it mildly was pretty pathetic back then".[48] Jacob Roach of Digital Trends noted that the GTX 1630, released in 2022, was outperformed by AMD's Radeon RX 470 from 2016.[49]

GTX 1650


Tom's Hardware criticized the GTX 1650, noting that the 8GB variant of the RX 570 is "faster, less expensive and better able to handle games with big memory requirements."[50]

Forbes described the GTX 1650 as "a tempting choice for a super-small and power-efficient gaming PC".[51] However, "when it comes to value and bang for your buck, AMD's RX 570 is the clear choice overall."[51]

GTX 1650 Super


PC Gamer described the GTX 1650 Super as a "no-brainer" and a "super easy recommendation" over the original 1650, as it costs only $10 more and is "consistently around 30 percent faster".[52] However, they criticized the small amount of VRAM (4GB) and noted that users whose PC has no spare PCIe power cable have to "stick with the vanilla 1650", because the Super variant has a higher power consumption (100W vs. 75W).[52] Altogether, it offers "great overall value" and for now "claims the crown for best budget graphics card", being consistently faster than the RX 580 and coming close to the RX 590 when VRAM is not a limiting factor.[52]

According to the German IT online magazine Computerbase [de], the GTX 1650 Super is "faster than a GeForce GTX 1060 and […] delivers comparable FPS to AMD's Radeon RX 590."[53] They praised the GTX 1650 Super's price while criticizing its small amount of VRAM, writing that "The 4 GB memory these days is too small even in this [price] class."[53]

GTX 1660 Super


PC Gamer found that the GTX 1660 Super "almost makes the 1660 Ti look redundant" and noted that "it outperforms anything AMD currently offers in the same price range (it's about 20 percent faster than the RX 590)."[54]

References
