The NVIDIA GeForce GTX 1650 Super Review, Feat. Zotac Gaming: Bringing Balance To 1080p
by Ryan Smith on December 20, 2019 9:00 AM EST
Bringing 2019 to a close in the GPU space, we have one final video card review for the year: NVIDIA's GeForce GTX 1650 Super. The last of the company's mid-generation kicker cards to refresh the lineup in the second half of 2019, the GTX 1650 Super is designed to shore up NVIDIA's position in the sub-$200 video card market, offering solid performance for 1080p gaming without breaking the bank. At the same time, it's also NVIDIA's response to AMD's new Radeon RX 5500 XT series of cards, which, having landed last week, significantly outperforms the original GTX 1650.
Like the other Super cards, these refresh parts serve to bolster NVIDIA's competitive positioning against AMD's Radeon RX 5000 series cards, offering improved performance-per-dollar at every tier. NVIDIA has taken some flak for uncompetitive pricing, and not unearned; perhaps nowhere more so than with the original GTX 1650, which, although easily the best 75W card on the market, has always been surpassed in value by AMD's cards: first the last-generation Polaris cards, and now the new Navi cards. So with the GTX 1650 Super, NVIDIA and its partners finally get a chance to rectify this with a more competitive part that, while no longer fitting into the original's 75W niche, offers better performance all around.
NVIDIA GeForce Specification Comparison

| | GTX 1660 | GTX 1650 Super | GTX 1650 | GTX 1050 Ti |
|---|---|---|---|---|
| Memory Clock | 8Gbps GDDR5 | 12Gbps GDDR6 | 8Gbps GDDR5 | 7Gbps GDDR5 |
| Memory Bus Width | 192-bit | 128-bit | 128-bit | 128-bit |
| Single Precision Perf. | 5 TFLOPS | 4.4 TFLOPS | 3 TFLOPS | 2.1 TFLOPS |
| Manufacturing Process | TSMC 12nm "FFN" | TSMC 12nm "FFN" | TSMC 12nm "FFN" | Samsung 14nm |
From a pure hardware perspective, perhaps the most interesting thing about the GTX 1650 Super is that, unlike the other Super cards, NVIDIA is giving the new card a much bigger jump in performance over its predecessor. With a 46% increase in GPU throughput and faster 12Gbps GDDR6 memory, the GTX 1650 Super is, relatively speaking, much farther ahead of the GTX 1650 than what we saw with October's GTX 1660 Super launch.
The single biggest change here is the GPU. While NVIDIA is calling the card a GTX 1650, in practice it's more like a GTX 1660 LE; NVIDIA has brought in the larger, more powerful TU116 GPU from the GTX 1660 series to fill out this card. There are cost and power consequences to this (at 284mm², TU116 is a very large chip to be selling in a $159 product), but the payoff is that it gives NVIDIA a lot more SMs and CUDA Cores to work with. Coupled with that is a small bump in clockspeeds, which pushes the on-paper shader/compute throughput numbers up by just over 46%.
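The on-paper throughput figures can be sanity-checked from the published core counts and boost clocks. A rough sketch, treating the exact reference clocks (~1725MHz for the GTX 1650 Super, ~1665MHz for the GTX 1650) as assumptions:

```python
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    # Peak FP32 = 2 FLOPs (one fused multiply-add) per CUDA core per clock.
    return 2 * cuda_cores * boost_ghz / 1e3

super_tflops = fp32_tflops(1280, 1.725)    # ~4.4 TFLOPS
vanilla_tflops = fp32_tflops(896, 1.665)   # ~3.0 TFLOPS
uplift = super_tflops / vanilla_tflops - 1
print(f"{uplift:.0%}")
```

Using raw clocks this works out to roughly 48%; comparing the rounded 4.4 vs 3 TFLOPS figures in the table above gives the "just over 46%" cited in the text.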
Such a large jump in GPU throughput also requires a lot more memory bandwidth to feed the beast. As a result, just like the GTX 1660 Super, the GTX 1650 Super gets the GDDR6 treatment as well. Here NVIDIA is using slightly slower (and lower power) 12Gbps GDDR6, which is attached to the GPU via a neutered 128-bit memory bus. Still, this one change gives the GTX 1650 Super 50% more memory bandwidth than the vanilla GTX 1650, very close to its increase in shader throughput.
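The 50% figure falls straight out of the per-pin data rates, since both cards share the same 128-bit bus. A quick sketch:

```python
def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    # Effective data rate per pin x bus width, converted from bits to bytes.
    return data_rate_gbps * bus_width_bits / 8

super_bw = mem_bandwidth_gbs(12, 128)   # 12Gbps GDDR6 -> 192 GB/s
vanilla_bw = mem_bandwidth_gbs(8, 128)  # 8Gbps GDDR5  -> 128 GB/s
print(super_bw / vanilla_bw - 1)        # 0.5, i.e. the 50% uplift
```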
Do note, however, that not all aspects of the GPU are being scaled out to the same degree. In particular, the GTX 1650 Super still only has 32 ROPs, with the rest of TU116's ROPs getting cut off along with its spare memory channels. This means that while the GTX 1650 Super has 46% more shader performance, it has only 4% more ROP throughput for pushing pixels. Counterbalancing this to a degree will be the big jump in memory bandwidth, which helps to keep those 32 ROPs well-fed, but at the end of the day the GPU is getting an uneven increase in resources, and gaming performance gains do reflect this at times.
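With the ROP count fixed at 32 on both cards, the only pixel-throughput gain comes from the clock bump. A sketch, again assuming the reference boost clocks:

```python
def pixel_fill_gpix(rops: int, boost_ghz: float) -> float:
    # Peak pixel fill rate: one pixel per ROP per clock.
    return rops * boost_ghz

super_fill = pixel_fill_gpix(32, 1.725)    # ~55.2 Gpix/s
vanilla_fill = pixel_fill_gpix(32, 1.665)  # ~53.3 Gpix/s
print(super_fill / vanilla_fill - 1)       # ~0.036, the "4% more" figure
```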
The drawback to all of this, then, is power consumption. While the original GTX 1650 is a 75 Watt card – making it the fastest thing that can be powered solely by a PCIe slot – the Super-sized card is officially a 100 Watt product. This gives up the original GTX 1650's unique advantage, and it means builders looking for even faster 75W cards won't get their wish, but that additional power is the price of the GTX 1650 Super's higher performance. Traditionally, NVIDIA has held pretty steadfast at 75W for their xx50 cards, but then again, despite the name, this card is closer to a lightweight GTX 1660 than it is a GTX 1650.
Speaking of hardware features, besides giving NVIDIA a good deal more in the way of GPU resources to play with, the switch from the TU117 GPU to the TU116 GPU also has one other major ramification that some users will want to pay attention to: video encoding. Unlike TU117, which got the last-generation NVENC Volta video encoder block for die space reasons, TU116 gets the full-fat Turing NVENC video encoder block. Turing’s video encode block has been turning a lot of heads for its level of quality – while not archival grade, it’s competitive with x264 Medium – which is important for streamers. This also led to TU117 and the GTX 1650 being a disappointment in some circles, as an otherwise solid video card was made far less useful for video encoding. So with the GTX 1650 Super, NVIDIA is resolving this in a roundabout way, thanks to the use of the more powerful TU116 GPU.
Product Positioning & The Competition
As is always the case in the lower-end segment of NVIDIA’s product stack, the GTX 1650 Super is a pure virtual launch. This means that NVIDIA hasn’t put together a retail reference design, and all of the cards are based on partner designs.
At this point, the partners have been shipping TU116 and TU117-based cards for over 8 months, so they have been able to hone their GTX 16-series designs. This means that they’ve been able to hit the ground running, with existing designs ready for quick modification or straight reuse right away. The net result is that the newest GTX 1650 Supers look like and are built like the GTX 1660 and GTX 1650 cards that have preceded them.
Within NVIDIA’s product stack then, the GTX 1650 Super is not a wholesale replacement for the GTX 1650 – that card is still sticking around – but the GTX 1650 Super is going to be the value option for this performance segment. For consumers and OEMs who need 75W cards (for cooling or power reasons), then the $149 GTX 1650 remains the best choice. For everyone else, the GTX 1650 Super offers a whole lot more performance for $10 more. And while it’s not going to be performing on the same level as NVIDIA’s $200+ GTX 1660 cards, the GTX 1650 Super packs enough horsepower that it’s not going to be too far behind.
NVIDIA GeForce 20/16 Series (Turing) Product Stack

| RTX 20 Series | GTX 16 Series |
|---|---|
| RTX 2080 Ti | GTX 1660 Ti |
| RTX 2080 Super | GTX 1660 Super |
| RTX 2070 Super | GTX 1660 |
| RTX 2060 Super | GTX 1650 Super |
| RTX 2060 | GTX 1650 |
It's looking outside of NVIDIA's product stack where we find the real competition for the GTX 1650 Super: AMD's new Radeon RX 5500 XT series cards, particularly the $169 4GB model. While NVIDIA did not directly call out AMD when first revealing the GTX 1650 Super, the timing – between the RX 5500 series announcement and the RX 5500 XT launch – leaves no doubt in that respect. NVIDIA has been seemingly content to let AMD hold the $150-$200 market with their RX 580/570 cards, but with the latest AMD launch, that passive positioning has come to an end.
As I noted in last week's RX 5500 XT review, there's not quite a 1:1 match between Radeon and GeForce parts right now. The 4GB RX 5500 XT is $10 more expensive than the GeForce GTX 1650 Super, and as Ian Cutress noted when we were discussing this article earlier this week, in the sub-$200 market customers are typically buying what they can afford, so even $10 matters in some cases. Still, it's in NVIDIA's best interests to meet or beat the RX 5500 XT 4GB on performance, to deny AMD that edge. Meanwhile, NVIDIA generally has the edge on energy efficiency, though it's no longer the one-sided fight it was against AMD's Polaris-based RX 500 series cards.
Finally, the wild card factor here is once again going to be gaming bundles. NVIDIA doesn’t offer one, but AMD does. Along with a 3-month trial for Microsoft’s Xbox Games Pass program, the company is bundling the forthcoming “Master Edition” of Monster Hunter: Iceborne. We don’t often see game bundles with sub-$200 cards, so the inclusion of one can be a powerful factor in this segment of the market, since a game is a more significant fraction of the value of a card.
Though with most of Newegg's video card stock being anything but in stock, just getting a card is a challenge right now. The first wave of GTX 1650 Super cards has sold well, so the video card retailer has all of two models of GTX 1650 Super in stock, and Amazon is much the same.
Holiday 2019 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| Radeon RX 5700 | $319 | GeForce RTX 2060 |
| | $279 | GeForce GTX 1660 Ti |
| | $229 | GeForce GTX 1660 Super |
| Radeon RX 5500 XT 8GB | $199/$209 | GeForce GTX 1660 |
| Radeon RX 5500 XT 4GB | $169/$159 | GeForce GTX 1650 Super |
| | $149 | GeForce GTX 1650 |
WetKneeHouston - Monday, January 20, 2020
I got a 1650 Super over the 580 because it's more power efficient, and anecdotally I've experienced better stability with Nvidia's driver ecosystem.
yeeeeman - Friday, December 20, 2019
It is as if AMD didn't have a 7nm GPU, but a 14nm one.
philosofool - Friday, December 20, 2019
Can we not promote the idea, invented by card manufacturers, that everyone who isn't targeting 60fps and high settings is making a mistake? Please publish some higher resolution numbers for those of us who want that knowledge. Especially at the sub-$200 price point, many people are primarily using their computers for things other than games and gaming is a secondary consideration. Please let us decide which tradeoffs to make instead of making assumptions.
Dragonstongue - Friday, December 20, 2019
100% agreed on this.
Up to the consumers themselves how where and why they will use the device as they see fit, be it gaming or streaming or "mundane" such as watching videos or even for emulation purposes, sometimes even "creation" purposes,
IMO is very related to the same BS crud smartphone makers use(used) to ditch 3.5mm jacks "customers do not want them anymore, and with limited space we had no choice"
so instead of adjusting the design to keep the 3.5mm jack AND a large enough battery, they remove the jack, limit the battery size, ~95% are all fully sealed so you cannot replace the battery, and nearly all of them these days are "glass" that is by design pretty but also stupid easy to break, so you have no choice but to make a very costly repair and/or buy a new one.
with GPU they CAN make sure there are DL-DVI connector HDMI full size DP port (with maybe 1 mini DP)
they seem to "not bother" citing silly reasons "it is impossible / customers no longer want this"
As well as you point out..the consumer decides the usage case, provide the best possible product, give the best possible NO BS review/test data and we the consumer will see or not see therefore decide with the WALLET if it is worth it or not.
Likely save much $$$$$$$$$$ and consumer <3 by virtue of not buying something they will inadvertently regret using in the first place.
Hell I am using and gaming with a Radeon 7870 @ 144Hz monitor 1440p (it only runs at 60Hz due to not fully supporting higher than this) However I still manage to game on it "just fine" maybe not ultra spec everything, but comfortably (for me) high to medium "tweaked" settings.
Amazing how long this last when they are built properly and not crap kicked out of it...that and well not having hundreds to thousands to spend every year or so (which is most people these days) should mean so much more to these mega corps than "let us sell something that most folks really do not need, let us make it right and upgrades will happen when they really need to instead of just ending in the E trash can in a few months time"
timecop1818 - Friday, December 20, 2019
DVI? No modern card should have that garbage connector. Just let it die already.
Korguz - Friday, December 20, 2019
yea ok sure... so you still want the vga connector instead ???
Qasar - Friday, December 20, 2019
dvi is a lot more useful than the VGA connector that monitors STILL come with. but yet we STILL have those on new monitors. no modern monitor should have that garbage connector
The_Assimilator - Saturday, December 21, 2019
No VGA. No DVI. DisplayPort and HDMI, or GTFO.
Korguz - Sunday, December 22, 2019
vga.. dead connector, limited use case, mostly business... dvi.. still useful, specially in KVMs... havent seen a display port KVM.. and the HDMI KVM, died a few months after i got it.. but the DVI KVMs i have.. still work fine. each of the 3, ( dvi, hdmi and display port ) still have their uses..
Spunjji - Monday, December 23, 2019
DisplayPort KVMs exist. More importantly, while it's trivial to convert a DisplayPort output to DVI for a KVM, you simply cannot fit the required bandwidth for a modern high-res DP monitor through a DVI port.
DVI ports are large, low-bandwidth and have no place on a modern GPU.