Meet The EVGA GeForce GTX 1660 Super SC Ultra

Since this latest GTX 1660 series card launch is a virtual launch like the others, NVIDIA is once again sampling partner cards to the press. For the GTX 1660 Super launch, we received EVGA’s GeForce GTX 1660 Super SC Ultra, a relatively straightforward dual-slot, dual-fan card.

GeForce GTX 1660 Super Card Comparison

                 GTX 1660 Super              EVGA GTX 1660 Super SC Ultra
                 (Reference Specification)
Base Clock       1530MHz                     1530MHz
Boost Clock      1785MHz                     1830MHz
Memory Clock     14Gbps GDDR6                14Gbps GDDR6
TDP              125W                        125W
Length           N/A                         8"
Width            N/A                         2-Slot
Cooler Type      N/A                         Open Air, Dual Fan
Price            $229                        $229

The EVGA GeForce GTX 1660 Super SC Ultra – or, for the sake of brevity in this article, the EVGA SC Ultra – is pretty typical of dual-fan cards in this price range and performance class. At just under 8 inches long, the card isn’t particularly massive, and is just big enough to fit a pair of 90mm fans.

EVGA typically offers both reference-clocked (Black) and factory overclocked versions of their GTX 1660 cards, and the model we’re looking at is the latter, sporting an official boost clock of 1830MHz, 45MHz higher than the reference GTX 1660 Super specification. Notably, the card’s base clock remains unchanged at 1530MHz, but since the card is turboing all the time anyhow, it’s the boost clock that tells us what we need to know. By the numbers then, EVGA’s SC Ultra has a 2.5% GPU clockspeed advantage over reference cards, so it will perform just a bit better (and further close the gap with the GTX 1660 Ti), though not significantly so given such a mild factory overclock.
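
For readers who want to sanity-check the numbers, the uplift works out as follows (a quick sketch using the clocks from the specification table above):

```python
# Factory overclock uplift for the EVGA SC Ultra vs. the reference spec
reference_boost_mhz = 1785  # NVIDIA reference GTX 1660 Super boost clock
evga_boost_mhz = 1830       # EVGA SC Ultra official boost clock

uplift_mhz = evga_boost_mhz - reference_boost_mhz
uplift_pct = 100 * uplift_mhz / reference_boost_mhz

print(f"{uplift_mhz} MHz higher boost, a {uplift_pct:.1f}% factory overclock")
# → 45 MHz higher boost, a 2.5% factory overclock
```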

Meanwhile, it’s worth pointing out that while EVGA has cranked up the GPU clockspeed a bit, it doesn’t appear that they’ve deviated from NVIDIA’s reference TDP this time. So unlike their reference-clocked but TDP-increased XC cards that we looked at earlier this year, the SC Ultra doesn’t have any more power headroom to play with than reference cards. Thankfully for EVGA, the GTX 1660 series cards haven’t been terribly TDP-constrained to begin with – we regularly beat NVIDIA’s official boost clocks – but it does mean that the card is riding on the fact that it’s clocked higher at any given voltage, which is essentially an efficiency play.

Digging down further, the SC Ultra is equipped with a fairly typical aluminum heatsink with vertical fins, and a pair of heatpipes running through it. The heatpipes are connected to the TU116 GPU via a copper (or copper-plated) baseplate, with the heatpipes and baseplate transferring heat into the heatsink proper. EVGA is also providing direct cooling for the 6 GDDR6 chips surrounding the GPU, using thermal pads to connect them to the heatsink.

Speaking of cooling, unlike the XC cards we looked at earlier this year, for their SC cards EVGA has implemented zero fan speed idle functionality, so the card’s fans will shut off when the card is idling at a low temperature. This is an increasingly common and always appreciated feature for video cards, as it allows a card to be effectively silent at idle or under light workloads.

Elsewhere, on the back of the card we find an EVGA backplate. It’s still uncommon to see backplates among non-premium midrange cards like this, so EVGA is in fairly rare company. In the case of the SC Ultra however, it’s also necessary, as the heatsink is only mounted to the PCB directly around the GPU, so the backplate is needed to keep the card and its PCB from flexing.

For power, the card relies on an 8-pin external PCIe power cable, as well as PCIe slot power. From a practical perspective this is overkill for a card with just a 125W TDP – a 6-pin connector would work just fine here – but it’s consistent with NVIDIA’s requirements for their other GTX 1660 cards.
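
As a rough illustration of why the 8-pin is overkill, the available power budget can be tallied from the PCIe specification limits (75W from the slot, 150W from an 8-pin connector). Note this is a sketch of the spec ceilings, not EVGA's actual BIOS power limits:

```python
# Rough power budget vs. TDP for the SC Ultra (spec limits, not measured values)
slot_power_w = 75   # PCIe x16 slot limit per the PCIe spec
eight_pin_w = 150   # 8-pin PCIe power connector limit
six_pin_w = 75      # 6-pin connector limit, for comparison
tdp_w = 125         # GTX 1660 Super reference TDP

budget_8pin = slot_power_w + eight_pin_w  # what the card ships with
budget_6pin = slot_power_w + six_pin_w    # what would also suffice on paper

print(f"8-pin config: {budget_8pin} W available for a {tdp_w} W card")
print(f"6-pin config: {budget_6pin} W would still cover the {tdp_w} W TDP")
```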

Finally for hardware features, for display I/O we’re looking at EVGA’s usual GTX 1660 setup of a DisplayPort, an HDMI port, and a DL-DVI-D port. While DVI ports have long been banished from high-end cards, they still sometimes show up on midrange cards so that they can be used directly with the DL-DVI monitors that are still out in the world. The tradeoff, as always, is that the DVI port takes up space that could otherwise be filled by more DisplayPorts, so you’re only going to be able to drive up to two modern monitors with the SC Ultra.

As for software, EVGA of course includes their latest EVGA Precision X1 software. Among its many features, Precision X1 allows modifying the voltage-frequency curve and scanning for auto-overclocking as part of Turing’s GPU Boost 4. Of course, NVIDIA’s restriction of actual overvolting is still in place, and for Turing there is a cap at 1.068v. As far as overclocking software goes, EVGA’s software remains one of the best NVIDIA overclocking tools on the market, and it’s little surprise that when NVIDIA wants to show off new overclocking features, they frequently turn to EVGA to deliver the necessary software.

Finally, on a quick housekeeping note, I should point out that for today’s review we’re actually on our second SC Ultra. While the original card sampled by NVIDIA worked just fine for testing from a performance perspective, under sustained loads it got hotter and louder than any 125W card should. EVGA has since swapped it out for a second card, which dialed back the temperatures by several degrees, as well as a few decibels of noise. While it’s still not a quiet card, as we’ll see in the power/temperature/noise section of the review, our second sample was a lot more reasonable and is what we’d expect to see from a mid-TDP dual-fan card.

Comments

  • eastcoast_pete - Wednesday, October 30, 2019 - link

    Thanks Ryan, and sorry, the 1660 was already "all Turing", so my question was redundant. I meant to ask about the 1650 Super. If that GPU remains unchanged, it still is a cut Turing GPU with Volta NVENC.
  • timecop1818 - Wednesday, October 30, 2019 - link

    Actually 1660 (not super) already has the Turing NVDEC/NVENC, because it's the first card which can handle 8K60P decode with ~70% NVDEC utilization. On 1080/1080Ti (Pascal) this runs at around 40fps and 100% utilization.
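
Taking the commenter's ~70% NVDEC utilization figure at face value, a back-of-the-envelope estimate of the decode headroom looks like this (assuming utilization scales roughly linearly with frame rate, which is a simplification):

```python
# Back-of-the-envelope NVDEC headroom from the utilization figure above
target_fps = 60      # 8K60 decode target
utilization = 0.70   # reported NVDEC utilization at 8K60 on Turing

# If utilization scaled linearly with frame rate (a rough assumption),
# the decoder would saturate at approximately:
max_fps_estimate = target_fps / utilization
print(f"~{max_fps_estimate:.0f} fps at 8K before NVDEC saturates (rough estimate)")
# → ~86 fps at 8K before NVDEC saturates (rough estimate)
```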

  • timecop1818 - Wednesday, October 30, 2019 - link

    I'm surprised nobody said "Fuck DVI" yet.
    At least about 1/3 of the AIB makers finally dumped that retardo connector.
    I bought a gigabyte? or something GTX1660 and it was finally a proper card with 3x DP and 1x HDMI.
  • Korguz - Wednesday, October 30, 2019 - link

    considering monitors are STILL made with vga... thats what they should stop making before they drop dvi...
  • Gastec - Wednesday, October 30, 2019 - link

    And monitors could still be made with VGA connectors for 50 more years to come, and the U.S. Military just dumped the floppy disk from their nuclear missile controls.
    From my experience, both at work and in private life, this lack of knowledge and desire to upgrade is not an exception but the rule. I have a friend who just turned 30; he knows every social networking trick and setting for his smartphone, but connects his laptop to a small monitor via the bundled VGA cable that came in the box. He didn't even know the monitor had a DVI port, or what that even is.
  • MaikelSZ - Wednesday, October 30, 2019 - link

    My monitor has VGA, DVI and HDMI. To this day, all the HDMI connections that I have used on PCs and TVs have given me problems of some kind.
    An interesting problem that I have seen 3 different times in 3 different places was that in one orientation the cable gives image problems (small distortions), or the TV even loses the image for a couple of seconds every so often. If the cable was reversed, the problem disappeared.

    My graphics card has 3 DP and 1 HDMI, and I use a DP-DVI converter for my monitor; I don't use the HDMI. I only use HDMI when I connect 2 monitors, one via HDMI and the other using DVI.
  • grazapin - Wednesday, November 6, 2019 - link

    That sounds like a bad HDMI cable. More specifically, one end of the cable is bad and has intermittent connection problems. When you plug the bad end into the laptop it will flake out, because the cable is more likely to be bumped or jiggled, and is likely bending to the side and pulling on the connector. When you plug the bad end into the monitor, the cable is more likely to be straight, so it's not putting stress on the connection and doesn't get jostled after you plug it in, while the good end is out where the jostling occurs. Replace that HDMI cable and I bet your problems go away.
  • Nirman04 - Wednesday, October 30, 2019 - link

    It will be interesting to see the effect this has on the market. If a 1660 Super is only $10 more than the "vanilla" 1660 and yet performs closer to the Ti card, which is $60 more than the 1660, I can't see anyone buying the 1660 now, never mind the Ti. Clearly a lot will come down to the pricing from individual manufacturers, who could now cut the price of the 1660 and 1650, but it looks like there is now competition even between Nvidia cards, let alone competition with AMD.
  • Larry Litmanen - Friday, November 1, 2019 - link

    To me it is wait and see, what if Stadia works.

    Why spend money on something that will not run any games in 2 years.

    My gtx 960 can run games only on lowest settings on a Dell U3415w, it basically stopped running new games around 2017.

    The card works, it's just not powerful enough; frankly I have no desire to spend $220 on something that will be useless in 2 years.
  • dromoxen - Friday, November 8, 2019 - link

    Too fast development of new gfx cards renders the older cards redundant too soon. I have gtx960 with 4gb for futureproof *sigh* . This HAS to stop.
