Meet The GeForce GTX 670

Because of GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long, but the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising; if anything it was to be expected.
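To put those numbers in perspective, here’s a quick back-of-the-envelope check using only the figures quoted above (a minimal sketch; the dimensions are published figures rather than our own measurements):

```python
# Sanity-checking the GTX 670's dimensions (all values in inches).
card_length = 9.5    # full card, shroud included
pcb_length  = 6.75   # reference GTX 670 PCB
pcb_deficit = 3.25   # how much shorter the GTX 670 PCB is than the GTX 680's

cooler_overhang   = card_length - pcb_length   # cooler extending past the PCB: 2.75"
gtx680_pcb_length = pcb_length + pcb_deficit   # implied GTX 680 PCB length: 10.0"

print(f'Cooler overhang beyond the PCB: {cooler_overhang:.2f}"')
print(f'Implied GTX 680 PCB length: {gtx680_pcb_length:.2f}"')
```

In other words, nearly 3 inches of the reference card is nothing but cooler.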

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted to use a blower they needed a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners stepping in with open air coolers of their own design.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, but while the two are nearly identical in design, based on our noise tests they’re likely not truly identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler, we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere a moderately sized aluminum heatsink is clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, the GTX 670’s lower TDP has allowed NVIDIA to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.
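As an aside, the memory configuration is easy to reason about from the chip count alone. A minimal sketch, assuming the GTX 670’s standard 2Gb chips and 6GHz (6008MT/s) effective memory data rate:

```python
# Back-of-the-envelope GDDR5 math for the reference GTX 670. The chip count
# comes from the PCB; the 2Gb density and 6008MT/s data rate are the card's
# published specifications, assumed here rather than measured.
chips          = 8      # Hynix R0C GDDR5 packages (4 front, 4 rear)
density_gbit   = 2      # 2Gb per chip
io_width_bits  = 32     # each GDDR5 chip presents a 32-bit interface
data_rate_mtps = 6008   # effective data rate in MT/s

capacity_gb    = chips * density_gbit / 8                    # -> 2.0 GB
bus_width_bits = chips * io_width_bits                       # -> 256-bit
bandwidth_gbps = bus_width_bits / 8 * data_rate_mtps / 1000  # -> ~192.3 GB/sec

print(f"{capacity_gb:.0f}GB over a {bus_width_bits}-bit bus, ~{bandwidth_gbps:.1f}GB/sec")
```

That works out to the same 2GB of memory and roughly 192GB/sec of memory bandwidth as the GTX 680, just split across both sides of a much shorter PCB.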

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.

Comments

  • Morg. - Friday, May 11, 2012 - link

    Nice trolling, C. Disabling ANY unit is like disabling ANY core on a Phenom X3 - or did you really think they only ever disabled core #3?
  • CeriseCogburn - Friday, May 11, 2012 - link

    You just blew your own point. AMD CPUs are no exception either - about 50% of the dual core Phenom IIs (according to this site's reviewers) successfully unlock into stable quad cores (another large percentage is user error, BIOS issues, memory incompatibility, etc.) - so just CUTTING OFF default-on states is not "a failed core" as you so stupidly state.
    Further, we can use your junk mentality about cores to comment on the recent 6950 - which unlocked - apparently their junk cores are junk wasted crap binned down to a 6950 because they sucked so badly they couldn't make themselves into a 6970, right?
    NO, you are not correct.
    I added a couple of AMD examples because I noted your ongoing AMD fanboyism, so I hope to have jolted you out of spewing idiotic FUD with a few thoughts of your favorite little red friends...
    Here with the 680 "cut" into a 670 we have a very SMALL percentage of the core becoming not used, hence the chance that it "just happens to be a failed core that couldn't make the 680 grade" is VERY MINUSCULE indeed.
    The vast majority are just "cut" for the 670, perhaps every single one, since we see they require a high clock and even overclock better than 680s generally, and are therefore likely from a refined and tweaked process.
    I certainly wish you fanboys would stop spreading stupid put-down lies, and then of course only using them against your hated competition.
    There is a BIG DIFFERENCE between disabling two of four cores as AMD did in so, so many AMD CPUs - giving them a chance to use a part that is nearly a full 50% piece of die failure crap (ROFL) - and ...

    Disabling one of 8 functional shader units, leaving 7 of 8 useful, and likely in that case having all 8 useful and just cutting off the access.

    Given AMD's 50% CPU core hacking cutoffs, "failed dies" may in fact apply to that, but NOT to the 670, except by very rare chance indeed.
  • CeriseCogburn - Friday, May 11, 2012 - link

    The GTX680 has been selling just about every day on the egg - the verified reviewers trickle in daily.
    So, what initially looks like a zero to the clueless zero who doesn't count Amazon stock, Tiger Direct stock and about 50 other well-known online vendors - and instead only looks at Newegg, or worse yet just repeats a lie they heard because they liked the sound of it - is not really a zero but a trickle.
    Let us know if it goes much past the 2.5 month trickle of the 79xx series, and then add in an extra 2 weeks because of the absolutely pathetic and ongoing AMD lack of decent drivers... Yes, I'll give nVidia 3 or 4 months, since the cards being bought slowly in the oppressed US economy at least have all their games working at first, SLI working, and even STW2 works - not to mention the added features of target frame rate (use EVGA Precision X) and adaptive v-sync in driver 301.24, out for some time now for ALL nVidia cards all the way back to the massively re-released and re-branded "rebrand" G80 / 8-series GPUs so hated and dreaded by AMD fanboys.
    You may send nVidia a big thank you email for providing massive ongoing value for their entire user base with awesome driver updating, absolutely AWESOME.
  • Wreckage - Thursday, May 10, 2012 - link

    So AMD now has the 4th fastest card on the market. How far the mighty have fallen.
  • raghu78 - Thursday, May 10, 2012 - link

    You are out of your mind. In fact AMD's competitiveness in the most demanding games has increased. AMD's 4X AA / 8X AA performance in Batman: Arkham City has been fixed with the latest drivers. In fact, in Batman: Arkham City at 8X AA AMD is now faster. Check other websites like legitreviews. In Shogun 2, Nvidia's performance has been hit severely due to the latest patch / driver situation and it's clearly behind the HD 7970 at 2560 x 1600 and 5760 x 1080. In the most demanding games like BF3, Alan Wake, Crysis 2, Batman: Arkham City, The Witcher 2, Shogun 2, and Anno 2070, the HD 7970 will easily defeat the GTX 680, given that 1GHz - 1.1GHz HD 7970s (Sapphire HD 7970 Dual-X, MSI HD 7970 Lightning (1070), PowerColor PCS+ HD 7970 Vortex II (1100)) are available in retail.
    And not to forget the beautiful performance scaling of the HD 7970. Most users are running the HD 7970 OC editions at 1150MHz on stock voltage. Some are pushing it to 1250MHz with extra voltage. The HD 7970 is a true single-GPU beast with the raw power, memory bandwidth and size to handle the most demanding games of today and tomorrow.
  • maximumGPU - Thursday, May 10, 2012 - link

    I'm not sure where you get your info from. Most reviews I read placed the 680 above the 7970.

    In fact, since you mention legitreviews:

    " Our testing showed that this card did phenomenally well with DirectX 11 game titles and is currently the overall fastest graphics card for gaming.."
  • raghu78 - Thursday, May 10, 2012 - link

    http://www.legitreviews.com/article/1925/4/

    "For benchmark testing of Batman: Arkham City we disabled PhysX to keep it fair and ran the game in DirectX 11 mode with 8x MSAA enabled and all the image quality features cranked up. You can see all of the exact settings in the screen captures above. "

    Look at the Sapphire HD 7970 Dual X (48 fps) trample the GTX 680 (39 fps)

    And if you look at Metro 2033 (DOF with 4X MSAA): DOF uses compute shaders and is really demanding on bandwidth.

    http://www.legitreviews.com/article/1925/10/

    The Sapphire HD 7970 Dual X and MSI HD 7970 Lightning are 19% and 23% faster than GTX 680 respectively.

    In the most demanding games the HD 7970 shows its raw power. What's the point in looking at games doing 100 fps? It's for suckers like you who don't think at all. Unless you are into multiplayer combat, once you cross 60+ fps it doesn't make any difference in normal gaming scenarios.
  • CeriseCogburn - Thursday, May 10, 2012 - link

    In a single cherry-picked instance after a year of AMD driver failure, and likely a current cheat with driver chumming, the AMD card has magically tweaked a long loss into a single win - and suddenly you forgot the months on end we all suffered with our overpriced 7970 crapster crashing on every other game randomly.
    Way to go with PR fanboy passion.
    Now see this http://translate.google.pl/translate?hl=pl&sl=...

    Oh well, it was fun while it lasted for a single game. Just remember: don't turn up the tessellation, make sure to tweak the tessellation cheater slider in AMD's CCC bloated ad-serving pig, then keep your head held high as the random crashes and hours of reinstalling and fixing the newly discovered bugs from the latest driver keep you occupied. No PhysX - they turned it off in your proof game because AMD can't hang. ROFL
    "I'm AMD and I can't play - cripple the game so I can win, because I can't play with the eye candy turned on."
  • CeriseCogburn - Friday, May 11, 2012 - link

    Don't forget about the massive tearing with AMD GPUs that is not present on the nVidia implementations.

    http://hardforum.com/showthread.php?p=1038706634

    Then of course almost everyone now recommends the 680 in case of going to SLI in the future, because it's so much better than CF - including the reviewer above - so much more important than...

    an imaginary future 3GB lifetime that never comes, for a single outside-the-box benchmark explicitly set up and tweaked for AMD, with just the right settings and game, and endless hours searching for the combo to show a single instance... (I guess a couple years of AMD fanboys whacking away might find a big lie they can hope to offer)
  • Wreckage - Thursday, May 10, 2012 - link

    The first game you mention is BF3
    http://www.anandtech.com/show/5818/nvidia-geforce-...

    The 670 absolutely destroys the 7970.

    You are trying too hard to get into the AMD focus group. Just stick to the material they send you.
