449 Comments

  • PeckingOrder - Wednesday, June 29, 2016 - link

    What a massive F-up by AMD.
  • evolucion8 - Wednesday, June 29, 2016 - link

    Sure, cause AMD midrange GPUs are meant to replace previous-gen high-end GPUs lol
  • ddriver - Wednesday, June 29, 2016 - link

    In the chart, the AMD Radeon RX 480 (4GB) is listed as having 8 gigs of vram.
  • ddriver - Wednesday, June 29, 2016 - link

    I mean the "AMD Radeon GPU Specification Comparison" chart.
  • SunnyNW - Wednesday, June 29, 2016 - link

    Ryan Shrout of PCPer said every RX 480 actually has 8GB of memory on the board... like WTF... He further added that they sent a BIOS to switch the cards between 4GB and 8GB. I understand the artificial product segmentation that often happens in tech, but with the large numbers AMD hopes to sell that is A LOT of wasted memory! WOW, what a waste; they should have just had 8GB reference only and priced it 10 to 15 bucks less...
  • Drumsticks - Wednesday, June 29, 2016 - link

    This is only for the press. The retail 4GB cards have 4GB of VRAM, per the AMA on reddit.
  • akamateau - Wednesday, June 29, 2016 - link

    The 4GB and 8GB versions are currently being released.

    Amazon here:

    https://www.amazon.com/XFX-Radeon-Graphics-Cards-R...

    Newegg here:

    http://www.newegg.com/Product/Product.aspx?Item=N8...
  • akamateau - Wednesday, June 29, 2016 - link

    2 RX 480 CRUSHES GTX 1080 for $200 less!!!
  • basroil - Wednesday, June 29, 2016 - link

    "2 RX 480 CRUSHES GTX 1080 for $200 less!!!"

    That's if any motherboards even support that configuration! Tests have shown that the card:
    1) Draws closer to 165W, much higher than its supported maximum power draw of 150W
    2) ~80W of that is from the PCIe slot itself. Motherboards are only required to allow 75W for ALL PCIe slots. Either way it will overload the traces on cheaper boards.

    Put those together and you have nightmare fuel: either frying the mobo with >150W draw when even 2x1080 wouldn't hit 75W, or frying your PSU as the cards reroute power to the PCIe power cable and overload the 75W-capable 6-pins with double their rated amperage.
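    Putting rough numbers on it (a quick Python sketch using the ~165W total and ~80W slot figures quoted above, which are third-party measurements rather than official specs):

        PCIE_SLOT_LIMIT_W = 75.0   # PCIe CEM budget for a x16 slot
        SIXPIN_LIMIT_W = 75.0      # rated budget for a 6-pin aux connector

        measured_total_w = 165.0   # quoted total draw for one RX 480 under load
        measured_slot_w = 80.0     # portion quoted as coming through the slot
        measured_sixpin_w = measured_total_w - measured_slot_w

        for name, drawn, limit in [("PCIe slot", measured_slot_w, PCIE_SLOT_LIMIT_W),
                                   ("6-pin cable", measured_sixpin_w, SIXPIN_LIMIT_W)]:
            over = drawn - limit
            print(f"{name}: {drawn:.0f} W drawn vs {limit:.0f} W budget ({over:+.0f} W)")

        # Two cards in CrossFire would roughly double the slot-side draw,
        # which is the scenario being argued about here.
        print(f"Two cards, slot side: ~{2 * measured_slot_w:.0f} W vs "
              f"{2 * PCIE_SLOT_LIMIT_W:.0f} W of per-slot budget")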
  • binarydissonance - Wednesday, June 29, 2016 - link

    Or both the mobo and the PSU are supplying the same voltage and the power input is combined into a single bus... y'know... preventing the unlikely scenario you describe from ever possibly happening.
  • basroil - Thursday, June 30, 2016 - link

    "Or both the mobo and the PSU are supplying the same voltage and the power input is combined into a single bus... y'know... preventing the unlikely scenario you describe from ever possibly happening."

    1) The two do NOT have the same voltage. Ideally they do, but that's not how things actually work in practice.
    2) The folks at tomshardware did bus-level analysis of power draws and put their results into their review. Their tests for various cards will prove to you that power draw can indeed be shifted toward either the PCIe slot or the power cable and is not 50-50 like you claim.
    3) Even assuming that your point was valid (which it most certainly is NOT), it wouldn't change the fact that a single card already draws more power from the PCIe slot than allowed by the PCIe specifications, and that two cards will be far more than the specs allow (double the spec for PCIe 3.0)
  • schulmaster - Thursday, June 30, 2016 - link

    Lol. The PSU is the source for all board power AND PCIe aux. The board design and PSU will negotiate how much 12V power is reliably sourced from the 24-pin. A 6-pin PCIe aux is rated for an additional 75W, and that limit could be down to the cable itself, let alone the card interface and/or the PSU. Even high-end OC boards have a supplemental molex connector for multi-GPU configs to supplement available bus power, which is the burden of the 24-pin. It is not outlandish to have concern if a single RX 480 is overdrawing from the entire PCIe bus wattage allotted in the spec, especially when the fallback is a PCIe 6-pin already being overdrawn from as well. Tomshardware was literally unwilling to do further multi-GPU testing due to the numbers they were physically seeing, not paranoia.
  • pats1111 - Thursday, June 30, 2016 - link

    @binarydissonance: Don't confuse these fanboys with the facts, they're NVIDIA goons, it's a waste of time because they are TROLLS
  • AbbieHoffman - Wednesday, June 29, 2016 - link

    Actually most motherboards support CrossFire. There are many that support only CrossFire, because it is cheaper to support CrossFire than SLI.
  • Gigaplex - Thursday, June 30, 2016 - link

    But they don't support the excessive power consumption on the PCIe bus, which is a specification violation.
  • jospoortvliet - Monday, July 4, 2016 - link

    Luckily every motherboard, except for cheap ones that are quite old, can easily handle 100+ watts over the PCIe port, as any overclocking would need that too.
  • beck2050 - Thursday, June 30, 2016 - link

    I just laugh when I see people talking about Crossfire
  • fanofanand - Thursday, June 30, 2016 - link

    "when even 2x1080 wouldn't hit 75W"

    Your post is so full of FUD it should be deleted.
  • basroil - Thursday, June 30, 2016 - link

    "Your post is so full of FUD it should be deleted. "

    I'm not responsible for your ignorance. Check tomshardware /reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html and you'll see I'm right
  • fanofanand - Thursday, June 30, 2016 - link

    I checked, you are wrong. Stop spreading FUD, you Nvidiot.
  • D. Lister - Thursday, June 30, 2016 - link

    @fanofanand

    What he is saying is that the total power draw of the 2x 1080 from the motherboard is less than 75W, because they take most of the power from the 8-pin connectors, which is true. The same statement is also true for the Radeon 290X, a GPU well-known for massive power use. But even 2x 290/290Xs don't draw so much from the board.

    Please refrain from abusive language and think before you post, because it only puts you, your argument and your brand of choice, in a negative light.
  • fanofanand - Friday, July 1, 2016 - link

    You are choosing to scold me for calling him out on his word choices that portray an inaccurate statement? Of all the people on here and all the things they say? Mr. Lister, kindly F off.
  • D. Lister - Saturday, July 2, 2016 - link

    :) well, no one who has ever scolded me used the word "please". Anyhow, you misunderstood the guy, didn't conduct any research in the matter, and instead of replying with an opposing argument/reference you asked for his post to be deleted and then called him an idiot. Acting like that only serves to push you into a weaker position, and drags AT down to the level of places like wccftech. Anyway, I will now kindly F off.
  • Murloc - Tuesday, July 5, 2016 - link

    lol what he says puts "his brand of choice" in a negative light

    guys please get a grip
  • D. Lister - Tuesday, July 5, 2016 - link

    Yeah, complicated, but you'll figure it out once you're done laughing out loud. :)
  • Mode+ - Thursday, June 30, 2016 - link

    A standard 12V 6-pin connector is rated up to 192W without the HCS terminals. The specification is for 150W. It's a guideline, not a rule.
  • TheinsanegamerN - Thursday, June 30, 2016 - link

    And my car is rated to go up to 120MPH. Doesn't mean that it will work well for long at that speed.
  • Rumpeltroll80 - Monday, July 4, 2016 - link

    That is why many are releasing the RX 480 with an 8-pin instead of a 6-pin: 150W on an 8-pin, only 75W on a 6-pin.
    So the problem seems to be solved then, right? Here is one card with the change
    http://www.tweaktown.com/news/52878/sapphires-upco...
  • komplik - Thursday, June 30, 2016 - link

    Really? https://www.techpowerup.com/reviews/AMD/RX_480_Cro...

    On average... slower than a single 1070
  • erple2 - Friday, July 1, 2016 - link

    I didn't see that from the charts - I saw it to be faster. Personally, I ignored resolutions below 1440p, as buying 2 of these for those resolutions seems pointless.

    In the aggregate including games where there was no crossfire scaling, the 1070 was competitive, and slightly faster.
  • Flunk - Thursday, June 30, 2016 - link

    The Crossfire reviews I've read have said the GTX 1070 is faster on average than RX 480 Crossfire, maybe you should go read those reviews.
  • Murloc - Tuesday, July 5, 2016 - link

    comparing crossfire/sli to a single gpu is really useless. Multigpu means lots of heat, noise, power consumption, driver and game support issues, and performance that is most certainly not doubled on many games.

    Most people want ONE video card and they're going to get the one with the best bang for buck.
  • R0H1T - Wednesday, June 29, 2016 - link

    For $200 I'll take this over the massive cash grab i.e. FE obviously!
  • Wreckage - Wednesday, June 29, 2016 - link

    Going down with the ship eh? It took AMD 2 years to compete with the 970. I guess we will have to wait until 2018 to see what they have to go against the 1070
  • looncraz - Wednesday, June 29, 2016 - link

    Two years to compete with the 970?

    The 970's only advantage over AMD's similarly priced GPUs was power consumption. That advantage is now gone - and AMD is charging much less for that level of performance.

    The RX480 is a solid GPU for mainstream 1080p gamers - i.e. the majority of the market. In fact, right now, it's the best GPU to buy under $300 by any metric (other than the cooler).

    Better performance, better power consumption, more memory, more affordable, more up-to-date, etc...
  • stereopticon - Wednesday, June 29, 2016 - link

    are you kidding me?! better power consumption?! it's about the same as the 970... it used something like 13 fewer watts while running Crysis 3... if the GTX 1060 ends up being as good as this card for under 300 while consuming fewer watts I have no idea what AMD is gonna do. I was hoping for this to have a little more power (more along the 980) to go inside my secondary rig.. but we will see how the 1060 performs.

    I still believe this is a good card for the money.. but the hype was definitely far greater than what the actual outcome was...
  • adamaxis - Wednesday, June 29, 2016 - link

    Nvidia measures power consumption by average draw. AMD measures by max.

    These cards are not even remotely equal.
  • dragonsqrrl - Wednesday, June 29, 2016 - link

    "Nvidia measures power consumption by average draw. AMD measures by max."

    That's completely false.
  • CiccioB - Friday, July 1, 2016 - link

    Didn't you know that when using AMD HW the watt meter switches to "maximum mode", while when applying the probes on nvidia HW it switches to "average mode"?

    Ah, ignorance, what a funny thing it is
  • dragonsqrrl - Friday, July 8, 2016 - link

    @CiccioB

    No I didn't, source? Are you suggesting that the presence of AMD or Nvidia hardware in a system has some influence over metering hardware use to measure power consumption? What about total system power consumption from the wall?

    At least in relation to advertised TDP, which is what my original comment was referring to, I know that what adamaxis said about avg and max power consumption is false.
  • Drumsticks - Wednesday, June 29, 2016 - link

    I have heard this is true, but what does Anandtech (and many others) measure by? Most reviews show the 480 and the 970 very close in power draw.

    Polaris definitely appears to be a little bit more efficient than Maxwell, but likely a good bit less efficient than Pascal. It's definitely better than the 3XX vs Maxwell days though.
  • looncraz - Wednesday, June 29, 2016 - link

    Compared to the 970, the RX 480 has better power consumption for the performance, yes. Not by much, but that value should increase with AIB versions as the reference runs rather warm and has a less than efficient VRM.

    Only time will tell how that consumption changes with clock speeds and how much overclocking can be genuinely achieved from the chip, but at this point, stock vs stock, the RX480 is a better buy than the 970.

    The 1060 is another matter altogether. It will certainly use less power and clock better than the RX 480, so we can assume that it will perform better as well - with nVidia simply tweaking its default clocks to best the RX 480. Still, it looks like the 1060 will have only 3GB or 6GB of RAM, so the cheaper RX 480 will fare better in RAM-heavy scenarios than the cheaper 1060.
  • just4U - Wednesday, June 29, 2016 - link

    There are no guarantees on the 1060.. especially if it's 128-bit. The 960 was a little on the disappointing side, especially considering what you had to pay for it. (I know.. I own a 960 4G and a 380..)
  • fanofanand - Thursday, June 30, 2016 - link

    I wouldn't know how disappointing the 960 was, seeing as Anandtech never thought it worthy of review. *zing*
  • gnawrot - Wednesday, June 29, 2016 - link

    I am pretty sure that AMD can deliver Vega in 2017 if they want to. In the meantime they will release their server CPUs (a more profitable market). They cannot launch that many products at the same time.
  • Michael Bay - Wednesday, June 29, 2016 - link

    Their server CPUs which nobody even wants. This particular battle was lost a decade ago.
  • Domaldel - Thursday, June 30, 2016 - link

    Not really if what we've heard about Zen is correct.
    32 cores and 64 threads with each core roughly matching Broadwell clock for clock at a lower price point?
    While all of that could of course be bull there's also potential here for some serious disruption in the CPU market.
    And Zen is probably a higher priority for AMD than the GPU market as a whole.
  • Gigaplex - Thursday, June 30, 2016 - link

    I doubt they'll match Broadwell clock for clock. They'll probably fall in the Nehalem to Sandy Bridge range. That should be enough though to cause some disruption in the CPU market if the price is right.
  • mikato - Friday, July 29, 2016 - link

    So are you saying, Michael Bay, that if they released a CPU with competitive or better price-performance-efficiency than Intel's then it wouldn't be successful? It sounds like that is what you are saying.
  • r@ven - Wednesday, June 29, 2016 - link

    What are you, stupid? The R9 390 was already out in time to compete with the 970. And it's even faster at 1440p.
  • just4U - Wednesday, June 29, 2016 - link

    yeah.. competes with a 970 on some games.. and the reference 980 on others. Was a good card.
  • akamateau - Wednesday, June 29, 2016 - link

    Seriously?

    2 RX 480 in Crossfire CRUSHES GTX 1080!!!!

    rtflol
  • AntDX316 - Thursday, June 30, 2016 - link

    The hype was fake.

    I mean honestly releasing a 14nm flagship slower than their previous gen is a step in the wrong direction. I wouldn't be surprised if they just release an $800 version of the 14nm in early 2017 with massive power. They just need more time to get their fabrication correct. I assume there could be some unforeseen problems, and if the problems do arise with the $200 version it won't leak into the common video card world. It would be kept rare and quiet so that stuff can be fixed for their $800 flagship.
  • slickr - Thursday, June 30, 2016 - link

    This isn't a flagship card, you moron! This is a mid-range mainstream card, created specifically for the mass market. Their flagship card is coming in 2017; it's Vega and it's got HBM2, a matured 14nm process, 4000+ stream processors, etc...
  • Yojimbo - Thursday, June 30, 2016 - link

    It's not a flagship card. They don't have a new flagship card so they tried to hype their mainstream card with "This is what you really want/need!" The recent trend is for gamers to buy more expensive cards, not cheaper ones, though, so in my opinion unless the economy tanks it's a bad strategy.

    If it performed 20% better or they sold it for $160 instead of $200 it would be all the things they tried to hype it as being. But as it is, it just looks like the first 14/16nm mainstream card to market. No more, no less. It's a solid card but it'll never be that impressive to launch a mainstream card that slots right into what the competition will offer in 1 or 2 months while leaving the rest of the market uncovered.
  • Gigaplex - Thursday, June 30, 2016 - link

    Next you'll be telling us that the NVIDIA 750 Ti was the Maxwell flagship. It came first, that does not make it a flagship.
  • ihatenividiaastheyareaholes - Saturday, July 9, 2016 - link

    Here's a thought: how about people stop arguing about this and wake the fuck up to what is going on, that being the consoles trying to knock PC off its glorious pedestal
  • IronTed - Wednesday, June 29, 2016 - link

    You sir are a moron.
  • slickr - Thursday, June 30, 2016 - link

    The 390 easily beats the gimped and fraudulent 970 3.5GB trash for LESS money. The trash 970 still costs around $300, in extremely rare cases $280, for only 3.5GB.

    The RX 480 costs $200 or $240, consumes less, has full DX12 support, and it's cooler, quieter, and overclocks a lot more.
  • sonicmerlin - Friday, July 1, 2016 - link

    The reference cards don't overclock at all. If you want an AIB card expect to pay significantly more.
  • cjpp78 - Thursday, July 7, 2016 - link

    weird, my reference XFX RX 480 Black Edition came stock overclocked to 1328MHz. I've seen others clock reference to 1350+. It's not a good overclocker but it will overclock
  • Demibolt - Friday, July 1, 2016 - link

    Not here to argue, just fact checking.

    Currently, there are brand new GTX 970 (OC versions, not that it matters) available for $240.
  • hans_ober - Wednesday, June 29, 2016 - link

    A bit disappointing, Perf/W is around GTX 980/970 levels = Maxwell. Not near Pascal.

    For the price? It's a good deal, but Perf/$ isn't as high as it was hyped to be.
  • PeckingOrder - Wednesday, June 29, 2016 - link

    It's a terrible product. Look at the temps.

    http://media.bestofmicro.com/C/E/591422/original/0...
  • figus77 - Wednesday, June 29, 2016 - link

    No one will buy one with the stock cooler. So why bother with it?
  • Namisecond - Wednesday, June 29, 2016 - link

    I was going to pick one up today (stock model) at my local Microcenter, but the temps and noise are making me pause until I get more verification on that... might as well wait for the 1060/Ti that might be announced soon...
  • TimAhKin - Wednesday, June 29, 2016 - link

    And the card is not meant for 4K. No card stays cool or at decent temps like 60 degrees, in 4K.
  • JoeyJoJo123 - Wednesday, June 29, 2016 - link

    My GTX 970 does in 4K on Warframe. I disable AA (not needed on a 4K 24" screen) and other post-processing and lighting effects which get in the way of identifying enemies and allies quickly and efficiently.

    Speak for yourself. I'm getting tired of this whole "current gen video cardz can't HANDLE 4k!". At least define 4k as 4K resolution at maxed out settings on current gen AAA game titles, because at that point, you'd be correct. But just saying 4k can't be handled period is completely stupid and false.
  • eek2121 - Wednesday, June 29, 2016 - link

    People do that all the time. When Xbox Scorpio was announced everyone was saying that there was no way it could do 4K, yet few are paying attention to the fact that xbox one games are designed around a slower version of the r7 260x. As an example, there is no doubt in my mind that the RX480 could easily scale these games to 4k.
  • TimAhKin - Wednesday, June 29, 2016 - link

    Do you understand my comment?
    This card(480) is not meant for 4K gaming and that temperature from the above picture is normal. I get 85-90 degrees as well in 4K.
  • FriendlyUser - Wednesday, June 29, 2016 - link

    Warframe is very, very light on the GPU. I get ~100fps at 1440p with a much older card and almost everything maxed. Try Witcher 3 for a challenge at 4k.
  • Murloc - Tuesday, July 5, 2016 - link

    I can play age of empires 2 @4K on a gtx 275 get on my level
  • Questor - Wednesday, June 29, 2016 - link

    Bandwagon much? One picture and you are already condemning a product that hasn't had a fair chance. You hurt yourself in the long run when you subscribe to bandwagon jumping by spreading fanboy-ship; opinions not based on clear factual completeness, but rather on a possible detractor that is as yet unproven across the entirety of the product line. Competition serves all of us. It brings prices more under control and forces innovation.
  • mikato - Friday, July 29, 2016 - link

    "It's a terrible product. Look at the temps."

    I don't know about everyone else, but I don't buy my GPUs based on thermal images and point temps. Amirite?
  • poohbear - Wednesday, June 29, 2016 - link

    How is it a bit disappointing??? do you really think most of us are running GTX970s?? The vast majority of people have gtx950 class cards, and this would be a nice step up considering the price.
  • sharath.naik - Wednesday, June 29, 2016 - link

    It's disappointing because the 970 can overclock 10-15%, sometimes more. You need to look at the thermals to understand that these are essentially already overclocked from the factory and cannot do more.
  • smartthanyou - Wednesday, June 29, 2016 - link

    In no situation will a 10-15% overclock ever produce a performance difference that an end user would notice. In a benchmark application? Sure, numbers will increase but frames in a game will not increase to a point to make a difference.

    Overclocking 10-15% in almost all cases is pointless.
  • FriendlyUser - Wednesday, June 29, 2016 - link

    All these are reference and have zero electrical margin for overclocking. Reviews have shown that the board uses all the juice it can and is almost constantly at the limit (or over) of the PCIe slot power delivery! You will only be able to judge overclocking on cards with more complicated designs. The chip itself is probably quite variable, being the first of the 14nm AMD generation. Some will overclock well, others won't.
  • wumpus - Wednesday, June 29, 2016 - link

    Last I heard, the 970 was at the top of the Steam surveys (I won't enable whatever kludge they wanted to find out). It isn't a bad goal, but my confidence in AMD shipping it to Newegg faster than nvidia can ship an as-yet hypothetical 1060 isn't all that great. Assuming they do, it doesn't really mean they have a long window of "the card to buy at ~$200".

    A bigger worry is how many of those 970s are going to be hitting the market. Until AMD can claw back some marketshare, there could easily be a used 970 for every new 480 buyer out there. And this is coming from someone who had been assuming that I would get a 390 (or two) and DIY some watercooling for an ideal VR rig (before prices skyrocketed. I'm guessing lose the watercooling and go with nvidia once both VR and nvidia 14nm prices come back to Earth). This card isn't helping AMD all that much.
  • lunarmit - Wednesday, June 29, 2016 - link

    It is, but it's just one card. Add in 1% for the 980, and 1% for the 980 Ti, and ~90% of the cards are powered below that once you factor in the AMD comparables.
  • Demibolt - Friday, July 1, 2016 - link

    Not here to argue, just fact checking.

    GTX 970 is currently in the top 3 cards used by Steam users (according to Steam's reports).
    The GTX 970 can currently be purchased for ~$240 from several online retailers.
  • TimAhKin - Wednesday, June 29, 2016 - link

    The 480 was never meant to compete with Pascal.
    This card is for people that want to upgrade from older GPUs. It's a pretty cheap card that's good for 1080p and VR.
  • Fallen Kell - Wednesday, June 29, 2016 - link

    This is a horrible card for VR. It can barely do 60 FPS at 1080p, which isn't even in the same ballpark of doing the 90-120 FPS at 2160x1200 that is needed for VR. The card would need to be 3-4x higher performance than it currently is to pull off VR.
  • Meteor2 - Wednesday, June 29, 2016 - link

    No, you need 90 fps minimum for VR and the 480 does it at the resolutions used by the Rift and Vive just fine.
  • DigitalFreak - Wednesday, June 29, 2016 - link

    The 480 does 90fps @ 2160x1200? WTF are you smoking?
  • Fallen Kell - Thursday, June 30, 2016 - link

    I know, seriously. I don't know what Meteor2 or TimAhKin are smoking, as it is obviously the fanboy weed. There is only a single game that was benchmarked at 1080p where the 480 could even hit 90+ fps, and everything else was between 40-70 fps. And VR has 25% more pixels than 1080p.

    I mean, if the 480 was getting 120-130 fps at 1080p, then I would agree that it might be a decent VR card. But it averages about 1/3rd that on current AAA titles (or less with the highest video quality settings enabled). This is why I said it needed to be 3-4x higher performance than what it is to do VR. Anyone using this for VR will have to turn the video quality WAY down, which means you will see much more item/object popping due to object draw distances being so low, which destroys the sense of immersion that VR is attempting to create in the first place. Add in the slideshows you will get during effects-heavy scenes and get ready for the vomit comet to commence due to the eyes not seeing the results of your head moving.
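    To put very rough numbers on it (a quick Python sketch of raw resolution x frame rate only; it ignores settings differences and the headset supersampling overhead, so treat it as a ballpark lower bound):

        def mpix_per_s(width, height, fps):
            return width * height * fps / 1e6

        vr_target   = mpix_per_s(2160, 1200, 90)   # Rift/Vive panels at 90 Hz
        p1080_at_45 = mpix_per_s(1920, 1080, 45)   # mid-point of the 40-70 fps quoted above
        p1080_at_60 = mpix_per_s(1920, 1080, 60)

        print(f"VR target:      {vr_target:.0f} Mpix/s")
        print(f"1080p @ 45 fps: {p1080_at_45:.0f} Mpix/s (needs ~{vr_target / p1080_at_45:.1f}x to reach VR)")
        print(f"1080p @ 60 fps: {p1080_at_60:.0f} Mpix/s (needs ~{vr_target / p1080_at_60:.1f}x to reach VR)")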
  • Meteor2 - Thursday, June 30, 2016 - link

    I'm going by the Oculus benchmarks. You know, the ones by the people who make the VR headsets.
  • killeak - Thursday, June 30, 2016 - link

    Well, they said that this would be an entry-level VR card, and I take that to mean using low settings but hitting 90fps at VR resolution, and on that basis I think it's possible to consider the RX 480 (just as the 970) an entry-level VR card.

    Now, if we are talking high/ultra settings, then no, it won't make it.
  • pashhtk27 - Thursday, June 30, 2016 - link

    And for 90fps on high settings, you have 1080. You sure should smoke cash.
  • Yojimbo - Thursday, June 30, 2016 - link

    You are looking at benchmarks with all settings maxed out. The RX 480 is entirely capable of running 90fps @ 2160x1200 with settings selected more judiciously.
  • TheinsanegamerN - Thursday, June 30, 2016 - link

    So it can't do 60FPS constant at 1080p, but it CAN do 90FPS constant at 2160x1200? Did you fail math?
  • Sushisamurai - Friday, July 1, 2016 - link

    I think the point of yojimbo's post is that it should be able to hit 90fps @2160x1200 at medium to low settings. It can't hit 60FPS at ultra high settings at 1080p
  • Yojimbo - Friday, July 1, 2016 - link

    Yes exactly. Ironically, math is my area of expertise.
  • cocochanel - Thursday, June 30, 2016 - link

    You must be smarter than the engineers at AMD. They said this card was designed for VR; they would not make such a claim if the card did not deliver. 3-4x higher performance? Where do you live?
  • CiccioB - Friday, July 1, 2016 - link

    In a world where marketing claims turn out to be false most of the time.
    Wasn't Polaris 10 going to have a 2.5x efficiency gain vs GCN? An AMD engineer said that as well, and even put it on a slide.
    I just saw a 40% gain. Meanwhile Pascal gained more than 60% over Maxwell, which was already 40% better than GCN.
    If an AMD engineer tells you that this card can fly, would you rev the fan up far enough to test that claim? You know, an engineer has told you that it can! And it was an AMD engineer, nothing less!
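    Stacking those ballpark figures (a quick Python sketch; these are the rough percentages quoted above, not measured data):

        gcn_baseline  = 1.00
        maxwell       = gcn_baseline * 1.40   # Maxwell quoted as ~40% better perf/W than GCN
        pascal        = maxwell * 1.60        # Pascal quoted as >60% better than Maxwell
        polaris_seen  = gcn_baseline * 1.40   # the ~40% gain observed for Polaris 10
        polaris_claim = gcn_baseline * 2.50   # the advertised ~2.5x perf/W target

        print(f"Maxwell vs GCN:        {maxwell:.2f}x")
        print(f"Pascal vs GCN:         {pascal:.2f}x")
        print(f"Polaris 10 (observed): {polaris_seen:.2f}x")
        print(f"Polaris 10 (claimed):  {polaris_claim:.2f}x")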
  • FriendlyUser - Wednesday, June 29, 2016 - link

    Perf/W would be much better if they had used GDDR5X, which they did not, for cost reasons. HBM is even more power efficient. Then you have the board itself, which probably is not as electrically sophisticated as the much more expensive nVidia 1080 board. Finally, you don't know which of the two process technologies is better for perf/W (two different foundries). In the end, I don't think the chip design is the main difference.
  • Yojimbo - Thursday, June 30, 2016 - link

    The RX 480's perf/W is really no better than the GTX 970's, which uses GDDR5 RAM like the RX 480 as well as a 28nm process compared with the 14nm process of the RX 480. I do think the architecture is the main difference. Polaris 10's architecture seems to be significantly less efficient than Maxwell's, after accounting for the advantage of the 14nm process of the RX 480. Pascal is even more efficient architecturally than Maxwell.
  • Meteor2 - Wednesday, June 29, 2016 - link

    The 1080/1070 take the performance/power crown. But the 480 comfortably takes the performance/price crown. What's interesting is that the 1080 isn't quite fast enough for AAA titles at 4K and the 1070 sits in no man's land, while the 480 runs AAA titles at 1080p and does VR. It's clear which option is the solid buy.
  • Yojimbo - Thursday, June 30, 2016 - link

    Yes the RX 480 will take the performance/price crown assuming supply can keep up with demand, but for how long? The GTX 1060 will be out in a month or two and be very competitive in price/performance.

    The 1080 is fast enough for AAA titles at 4K if one doesn't max out the settings. A similar thing can be said for the 1070. Also similar is the RX 480's VR claim: it can only manage VR gaming when settings are not maxed out. Are you a console gamer or do you just have selective memory? This paragraph should be redundant for a PC gamer.
  • Demibolt - Friday, July 1, 2016 - link

    Not here to argue, just fact checking.

    GTX 970 can be purchased for ~$240 from several online retailers (less if you get a used one from eBay). Given the close performance figures between the 2 cards and the inevitable price drop that will happen with the GTX 970, it is objectively too soon to say the price/performance benefit of one card beats out the other.
  • Chris A. - Wednesday, June 29, 2016 - link

    To replace my 7850 on my 1080p monitors maxed out at 60 Hz, it's an absolute winner.
  • proxopspete - Wednesday, June 29, 2016 - link

    That's where I am... just need a non-stock cooler
  • catavalon21 - Wednesday, June 29, 2016 - link

    I too have the 7850. I still would like to see some basic compute numbers for both, but yeah, this would handily fill the role for gaming...
  • SunnyNW - Wednesday, June 29, 2016 - link

    Others with 7850s...Did you notice that the FPS numbers for the 7850 seemed a little on the low side... Going back and testing I easily get around 30 fps at 1080p in the games where it was showing in the 20s. Does the choice of OS make a performance difference? Between Win 7/8/10?
  • Laxaa - Thursday, June 30, 2016 - link

    Same here. Now I'm debating if it's wise to go with the 8GB version, or if I should spring for the 4GB one and save money. I'll eventually get a 4K display, but that will mostly be because of work.
  • HollyDOL - Wednesday, June 29, 2016 - link

    eww, that's not really stellar; given the charts, the GTX 1060 will now likely have it for breakfast...
  • D. Lister - Wednesday, June 29, 2016 - link

    If I were Nvidia, at this point I would probably take half a 1070, factory-OC the bejeezus out of it, add 6GB gddr5, slap on a $150 MSRP and call it a day. :p
  • Chris A. - Wednesday, June 29, 2016 - link

    Remember that die size on the 1070 is 40-50% larger than the RX480, so their margin is going to be smaller to reach that price point.
  • Yojimbo - Thursday, June 30, 2016 - link

    Yeah there's no reason to use GP104 when they have GP106 for that purpose.
  • Ananke - Wednesday, June 29, 2016 - link

    NVidia has never in its history "slapped" on $150 when they can put on a $300 price tag. At best, whenever such a thing as a GTX 1050 happens, it may be around the $200 mark for half of this performance. NVidia will never cannibalize its prices; they sell their 1070/1080 with a markup easily anyway. There is no reason for them not to have a markup on the 1060/1050 as well.
  • FriendlyUser - Wednesday, June 29, 2016 - link

    I don't think nVidia wants to compete on price. They'll probably present something equivalent or even marginally (5%) better so that they can "win" then sell it at a significantly higher price point. The price range from $239 to the $400+ of the 1070 has no next-generation products. I'm guessing something will quickly populate the $300-320 price point. Could be wrong, but would make more sense than going to $150.
  • cocochanel - Thursday, June 30, 2016 - link

    I doubt Nvidia can match this card. If they come up with one, they'll have to sell it at a loss.
  • sonicmerlin - Friday, July 1, 2016 - link

    You think Nvidia can't match their 2 year old 970 with 2 node jumps?
  • Sushisamurai - Friday, July 1, 2016 - link

    That's a good point. I don't think they'll be able to match it, as there's a large R&D cost for developing a new chip and board on process jumps. The fact AMD can recoup those costs @ $200/240 is pretty nuts
  • Questor - Wednesday, June 29, 2016 - link

    Amazing thought comparison to a card that doesn't exist yet. You must be in touch with your inner god.
  • HollyDOL - Wednesday, June 29, 2016 - link

    Well, maybe if you actually read what I wrote before you started seeing red, you would notice the word LIKELY :p
  • raazman - Wednesday, June 29, 2016 - link

    Don't worry, it's coming.
  • fuicharles - Wednesday, June 29, 2016 - link

    Provided there is stock available.
  • HollyDOL - Wednesday, June 29, 2016 - link

    hehe, touche :-)
  • dragonsqrrl - Wednesday, June 29, 2016 - link

    I doubt the 1060 will "have it for breakfast", but based on these results I now have little doubt that a card based on a fully enabled GP106 will be performance competitive with the RX480. Rumors and leaks prior to launch suggested that RX480 would trade blows with the 390X, which made me think the 1060 would probably perform a step below it.
  • Yojimbo - Thursday, June 30, 2016 - link

    Why wouldn't you use data for NVIDIA's GPUs to try to determine the GTX 1060's performance rather than use data from AMD's GPUs? The experience from the 700 series and the 900 series implies that, assuming that the 1060 has two GPCs (half that of the 1080) it should be about 20% faster than the 970 in DX 11 games and so about 20% faster than the RX 480. Pascal seems to be doing better in DX 12 than Maxwell, so it may end up being close to 20% faster than RX 480 in DX 12 games, too.
  • dragonsqrrl - Thursday, June 30, 2016 - link

    I'm not using AMDs GPUs to determine the performance of the 1060, I'm using the 1070 and 1080. What I was trying to say in my previous comment was that I've assumed roughly 50% 1080 performance (or around the 970) for the 1060. The RX480 leaks prior to launch suggested 390X-like performance, which led me to believe the 1060 would probably perform a step below it. Apparently the leaks were a bit exaggerated, so I now think the 1060 will be more competitive against the RX480 than I did before.

    I'm actually curious why your estimate is so different. Am I missing something?
  • Yojimbo - Thursday, June 30, 2016 - link


    OK, first let's look at the 900 series. The GTX 980 was released on September 18, 2014 for $549. It has a 2048:128:64 configuration @ 1126 MHz base clock for 4612 SP throughput. The GTX 970 was released on September 18, 2014 for $329. It has a 1664:104:56 core configuration @ 1050 MHz for 3494 SP throughput. The GTX 960 was released January 22, 2015 for $199. It has a 1024:64:32 core configuration @ 1127 MHz base clock for 2308 SP throughput.
    Relative: performance - 980 is 1.32 times 970, price - 980 is 1.67 times 970. performance - 980 is 2 times 960, price - 980 is 2.76 times 960. performance - 970 is 1.51 times 960, price - 970 is 1.65 times 960.

    Now the 10 series. GTX 1080 was just recently released and will presumably be available soon for $599. It has a 2560:160:64 configuration @ 1607 MHz for SP throughput of 8228. GTX 1070 was just recently released and will presumably be available soon for $379. It has a 1920:120:64 configuration @ 1506 MHz for SP throughput of 5783. Now the GTX 1060 is rumored to have a 1280:80:48 configuration. It will probably have a clock very close to the 1080 judging by the 900 series clocks. That would give it an SP throughput of 4114. Relative: performance - 980 is 1.42 times 1070, price - 1080 is 1.58 times 1070. speculative: performance - 1080 is 2 times 1060. performance - 1070 is 1.41 times 1060.

    Now the GTX 1070 has SP throughput that is 25% more than the GTX 980. It performs 20% to 40% faster than the 980 (in DX11 games; more in DX12 games), averaging more than 30% faster. 4114 SP throughput for the GTX 1060 would give it 18% more than the GTX 970. It should then average about 25% faster in DX11 games, and so more than 20% faster than the RX 480.

    Now, I know that you were only interested in how I got the performance numbers for the 1060, but I decided to include an argument for pricing as well while I was at it:

    The 1070 has a performance/price ratio of 1.11 wrt the 1080. The 970 has a performance/price ratio of 1.27 wrt the 980. The 960 has a performance/price ratio of 1.38 wrt the 980. The 960 has a performance/price ratio of 1.09 wrt the 970. You can see the 1070 is priced a lot closer to the 1080 compared to the 970's price relative to the 980, despite the 970 being closer to the 980 in performance compared with the relative performance of the 1070 and 1080. The question is why is this the case? Does it have something to do with the defect density of the 14nm node, or something else? My guess is it has to do with the success of the 970 and the amount of competition from AMD in the space the cards occupy. The 970 was enormously successful, and NVIDIA wants to push up the average selling price of the replacement card if they can, in order to tap into the prior success of the x70 card. Additionally, when the 980 and 970 were released, AMD had more competition for the 970 than for the 980. Therefore the 980 could be priced relatively higher. Now AMD does not really have competition for either the 1080 or the 1070, allowing both those cards to be priced higher. The 1060, however, faces competition. Therefore I think that the expected pricing of the 1060 would be to remain close to the relative price/performance ratio of the 960 wrt the 980, a card with competition compared with a card without much competition, rather than remain closer to the relative price/performance ratio of the 960 wrt the 970, which were both cards with competition. If we divide the price ratio of the 980 to 960 by the performance ratio of the 980 to 960 we get 2.76/2 = 1.38. This represents a conversion factor that will convert relative performance to relative price, under the assumption that the relative price/performance ratio of the 980 to 960 also holds for the 1080 to 1060. Since I speculated that the 1080 will have 2 times the performance of the 1060, the 1080 would then cost 2.76 times the 1060 under these assumptions. Since the 1080 costs $599, the 1060 would be expected to cost about $217.

    $217 obviously leaves quite a bit of wiggle room for upward pricing pressure on the 1060, such as it falling closer in line with the 1070 price for whatever reason, and still be well below the $300 that many seem to be claiming. But the point is that a $220 GTX 1060 performing 25% faster than the GTX 970 is well within the range of reasonable expectations given the recent historical data of NVIDIA's cards. If anything the GTX 1080 has even less competition than the GTX 980 had, suggesting the conversion factor might actually be greater (But I doubt it. The 1080 can't be found for the $599 at the moment and part of the reason for that is that the 1080 doesn't have any competition. That larger conversion factor is factored into the actual real-world prices but probably not the MSRPs.) So the RX 480 seems to exert no more pricing pressure on the GTX 1060 than AMD's offerings exerted on the GTX 960 when the GTX 960 was released.
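    The core throughput arithmetic above, as a quick Python sketch (SP throughput = shaders x 2 FLOPs per clock for FMA x base clock; the GTX 1060 row uses the rumored 1280-shader configuration and an assumed ~1.6 GHz clock, so treat it as speculative):

        def gflops(shaders, clock_ghz):
            # single-precision throughput, assuming 2 FLOPs per shader per clock (FMA)
            return shaders * 2 * clock_ghz

        cards = {
            "GTX 980":  (2048, 1.126),
            "GTX 970":  (1664, 1.050),
            "GTX 960":  (1024, 1.127),
            "GTX 1080": (2560, 1.607),
            "GTX 1070": (1920, 1.506),
            "GTX 1060": (1280, 1.607),   # rumored shader count, assumed clock
        }

        for name, (shaders, clock) in cards.items():
            print(f"{name}: ~{gflops(shaders, clock):.0f} GFLOPS")

        # Ratios used in the argument above
        print("1070 vs 980:", round(gflops(1920, 1.506) / gflops(2048, 1.126), 2))  # ~1.25
        print("1060 vs 970:", round(gflops(1280, 1.607) / gflops(1664, 1.050), 2))  # ~1.18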
  • dragonsqrrl - Thursday, June 30, 2016 - link

    "Relative: performance - 980 is 1.42 times 1070, price - 1080 is 1.58 times 1070. speculative: performance - 1080 is 2 times 1060. performance - 1070 is 1.41 times 1060."

    There's something wrong here. If the 1060 is roughly equal to the 980, and the 1080 is 2x the performance of the 1060, the 1080 would also have to be 2x the performance of the 980, which it isn't. I'm not exactly sure where the ratios or logic went wrong, but there's clearly an inconsistency there. The 1080 is about 1.65x the performance of the 980, and about 1.95x the performance of the 970. I'm not using theoretical SP performance, I'm basing this primarily off of real world DX11 performance at 1440p. This is why I assumed it would perform closer to the 970, because it's roughly 50% the performance of the 1080.
  • Yojimbo - Thursday, June 30, 2016 - link

    " If the 1060 is roughly equal to the 980, and the 1080 is 2x the performance of the 1060, the 1080 would also have to be 2x the performance of the 980, which it isn't."

    There's pretty obviously a typo there. The organization of the information should lead you to know it's a typo. It should read: "Relative: performance - 1080 is 1.42 times 1070, price - 1080 is 1.58 times 1070. speculative: performance - 1080 is 2 times 1060. performance - 1070 is 1.41 times 1060." Does that clear things up?

    " The 1080 is about 1.65x the performance of the 980, and about 1.95x the performance of the 970."

    Does that contradict my information? If it does, then show how. It doesn't seem relevant to me, because you don't directly argue against the cross-generational comparison I did make. I established the relative performance of the 10 series to the 900 series by comparing the 1070 to the 980. The 1070 performs on average 30% faster than the 980 in real world games. The relative performance of the cards within their architecture is closely related to their theoretical performance.

    "I'm not using theoretical SP performance"

    Without considering theoretical performance there's no way whatsoever you can guess the performance of the 1060 because at this point theoretical performance of the 1060 is all we have information for.
  • sonicmerlin - Friday, July 1, 2016 - link

    Yojimbo almost certainly has Aspergers. And yet you read everything he wrote. Jesus
  • Yojimbo - Friday, July 1, 2016 - link

    Why wouldn't you want to read something that's right? Jesus
  • crimson117 - Wednesday, June 29, 2016 - link

    Go away, troll. Don't just post shit like that without backing it up. When can we get downvote buttons on AT comments?

    "Wrapping things up then, today’s launch of the Radeon RX 480 leaves AMD is in a good position. They have the mainstream market to themselves, and RX 480 is a strong showing for their new Polaris architecture. AMD will have to fend off NVIDIA at some point, but for now they can sit back and enjoy another successful launch."
  • atlantico - Wednesday, June 29, 2016 - link

    He's not wrong, for $200-240 the best GPU on the market is AMD RX480. For "backing that up" check the benchmarks in the article.
  • crimson117 - Wednesday, June 29, 2016 - link

    I was referring to the OP's comment "What a massive F-Up by AMD"
  • stereopticon - Wednesday, June 29, 2016 - link

    the GTX 970's price could easily be dropped to $240 to compete
  • smilingcrow - Wednesday, June 29, 2016 - link

    In the UK the 970 is already cheaper than the RX 480 8GB if you shop around, which is where it should be priced I think.
    The RX 480 only has one redeeming feature, but it's an important one: PRICE.
  • ptmnc1 - Wednesday, June 29, 2016 - link

    Well, it *could*, I suppose. But the 970 is a 398mm² chip and the 480 is a 232mm² chip so AMD can make 1.7x as many from the same wafer even before accounting for yields. With similar memory systems there's no way for nVidia to win a price war at this performance level with its 28nm parts.
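    For a sense of scale, a rough gross-dies-per-wafer estimate (a Python sketch using the standard approximation for a 300mm wafer with square dies, no scribe-line or yield corrections):

        import math

        def gross_dies(die_area_mm2, wafer_diameter_mm=300):
            r = wafer_diameter_mm / 2
            return (math.pi * r**2 / die_area_mm2
                    - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

        polaris10 = gross_dies(232)   # RX 480 die
        gm204     = gross_dies(398)   # GTX 970/980 die

        print(f"Polaris 10 (~232 mm^2): ~{polaris10:.0f} dies/wafer")
        print(f"GM204 (~398 mm^2):      ~{gm204:.0f} dies/wafer")
        print(f"Ratio: ~{polaris10 / gm204:.1f}x")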
  • DigitalFreak - Wednesday, June 29, 2016 - link

    AMD has no monies to fight a price war. Nvidia could drive them out of the market right now if they wanted to. They won't because of monopoly concerns.
  • ptmnc1 - Wednesday, June 29, 2016 - link

    nVidia can't flog 970s below cost without various anti-dumping laws kicking in as well as the risk of having their company disassembled. Plus AMD would still be making a profit at that point.
  • Yojimbo - Thursday, June 30, 2016 - link

    I'm pretty sure that NVIDIA wouldn't have to sell the 970 below cost to use it to compete with or even undercut the current MSRP of the RX 480. NVIDIA's profit margins are significantly higher than AMD's to begin with. If NVIDIA prices the GTX 970 at just below the price of the RX 480, they probably sell more of them than AMD sells of the RX 480s. A price for the 970 GPU to AIB partners which would allow them to sell their card for $180 should not be below cost. I really don't know how such accounting works, but I am guessing that most of the research, development, and validation costs have already been amortized. The production cost of the chip really is not that much, comparatively. Surely regulators must recognize a cost advantage of a long-run production of a product.
  • fanofanand - Thursday, June 30, 2016 - link

    I'm not sure anti-dumping laws apply to obsolete (based on not being the latest) hardware. How else would a company be expected to clear out existing inventory prior to a product refresh?
  • ptmnc1 - Thursday, June 30, 2016 - link

    Yojimbo: The 970 has almost exactly the same memory system as the 4GB 480, the only differences in manufacturing costs are going to be the other parts of the board (e.g. VRMs) and the chip itself. The principal cost difference is going to be the chip, and when it's a 232mm² chip vs a 398mm² chip, the latter is going to cost a lot more to make even if yields on Samsung's 14LPP aren't as great as TSMC's 28nm: as long as they're not catastrophically worse (and there's no indication that's the case), there's just no way to build a 970 cheaper than a 4GB 480.

    On monopolies, you don't have to abuse one in order to attract a lot of undesirable attention from regulators.

    fanofanand: There's a difference between clearing existing obsolete inventory and deliberately manufacturing new inventory at a loss for the purpose of pushing a competitor out of business.
  • Yojimbo - Thursday, June 30, 2016 - link

    The major cost of a chip is research, development, testing, and validation, not manufacturing costs. I assume these costs must be amortized over the life of the product. I am guessing that NVIDIA has already mostly done so, because they have sold a whole lot of chips already. You are worried about dumping, well how can the government come in and say NVIDIA is selling below cost when their research, development, testing, and validation costs logically apply over a run of 2 years, during which they already sold a high volume of chips?

    Regardless of whether it costs more or less for NVIDIA to make a 970 than AMD an RX 480 (I agree that the AMD RX 480 production costs should be lower), I feel confident in saying that NVIDIA doesn't need to sell at below cost to sell a 970 at a price below $200.

    "On monopolies, you don't have to abuse one in order to attract a lot of undesirable attention from regulators."

    Regulators don't just sit there and watch everything. NVIDIA would have to be sued for abuse of power, i.e., some company would need to be willing to spend money to oppose NVIDIA. If NVIDIA does not abuse its market position they shouldn't be too concerned with losing such a suit. If they don't plan on making any acquisitions in the GPU space then I think they don't need to worry about regulators. I'm mainly just repeating what I already said here, but you aren't saying what sort of "undesirable attention" you see NVIDIA getting or how it will affect them, so I'm not sure what else I can do.
  • fanofanand - Friday, July 1, 2016 - link

    Nvidia isn't currently manufacturing new 970s.
  • Yojimbo - Thursday, June 30, 2016 - link

    What monopoly concerns? It's not illegal to have a monopoly; it's illegal to abuse a monopoly. Other than that, a monopoly position can affect regulatory rulings concerning mergers and acquisitions, but I doubt NVIDIA has any ambition to purchase a GPU maker, so I doubt they would have any regulatory concerns. The important reason NVIDIA won't try to drive AMD out of the market is because they are interested primarily in increasing their profits, not in driving AMD out of the market. If NVIDIA can get 95% market share and maintain their profit margins they would be very happy, unconcerned with having "too much market share", provided they could achieve it without engaging in uncompetitive practices.
  • cocochanel - Thursday, June 30, 2016 - link

    I never knew Nvidia to be much concerned about monopolies. Over the years, my impression was that they only care about profit margins. And they are good at it.
  • Yojimbo - Thursday, June 30, 2016 - link

    Yes the RX 480 may have a cost advantage over the GTX 970, but the point is that AMD doesn't have the market completely to themselves since the RX 480's advantage over the GTX 970 is dubious. NVIDIA may have smaller profit margins in the space but it's not like they are uncompetitive in the space. The GTX 1060 will arrive soon enough to restore NVIDIA's profit margins. In the meantime, inventories of the GTX 970 can be flushed out of the system for a profit.
  • Questor - Wednesday, June 29, 2016 - link

    "When can we get downvote buttons on AT comments?"

    I get your point, really I do! Be careful what you ask for. Heaven forbid you should say anything of merit over at TH. The foundations of civilization shake when you question a review(er), and the fanboys rule supreme with their mouse cursors over those little clickable arrows. You can say something that is completely true, accurate, responsible and even polite, but beware should you offend a minion! They and their brethren will pounce upon your words of wit and wisdom with the fury of the scorned. Your post, feelings, opinions, facts, questions and whatever else you said, will be buried so deep, not even Hades will be able to dig it up!
  • JoeyJoJo123 - Wednesday, June 29, 2016 - link

    You can go back to reddit and enjoy your inner circle and upboat each other to make yourselves feel good.

    Proper internet forums of speech aren't saddled by prominently displaying the most popular opinion. Everyone's post should be equally as worthless.
  • AntDX316 - Thursday, June 30, 2016 - link

    How do you get a blue post?
  • pashhtk27 - Thursday, June 30, 2016 - link

    "Everyone's post should be equally as worthless."
    Nice. ;)
  • ddriver - Wednesday, June 29, 2016 - link

    The RX 480 is targeted at the market niche that has the best sales to profit margins ratio. It is about as fast as the GTX 970, but is more efficient and better performing in new and upcoming games (Vulkan, actual DX12 (not dumb ports)). In properly optimized games (I mean not games nvidia pays to be left unoptimized for Radeons) it is as fast as the R9 Nano.

    I'd say job well done. A very efficient and well-targeted launch. It would not be possible to do any better given AMD's lack of resources; any higher expectations would be unrealistic and the product of genuine cluelessness or fanboyism.
  • smilingcrow - Wednesday, June 29, 2016 - link

    More efficient by a negligible margin but it is good value; a Radeon Lidl 480. :)
  • sonicmerlin - Friday, July 1, 2016 - link

    Really? AMD advertised a 2.8x increase in performance per watt with Polaris. The card massively failed AMD's own expectations.
  • sonicmerlin - Friday, July 1, 2016 - link

    Job not well done. Doesn't come close to reaching AMD's advertised 2.8x performance/Watt improvements. The mass market $200 reference board is drawing power over PCIe outside of spec. If Nvidia comes out with an overclockable 1060 for $250 no one is going for the hot and dangerous 480 over the power sipping 1060. And regardless of what some people claim, even 3 GB of VRAM is plenty for 1080p.
  • mickulty - Wednesday, June 29, 2016 - link

    >50% performance jump over 380X, I'll take that.
  • sonicmerlin - Friday, July 1, 2016 - link

    Even with the benefit of 2 node shrinks?
  • Geranium - Wednesday, June 29, 2016 - link

    By your logic 45% to 70% performance improvement over previous generation is F-up? LOL.
  • Geranium - Wednesday, June 29, 2016 - link

    wrong reply. It was for first comment.
  • MATHEOS - Wednesday, June 29, 2016 - link

    Agree
  • dustwalker13 - Wednesday, June 29, 2016 - link

    a massive F-up?
    only if you compare a 200,- card to a 500,- or 700,- and expect the same performance. this card sits right in the sweet spot of performance and efficiency for a really nice price.
    granted it is not for me, but i am one of those crazy people who shell out 500,- or more for a graphics card - this puts me in the top roughly 5% of gamers i would suspect, the rest will buy cards below 300,- and there the 480 delivers extremely well.

    no it is not a high-end card, but then it never was supposed to be a 1080 killer.

    the interesting question now will be what the 1060 will deliver in terms of price/performance/efficiency.
  • Byte - Wednesday, June 29, 2016 - link

    Ouch, AMD can't catch a break. They promised lower power consumption, but seem to be exceeding it, possibly blowing past the PCIe spec. Performance is about what was expected, but won't turn heads. Looks like the GPUs are following the CPUs' succession of disappointments. Let's hope Vega will be a stunner and nVidia won't have a 1080Ti in time to rain on it like they did with Fury. We need to give AMD a surviving chance!
  • cocochanel - Thursday, June 30, 2016 - link

    With people like you, how could they get a break ?
  • Frenetic Pony - Wednesday, June 29, 2016 - link

    And yet they'll make bank off of it. Learned over the past few years that GPU quality and sales have little to do with each other. Nvidia made a ton of money off their last generation despite the fact that no desktop user should give much of a shit about TDP, but it worked anyway despite AMD beating them on price for performance in almost every category. Similarly this card sucks while Pascal is quite impressive, but this has all the goodwill and PR in the world so it will sell like hotcakes anyway.
  • akamateau - Wednesday, June 29, 2016 - link

    Hardly!!!

    2 RX 480 in CROSSFIRE mode outperforms GTX 1080 for $200 less!!

    NVidia F-Ked up.

    BUY AMERICAN!!
  • TheinsanegamerN - Thursday, June 30, 2016 - link

    Only if CrossFire works at 100%, which is a big IF, and even then it will be drawing twice the power of a 1080 and putting out way more heat, while barely performing any better.

    Yeah no thanks.
  • akamateau - Wednesday, June 29, 2016 - link

    Hardly!!!

    2 RX 480 8gb outperforms GTX 1080 for $200.00 less.

    rtflol

    BUY AMERICAN!!
  • WhisperingEye - Thursday, June 30, 2016 - link

    Nvidia is an American company. So is AMD. Not that difficult to do if you're in the market for a discrete GPU. You come across as a moron, and like a moron, you'll pass over this criticism and continue whatever quest you're on.
  • Einy0 - Wednesday, June 29, 2016 - link

    Yup a massive mess up, $100 less than my GTX 970 for the same performance level. The mainstream price points are where the money is made everyone knows that... They will sell tons of these. The only question is how will the GTX 1060 stack up and what price will it be?
  • fanofanand - Thursday, June 30, 2016 - link

    There is more to an architecture than CUs; GCN 4 brings a few new technologies your 970 doesn't have.
  • TheinsanegamerN - Thursday, June 30, 2016 - link

    You mean stupid high power consumption for a 14nm card and high heat levels? Such amazing features.

    As far as DX12 and async are concerned, they will not save AMD. AMD needs to stop relying on magic bullets and actually release competitive hardware first.
  • fanofanand - Friday, July 1, 2016 - link

    A couple of % in certain games is "stupid high power consumption"? I see you have tied your livelihood to Nvidia's success but whoa!
  • AbbieHoffman - Wednesday, June 29, 2016 - link

    This is an 80-series card! And it is much faster than the 380 it's replacing. Why do you have to explain everything to people nowadays? I swear anyone born after 1987 would have been considered legally retarded in the 1980's! Thank god for fascist political correctness, huh? You're minimally exceptional!
  • TheinsanegamerN - Thursday, June 30, 2016 - link

    And it matches a 2 year old 28nm card. How amazing.

    The 1060 will crush this thing.
  • AntDX316 - Thursday, June 30, 2016 - link

    NVidia was talking about releasing the 1080 GTX for laptop instead of 1080M because they were scared AMD was going to come out big.

    The hope for massive gains at a lower expected price point is gone, and the money milking will continue.
  • sonicmerlin - Friday, July 1, 2016 - link

    AMD has again ceded the entire laptop market to Nvidia thanks to these horribly inefficient cards.
  • medi03 - Thursday, June 30, 2016 - link

    Comparing it to a different-tier card of the current generation is childish, or plain stupid if it's coming from a grown-up.

    Performance is right on track.
    Power consumption is disappointing, although not a problem for this tier.

    I'm curious what it is caused by; it could well be GloFo/Samsung 14nm vs TSMC 16nm. I recall Apple's chips from Samsung were consuming more power than the ones from TSMC, even though one would expect it to be the other way round.
  • cocochanel - Thursday, June 30, 2016 - link

    In what world do you live? Here in Canada, the Nvidia GTX 1080 sells for over 900 dollars Canadian!! The GTX 1070 is over 600 dollars. An RX 480 with 8 gigs of RAM is about 300 dollars.
    I'll get one for Christmas and that leaves me plenty of change to buy an Oculus Rift set. How is that such a F-up?
    The stupidity on your behalf is astounding.
  • cocochanel - Thursday, June 30, 2016 - link

    RX480 + Oculus Rift Set = GTX1080. Pricewise. That's a good F-up for me. Christmas is looking good.
  • steamerSama - Thursday, June 30, 2016 - link

    Many people have this twisted expectation. That a $200 card will outperform something twice its cost is ridiculous. AMD has made a blunder, yes, by not releasing their flagship 490 first and coming out with the economy-class 480 instead. But other than that, if you were expecting better performance than this, you were perhaps being sucked in by the hype.
  • loguerto - Wednesday, June 29, 2016 - link

    Let's see how all of this nice stuff transforms into performance ...
  • rtho782 - Wednesday, June 29, 2016 - link

    I think at this point I'd be more shocked had you posted a full review ;)

    So we're now waiting on GTX 960, GTX 1080/70, and RX 480....
  • justaviking - Wednesday, June 29, 2016 - link

    We now have TWO QUESTIONS to ask every time a new GPU comes out:
    1) Can it play Crysis?
    2) Where is the full GTX 1080 review?
  • Ryan Smith - Wednesday, June 29, 2016 - link

    "2) Where is the full GTX 1080 review?"

    The full RX 480 review will be in a few days. The full GTX 1080 review will be a couple of days after that. RX 480 would have been a full review today, but I managed to slice myself with the RX 480 on Tuesday and needed to resolve that first...
  • fanofanand - Wednesday, June 29, 2016 - link

    Hallelujah!
  • Geranium - Wednesday, June 29, 2016 - link

    Get well soon. Thanks for including R9 380X in preview.
  • fanofanand - Wednesday, June 29, 2016 - link

    Not to the slicing part, but that these long awaited reviews are nearly here!
  • Scali - Wednesday, June 29, 2016 - link

    Careful when handling cutting-edge videocards!
  • fanofanand - Thursday, June 30, 2016 - link

    Golf clap for you, very clever. :)
  • HollyDOL - Wednesday, June 29, 2016 - link

    If that is the order they'll come in, you have -1 reader. Really, given that the 1080 is available second-hand now (as posted by Ian a week or two ago), reviewing the card that was released today first makes one doubt the reviewer's objectivity... Sadly, after years of having AT as my primary source of HW reviews...
  • bigboxes - Wednesday, June 29, 2016 - link

    Buh-bye
  • fanofanand - Thursday, June 30, 2016 - link

    Don't let the door hit ya Holly, the last thing Anandtech needs are more Nvidiots running around here. If you think Anandtech is biased against Nvidia, you need to go back to Toms.
  • TheinsanegamerN - Thursday, June 30, 2016 - link

    He has a point though. Why wait until a month after release, and after the 480 review, to finally review the 1080, especially since they don't compete against each other?
  • cocochanel - Thursday, June 30, 2016 - link

    Whoever invented the word Nvidiots should get an award. An Nvidia mascot!
  • Yojimbo - Thursday, June 30, 2016 - link

    I don't think it has to do with objectivity, it has to do with relevance. The relevance of a review probably goes down quickly after release date. AT dropped the ball on 1080 and 1070. They don't want to drop the ball on the RX 480 as well.

    I understand your concern but I think unless there is a consistent bias towards AMD it's misplaced. This is just a single matter of practicality. I definitely can see being (and am myself) annoyed by the 1080 and 1070 reviews taking so long.
  • HollyDOL - Friday, July 1, 2016 - link

    Um, not bias toward AMD, but rather that they skipped the biggest GPU performance jump in years and instead focused on a card with middling performance (although, admittedly, likely the biggest FPS/$ leap in years). I would be equally disturbed if the card at the GTX 1080 performance bar had been released by AMD and they had just skipped it. Skipping the Skylake CPU line would probably have had a smaller impact than this, technology-wise.
  • Yojimbo - Friday, July 1, 2016 - link

    It's taking them entirely too long to get it out the door, but they aren't just skipping it. It's coming, so Ryan says.
  • HOOfan 1 - Wednesday, June 29, 2016 - link

    So, my burning question is.....did sacrificing your blood make the card any faster?
  • Mr Perfect - Wednesday, June 29, 2016 - link

    Will the 1070 be reviewed in the 1080 article too? Or is that coming later?
  • bill44 - Thursday, June 30, 2016 - link

    I hope I'm wrong about this, but there won't be a full FULL review until it includes the audio architecture (inc. sampling rates supported).
    It will be totally game-oriented, with barely a mention of madVR performance and decoding/encoding capabilities. Just like all the other reviews.

    As I said, I hope I'm wrong about this, and it will be a FULL review.
  • Sandcat - Tuesday, July 19, 2016 - link

    20 days later and still waiting for the RX 480.

    Stop lying, you aren't doing the 1080/1070 at all.
  • cocochanel - Thursday, June 30, 2016 - link

    There are plenty of decent reviews on the GTX 1080 already.
  • idris - Wednesday, June 29, 2016 - link

    I'm wondering how 1070FE benches were added to this "preview"?! Disappointed with AT..
  • Ryan Smith - Wednesday, June 29, 2016 - link

    Just so it's noted, 1070FE benchmarks have been in Bench since late May.
  • fanofanand - Wednesday, June 29, 2016 - link

    You are commenting on the RX 480 article you are waiting for. Maybe you are looking for a deeper dive, but give 'em a break. This is already more in-depth than you will get from 90% of the other review sites.
  • nagi603 - Wednesday, June 29, 2016 - link

    So much for an upgrade from a 290X.... it isn't even a side-grade at this point.
  • evolucion8 - Wednesday, June 29, 2016 - link

    If it was an upgrade from the 290X, wouldn't be called RX 490 instead?
  • nagi603 - Wednesday, June 29, 2016 - link

    No, it would be called 390X then. You know, the 480 is two steps ahead...
  • evolucion8 - Wednesday, June 29, 2016 - link

    Wrong. The 290X replacement was the Fury X, and the 280X replacement was the 390X; the issue with the naming scheme was making space for the Fury X branding, which messed up the whole naming convention. Now they are using it again like in the 200 series days. It's clear that AMD is targeting $200 buyers, not $300+ buyers looking at the likes of the GTX 970, 980, 390, 390X and so on.
  • WhisperingEye - Thursday, June 30, 2016 - link

    Newegg is selling an EVGA 970 ACX 2.0+ for $250 right now. Now which buyers are we targeting again?
  • mikato - Friday, July 29, 2016 - link

    Can you spell out your point a little better? And please use a useful comparison... since, you know, I don't think you have one here.
  • Syran - Wednesday, June 29, 2016 - link

    Small mistake early on in the article, in listing the specs, it shows the 4GB card with 8GB of vram.
  • Ryan Smith - Wednesday, June 29, 2016 - link

    Thanks!
  • watzupken - Wednesday, June 29, 2016 - link

    The performance is rather underwhelming, to be honest. It may be a little early to draw conclusions given the state of the drivers for this new card. Still, I feel it's kind of just performing at the level of an R9 390 in most cases and is saved by the aggressive pricing.
  • xthetenth - Wednesday, June 29, 2016 - link

    It's a small chip and cheap card, so the design is delivering on the goal of making that level of performance cheap.
  • fanofanand - Wednesday, June 29, 2016 - link

    The problem, as I see it, is that it's barely cheaper than a 970 that performs similarly. I get the whole 3.5GB issue with the 970, but based on those charts they are neck and neck, with the 970 often beating it. Maybe my expectations were out of whack, but I had really hoped that AMD would be offering 970 performance at 960/950 pricing, given the updated node.
  • Drumsticks - Wednesday, June 29, 2016 - link

    I'm not expecting an /upgrade/ from the 390, but any insight into why the 480 barely beats the 390 despite 10% more shaders? Where are all of the uarch changes going to? Is it a lack of ROPs? That's about the only thing I can think of. Performance at 1440p seems fairly eh.
  • watzupken - Wednesday, June 29, 2016 - link

    I think it's ROP-starved, which is typically the case for cards in the mid range.
  • zoxo - Wednesday, June 29, 2016 - link

    32 ROPs are much more at home at 1080p. You have some leeway with frequency, but 1440p is a big jump in pixel count.
  • D. Lister - Wednesday, June 29, 2016 - link

    Yes, 32 ROPs vs 64 ROPs of the 390. It really only starts showing at >1080p resolutions though.
  • extide - Wednesday, June 29, 2016 - link

    480 has FEWER shaders than 390.
    480 - 2304
    390 - 2560
    390X - 2816
  • smackosaurus - Wednesday, June 29, 2016 - link

    So no 980 in the chart?
    It would be interesting to see a DX12 comparison between the 480 and 980, but after a week of people saying the 480 was near 980 levels in DX12... the 980 was somehow left out.
    Great job.
  • Ryan Smith - Wednesday, June 29, 2016 - link

    You can find that data (and more) in Bench: http://www.anandtech.com/bench/product/1748?vs=171...

    Otherwise 980 isn't in these charts as it's not really a meaningful comparison. Retail sales have already started winding down, and in terms of performance the RX 480 averages just 3% ahead of GTX 970. It's not a 980-level card.
  • warreo - Wednesday, June 29, 2016 - link

    I have to disagree with you Ryan. People (unfairly or not, just read the all the comments) expected this to land somewhere between 970/980, so I really am not sure how you can say it's "not really a meaningful comparison." To me, it's good to know that it's basically equivalent to a 970, but also useful to know that it's on average 15-20% slower than a 980 by extension.

    Again, I could (and did) look it up on the Bench, but it would have been useful to have in the charts off the bat.
  • Yojimbo - Wednesday, June 29, 2016 - link

    Why? Who is going to switch from a GTX 980 to an RX 480 for performance in two DX12 games? I don't think those two DX12 titles can accurately be thought of as being fully representative of DX12 performance, especially since they seem to favor AMD cards to begin with, so it doesn't show the sort of categorical comparison you are implying.
  • smackosaurus - Wednesday, June 29, 2016 - link

    Well, if there are more like me out there who have several family members, each with their own PC... cards have to last a couple of years AND be affordable for the average joe. Since DX12 is replacing DX11, and every major title announced lately and probably every major title in the future will be DX12... the value of the 480 sticks out. Especially since even the flagship 1080 has to use software emulation for some DX12 features, because Nvidia decided to just plain leave it out of the hardware. Preemption =/= async
  • Yojimbo - Thursday, June 30, 2016 - link

    I don't think it's certain that DX12 will completely replace DX11, even in major titles. But even if so, that does nothing to change the fact that the number of DX 12 titles available now to benchmark is quite small, resulting in a small sample size. With a small sample size one is not able to make the broad inferences you would like to make. The 980 is likely to be discontinued anyway shortly after the 1060 comes out, completely destroying any reason at all for having it in the charts.

    Maxwell does have hardware support for asynchronous compute. Pascal has an enhanced version of it. async != ACE. ACEs are task schedulers which are used in AMD's method of supporting asynchronous compute. I have a feeling that the idea that NVIDIA does asynchronous compute "in software" has to do with the fact that NVIDIA was working on driver optimizations to try to make the asynchronous compute implementation in Ashes of the Singularity show benefit on NVIDIA's hardware. I'm not sure if NVIDIA ever achieved that or if they've given up or what, but to my understanding NVIDIA was turning the feature off in their driver profile for Ashes of the Singularity because with Oxide's implementation and NVIDIA's method asynchronous compute actually caused the game to run more slowly than it did without it. Again, one game and one implementation is a small sample size. It doesn't tell you much on its own. Independent testing showed there were situations, even on Maxwell hardware, where NVIDIA's method produced a larger speed boost from asynchronous compute than AMD's method. Another thing to consider is that the speed boost that AMD gets in DX 12 with asynchronous compute has something to do with the fact that AMD tends to make less efficient use of their compute throughput than NVIDIA without asynchronous compute. Finally, considering that AotS was designed from the beginning as a Mantle game, it isn't just one data point, it's also perhaps not a very reliable one.

    For more information on asynchronous compute in Pascal, perhaps this video will be informative:
    https://www.youtube.com/watch?v=Bh7ECiXfMWQ
  • cocochanel - Thursday, June 30, 2016 - link

    DX12 is the latest and best graphics API Microsoft has ever made. All future games, be they for the PC or Xbox One, will use it. How can anyone sell a card that performs poorly on it? And sell it for 800+ dollars?
  • Yojimbo - Thursday, June 30, 2016 - link

    The GTX 1080 does not perform poorly on DX12 games. What benchmarks have you been looking at?!
  • cocochanel - Saturday, July 2, 2016 - link

    http://www.pcworld.com/article/3071037/hardware/nv...
  • crimson117 - Wednesday, June 29, 2016 - link

    Typo: For the 480 4GB model, VRAM is listed as 8GB in the table on page 1.
  • webdoctors - Wednesday, June 29, 2016 - link

    I called it a month ago when the new cards started coming out.

    The $200 market is really competitive and you're competing against not just the current cards but previous generations one level up. You're already seeing GTX970 cards at the same pricepoint after rebate in the USA as this "new" RX 480 card.

    Will have to wait for the 1060 review, but it'll likely wipe this card out of the landscape since this one only matches Maxwell Perf/W and perf is only GTX970 levels...win for consumers but not so much for AMD.
  • maccorf - Wednesday, June 29, 2016 - link

    Love comments like this... so a $330+ card with no currently known benchmarks will "likely wipe" a $240, well-performing card "out of the landscape"? That is an entire price bracket up, and you don't even know what the 1060 will do. "I called it..." LOL, called what? This card being the new standard for performance and price? The only reason those GTX 970 cards are coming down is because of the RX 480. People are so absurd with this; is it even possible for you trolls to ever admit you're completely and utterly biased?
  • mdriftmeyer - Wednesday, June 29, 2016 - link

    It's not possible for these grown children to act like adults.
  • warreo - Wednesday, June 29, 2016 - link

    Who said the 1060 will be $330+? Source?

    If it follows the previous gen's price differential to the 1070, it will be priced at $250...your argument is even more absurd and biased than webdoctors'. Pipe down.
  • K_Space - Wednesday, June 29, 2016 - link

    You're both "right" in inverted commas because we don't have hard facts yet. But if you believe the green-leaning WCCFtech, the GTX 1060 will launch at $250:
    http://wccftech.com/nvidia-gtx-1060-special-launch...
    As it's rumoured to launch with 3GB and 6GB versions, I suspect these will come in at $199 and $250 respectively. 3GB strikes me as quite low, though I'm not sure if it'll affect 1080p performance. Either way, potential gamers will see 3GB versus 4GB and grab the 480 (ditto for the bigger versions).
    TL;DR I doubt the 1060 will "wipe the 480 out of the landscape"
  • fanofanand - Thursday, June 30, 2016 - link

    The 1060 3GB being $199 is awfully optimistic. Nvidia is like Intel, they REALLY like their profit margins.
  • D. Lister - Wednesday, June 29, 2016 - link

    In the 480's defense, it offers GTX 970+ performance (esp. in DX12), plus an extra 4GB of VRAM, while consuming less power. If both were OCed, the 970 would probably beat it in most benches, but c'mon, it is a mainstream product, and most people who already have a 970 paid much more than what it goes for now.
  • smilingcrow - Wednesday, June 29, 2016 - link

    The 480 consumes slightly less when gaming but a lot more at idle so overall I'd say it loses in terms of power efficiency. Good value all the same.
  • just4U - Wednesday, June 29, 2016 - link

    Wouldn't this new process have a ton more overclocking headroom than 28nm?
  • D. Lister - Thursday, June 30, 2016 - link

    Well, you get what you pay for. Think of the 480 as a fully factory-overclocked GPU, that is for people who haven't even heard of overclocking, but want to play modern games at 1080p/60FPS, while having to pay the minimum for that tier of performance.
  • Eden-K121D - Wednesday, June 29, 2016 - link

    Overhyped, shitty card by AMD. What a brilliant F**K up by AMD.
  • D. Lister - Wednesday, June 29, 2016 - link

    C'mon now, they told everyone well in advance that it would be ~$200 and target the mainstream. For that, it offers extremely reasonable performance. If you were expecting a 1070-beater for $200 then, with all due respect, the problem is with your expectations, and not this product.
  • Notmyusualid - Wednesday, June 29, 2016 - link

    Indeed.

    Haters are just gonna hate.
  • Eden-K121D - Thursday, June 30, 2016 - link

    Ignorant people will ignore.
  • fanofanand - Thursday, June 30, 2016 - link

    That doesn't even make sense. Ignorant people will ignore?
  • Eden-K121D - Thursday, June 30, 2016 - link

    I was expecting GTX 980 level performance
  • D. Lister - Thursday, June 30, 2016 - link

    :) Marketing hype should be taken with a grain of salt, especially from AMD (Bulldozer, Mantle, Fury overclocking, to name a few). Besides, the announced MSRP was a good indication of things to come. Nothing in life is for free, right ;)? Anyway, the AiB cards with better cooling and an 8-pin power connector would be available soon. With some OC, you could be hitting very close to a 980.
  • cheshirster - Wednesday, June 29, 2016 - link

    230mm2 is almost as big as 1070 (75% of 1080)
    But the result is slower than 970.
    Good luck with hyping Zen, AMD.
  • The_Countess - Wednesday, June 29, 2016 - link

    It doesn't work that way; a card consists of more than just shaders. And the 1070 is twice the price!

    Other reviewers put its aggregate score above a GTX 970 OC.
  • Yojimbo - Wednesday, June 29, 2016 - link

    I think you mean a GPU consists of more than just shaders? It's true, and his comparison isn't exact, but it can't be completely discounted either. NVIDIA has a real-world performance per die area advantage over AMD, probably mostly because they have a real-world performance to peak theoretical performance advantage over AMD. DX12 seems to reduce the advantage somewhat, but even in DX12 NVIDIA's advantage in that area seems significant.
  • Cygni - Wednesday, June 29, 2016 - link

    The 1070 isn't 75% of the 1080 in size, its the exact same as the 1080 in size. It's the exact same base chip with disabled portions.

    When people talk about 'cut down' versions, did you really think they are physically cutting up the chip?
  • K_Space - Wednesday, June 29, 2016 - link

    lmfao!
  • fanofanand - Thursday, June 30, 2016 - link

    Seriously, I would be so embarrassed to post on a site like this with absolutely no knowledge of that which I speak. Trying to get technical about die size while knowing zilch, I am embarrassed for you chesh!
  • zmeul - Wednesday, June 29, 2016 - link

    People measured the power draw of the RX 480 - 170W: https://youtu.be/eUiaJXLoKnE?t=223 <- jump to 3:43
    A GTX 1080 has a measured power draw of ~180W and performs miles better - jeez fucking christ AMD... that GloFo 14nm LPP FinFET is utter shit
  • smilingcrow - Wednesday, June 29, 2016 - link

    That seems clear; GloFo at 14nm is a dog for high performance, which doesn't bode well for the high-end Radeon due later.
  • Meteor2 - Thursday, June 30, 2016 - link

    So those iPhone 6S and Galaxy 7s built on it are dogs too?
  • Kvaern1 - Thursday, June 30, 2016 - link

    I believe the iPhone 6S proved TSMC's 16nm process is better than Samsung's 14nm, and I don't think GloFo's process will be able to stand up to Samsung's. So yeah, AMD is also stuck between a rock and a hard place when it comes to GPUs now, as they have to fight both Nvidia's superior research budget and a better process.
  • AntDX316 - Thursday, June 30, 2016 - link

    gj, I forgot to look up TDP anymore

    I knew AMD was bad but I thought they fixed it.

    If you spread and sell nvidia and many listen to the truth then the AMD people won't have jobs and can't eat because they can't buy food. Actually they can if they work for Nvidia/Intel. They would be like Einstein who left Europe to go to the US and make the Manhattan nuclear project that changed the world's future. I think if it wasn't for 9/11 then Facebook/Google/Youtube wouldn't have existed.
  • fanofanand - Thursday, June 30, 2016 - link

    You win the most asinine comment of the day award. Nicely done.
  • AntDX316 - Thursday, June 30, 2016 - link

    The biggest problem is if NVidia didn't exist then AMD would be big. It's like having a big house but because the houses around your house are way bigger it looks small. Kind of like how some of the basketball players on TV look short but they are actually really tall.
  • smackosaurus - Wednesday, June 29, 2016 - link

    So what were the other test cards clocked at? ref?
    These charts sure make the 480 look slower than every other chart/test/review out this morning. Toms has it smoking the 970.
  • RiZad - Wednesday, June 29, 2016 - link

    Tom's does not show the 480 "smoking the 970"
  • smackosaurus - Wednesday, June 29, 2016 - link

    Looking at the DX12 benches it does.. along with every other site out there
  • smilingcrow - Wednesday, June 29, 2016 - link

    Going forward DX12 is important, but right now I guess people on this kind of budget aren't going to have many DX12 games, so it's best to look at the overall picture rather than cherry-picking.
    It's a good value card; let's not overhype it. AMD already did that, unfortunately.
  • fanofanand - Wednesday, June 29, 2016 - link

    Thank you for the timely review Ryan!!!!!
  • D. Lister - Wednesday, June 29, 2016 - link

    Look again, it is a PREview, mr. "fan of Anand". :p
  • stardude82 - Wednesday, June 29, 2016 - link

    Still waiting for the GTX 1070/1080 reviews... and 950 and 960....
  • fanofanand - Thursday, June 30, 2016 - link

    You call it a preview, I call it better than the rest as-is.
  • fanofanand - Thursday, June 30, 2016 - link

    I have been a reader here for roughly 10 years; when my name was chosen, Anand was still the main writer for architectural analysis etc. I wish he were still here instead of shilling for Apple. :(
  • dcole001 - Wednesday, June 29, 2016 - link

    Based on the benchmarks, this card is not all AMD said it was. On the day of the announcement they implied this card would be faster than the GTX 970 and running close to the GTX 980. Also, the claim that running 2 RX 480s is faster than a GTX 1080 is a joke. So to me this video card is built for today, but not for the future. With Nvidia, the benchmarks were in line with what they said; AMD's are not.
  • smackosaurus - Wednesday, June 29, 2016 - link

    "...Based on the Benchmarks this card is not all AMD said it was."
    According to the reviews on this site, which have always tended to lean towards Nvidia. Check out Tom's and Guru's reviews, and just about any other review out there, and watch it get way past the 970 in any DX12 test and come damn close to a 980.
  • smilingcrow - Wednesday, June 29, 2016 - link

    What percentage of games in say a Steam survey are DX12? That would be interesting.
  • K_Space - Wednesday, June 29, 2016 - link

    Wise gamers don't tend to buy cutting-edge architecture to play yesterday's games....
  • ingwe - Wednesday, June 29, 2016 - link

    I am pretty disappointed. Sure, AMD isn't in a terrible spot but it is still underwhelming. I like to buy AMD to keep the competition alive, but the performance per watt is just poor. Is this simply due to a bad process? Is it architecture? I would really like to know. Oh well. I am keeping my 7850 for now.
  • AliciaBurrito - Wednesday, June 29, 2016 - link

    This was a huge disappointment. It trades some blows with the GTX 970, and has about the same perf/watt. You can get the 970 for around the price point of the RX 480, too. From what I've seen in other reviews, the overclocking on this card is incredibly poor.

    It can only manage a ~1% overclock, its temperatures skyrocket by up to 10C, and it can't go any higher than that. As opposed to the GTX 970, which fares much better.

    AMD is still a generation behind Nvidia, which is not good. I hope their higher end cards make a mark.
  • smackosaurus - Wednesday, June 29, 2016 - link

    Wow... what sites do you go to?
    Try this: Google "rx480 overclocks".
    1%? Someone's telling a lie there.
  • D. Lister - Wednesday, June 29, 2016 - link

    According to HardOCP's OC preview for the 480 with the default cooler, it OCs to ~6%. The later versions with better coolers and more flexible voltage settings would go higher, probably in the 9%-10% range.
  • Qasar - Wednesday, June 29, 2016 - link

    AliciaBurrito, maybe where you are... but in Canada, the RX 480 is $335 to $340 preorder price, and the GTX 970 STARTS at $420 CAD; that's about a 100 buck price difference... but most GTX 970 cards start at around $450... I'm thinking of grabbing an RX 480 instead of a GTX 970, like I was planning...
  • Meteor2 - Thursday, June 30, 2016 - link

    NVidia has had to cut the price of the 970 to compete with the 480. No 480, no Nvidia price cut.
  • Demi9OD - Wednesday, June 29, 2016 - link

    Don't forget about FreeSync monitors being $100-$150 less than the equivalent G-Sync monitors. For 1080p gaming I'd rather have an RX 480 and a 24" 1080p FreeSync monitor than a GTX 970 without G-Sync.
  • Demi9OD - Wednesday, June 29, 2016 - link

    Personally I would wait for a 2-3 fan design rather than go with a blower though.
  • AndrewJacksonZA - Wednesday, June 29, 2016 - link

    Typo on page 6's heading:
    POWER, TEMOERATURE, & NOISE

    "TEMOERATURE"?

    Thanks for the preview. Please could you do a review with cards only in this price class and lower?

    Thank you.
  • Laxaa - Wednesday, June 29, 2016 - link

    The RX 470 is supposed to be out now as well, no?
  • Ryan Smith - Wednesday, June 29, 2016 - link

    No. A shipping date has not been announced for RX 470. Only RX 480 is shipping today.
  • poohbear - Wednesday, June 29, 2016 - link

    Looks like AMD's objective of gaining market share will be met with this card! Good job AMD! They have a few months with no competition in the mainstream market. Hopefully they can make some money. I totally don't understand these people comparing it to a GTX 1070?? It's not MEANT to compete with the GTX 1070.
  • Michael Bay - Wednesday, June 29, 2016 - link

    Yea, it's meant to compete with the 970, and it does so poorly.
  • Meteor2 - Wednesday, June 29, 2016 - link

    Just want to say I'm really pleased to see such a thorough 'preview' (more detailed than reviews elsewhere), published right on the dot of the embargo ending. Thank you Ryan!
  • zinfamous - Wednesday, June 29, 2016 - link

    Wow, PeckingOrder brings his shilling from the forums straight to the actual review. Of course he was F5ing all night to be the first nonsense post. Gotta earn that cash!

    stay classy. Anyway, where are the 1080/1070 full reviews?
  • HOOfan 1 - Wednesday, June 29, 2016 - link

    So, it looks like basically R9 390 performance for about $20 cheaper.

    Nothing revolutionary at all. I can't say it is a huge flop, but it certainly is not worth the hype.
  • jas340 - Wednesday, June 29, 2016 - link

    A lot of whining going on. I paid $660.00 for a GTX 780 exactly 3 years ago. Trust me, the card is no slouch. I've been gaming at 1440p since day one. Getting the same performance for $240ish with 8GB of VRAM instead of 3 is an incredible deal. It would be great if the GTX 1080 were $240, but it isn't. It will be in three years, just under a new name.
    It is hysterical how many people complain about video card performance and still game at 1080p @ 60Hz. This card will murder any game at that resolution.

    PS my GTX 1070 is on the way. Just because I want one. My GTX 780 has many years left in it.
  • tipoo - Wednesday, June 29, 2016 - link

    "These 32 ROPs are paired with 2MB of L2 cache, which is twice as much L2 cache per ROP as the bulk of AMD’s last-gen lineup. The increased L2 cache has a die space cost – which is now easier to pay with the 14nm process – and helps to improve performance and cut power consumption by keeping more data on-die."

    Does twice the cache per ROP and half the ROPs mean the same L2 cache as the 64 ROP parts of Olde? In which case, why would die space increase?
  • Ryan Smith - Wednesday, June 29, 2016 - link

    Hawaii had 1MB of L2 cache, or 16KB/ROP. Fiji was 2MB of L2, or 32KB/ROP. With Polaris 10, this becomes 64KB/ROP.

    The die space cost is not on an absolute basis, but on a relative basis. 2MB of L2 cache has a not-insignificant cost on 28nm cards. On a large chip like Fiji it's not so bad, but it stands out more on a smaller chip like Pitcairn or Polaris 10.
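    For reference, a quick back-of-the-envelope check of those ratios, assuming the 64 ROPs of Hawaii/Fiji and the 32 ROPs of Polaris 10 mentioned elsewhere in this thread: Hawaii is 1024KB / 64 ROPs = 16KB per ROP, Fiji is 2048KB / 64 ROPs = 32KB per ROP, and Polaris 10 is 2048KB / 32 ROPs = 64KB per ROP.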
  • maccorf - Wednesday, June 29, 2016 - link

    The comments on these articles are so ridiculous sometimes. All you trolls claiming this is a failure, what universe are you living in exactly? This fits in line with basically everything that was realistically claimed by reputable sources, and performance like this has never been seen on a $200 card before. It's pathetic how blind some of you people are. Even if it was widely reviewed as the Greatest Video Card Ever Made you'd still claim it sucked because it was ugly. Keep shouting in those empty hallways!
  • tipoo - Wednesday, June 29, 2016 - link

    Yeah, here in Canada the 970 is almost double the price of this, and then this has 8GB of RAM, updated HDMI and DP, and FreeSync. It's not a balls-out max-everything card, but for 200 dollars this is doing very well.
  • smilingcrow - Wednesday, June 29, 2016 - link

    Wow, the 970 can easily be had for less than the 480 in the UK although long term the 480 makes more sense.
  • fanofanand - Thursday, June 30, 2016 - link

    "Wow, the 970 can easily be had for less than the 480 in the UK"

    Comparing used 970's to a brand new 480 isn't exactly an honest comparison to make, price-wise.
  • Notmyusualid - Wednesday, June 29, 2016 - link

    +1
  • Geranium - Wednesday, June 29, 2016 - link

    Hey Ryan,
    isn't GCN 1.1 now GCN 2,
    and GCN 1.2 now GCN 3?
  • Questor - Wednesday, June 29, 2016 - link

    "Radeon RX 480 we’re looking at performance gains anywhere between 45% and 70%" , "RX 480 generally isn’t enough to justify an upgrade"

    A performance gain of between roughly half and three-quarters more isn't a reason to justify an upgrade? That makes about as much sense as a spaghetti barbecue! I am anxious to see how you reconcile the two remarks a sentence apart, especially since this performance gain is measured against a market segment last refreshed a year and a half ago. Your conclusion is flawed.
  • Ryan Smith - Wednesday, June 29, 2016 - link

    Although it's not an objective metric, generally I'm looking for an average 65%+ performance increase to justify replacing a video card. Against 380 in particular, 480 doesn't quite reach that mark.
  • Beany2013 - Wednesday, June 29, 2016 - link

    Just checked Bench - looks like it'd be well worth a shot as an upgrade from a plain Jane 280.

    Pretty sure I'm not the only semi-casual gamer (I don't play much but I like decent performance when I do) who can justify a couple of hundred quid, but not much more, on a GPU every few years...
  • fanofanand - Thursday, June 30, 2016 - link

    I agree with your conclusion; personally I want at least a 100% increase to drop the cash. Heck, I'm still on an HD 5850 that I bought after my 512MB 8800 GTS died! I'm a 1080p guy so the 5850 still works.
  • D. Lister - Thursday, June 30, 2016 - link

    The 480 would give you MORE than a 100% gain in framerates over your 5850, across the board.
  • Angrychair - Wednesday, June 29, 2016 - link

    I like what the platform offers from a price/performance/power consumption standpoint, but the reference design is awful from several perspectives. The power delivery is very poorly designed, the card should have come with an eight-pin PCIe power connector, and it's time for reference-design blowers to go away.
  • RaistlinZ - Wednesday, June 29, 2016 - link

    This is not good for AMD. There's too much of a performance gap between it and the 1070. This means that Nvidia can release a 1060 at the same $240 price point that will eat the 480 for lunch, and probably use much less power too.
  • Geranium - Wednesday, June 29, 2016 - link

    Where can I buy a GTX 1060 right now, like the RX 480?
  • smackosaurus - Wednesday, June 29, 2016 - link

    The 1060 specs have been leaked, look it up... the 480 will smoke it, and at a cheaper price.
    The 1060 also suffers from less RAM and the same lack of certain DX12 hardware as the 1070/1080, whereas the 480 is fully compliant.
    Preemption =/= async compute
  • dragonsqrrl - Wednesday, June 29, 2016 - link

    You're right, preemption isn't the same as async compute, but in addition to improved preemption the Pascal architecture can also dynamically balance load for queues executed in parallel.

    And I'm not sure if we're looking at the same leaks, but what makes you think the RX480 will smoke the 1060? Shader and fixed function resources are basically cut in half from GP104. Assuming aggressive clocks (a pretty safe assumption based on what we already know about Pascal) that should put 1060 performance at right around the RX480. And I seriously doubt 6 GB frame buffer will be a bottleneck at this performance level.
  • fanofanand - Thursday, June 30, 2016 - link

    I would be more considered about a 3GB model with the 128 bit bus width. There isn't a snowball's chance in hell that VR would be playable on that. So dollars for dollars, if the 1060 isn't "VR Ready" (whatever the hell that means these days) then AMD has the better offering (theoretically).
  • fanofanand - Thursday, June 30, 2016 - link

    concerned*. Stupid comments, please oh please give us an edit function!
  • dragonsqrrl - Thursday, June 30, 2016 - link

    Every leak I've seen points to a 192-bit bus; the memory capacities in themselves should be a clear indication of that. I'm assuming 8Gbps GDDR5, which should provide sufficient bandwidth, maintaining a bandwidth-to-performance ratio similar to the 1070 and 1080. If the 970 and RX 480 are considered entry-level VR cards, the 1060 shouldn't have much trouble gaining a similar classification.

    However, I agree that a 3GB capacity is a concern as I don't think that's quite enough for a current midrange card. That could very well be a performance bottleneck. Fortunately it looks like a 6GB card will also be available, which I imagine will be the more popular option and serve as the primary competition for the RX480.
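    For reference, a rough back-of-the-envelope figure under those assumed specs (8Gbps GDDR5 on a 192-bit bus): 8 Gb/s per pin x 192 pins / 8 bits per byte = 192 GB/s, versus 256 GB/s for the RX 480 8GB's 8Gbps GDDR5 on its 256-bit bus.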
  • fanofanand - Thursday, June 30, 2016 - link

    Stop being so reasonable, this is 2016. All online statements made must be garish, crude, and factually inaccurate.
  • D. Lister - Thursday, June 30, 2016 - link

    "However, I agree that a 3GB capacity is a concern as I don't think that's quite enough for a current midrange card."

    I have a 780Ti with 3GB VRAM, and it holds up quite well at 1080p, and reasonably at 1440p. A 3GB 1060 would do fine, considering the performance tier.
  • dragonsqrrl - Wednesday, June 29, 2016 - link

    The fact that Nvidia also plans to launch a midrange card isn't terribly noteworthy, although I do agree that the performance and power consumption of the RX480 fall a bit below expectations, which should allow the 1060 to be more competitive. If the 1060 is based on GP106 it probably won't blow the RX480 away. It'll probably perform competitively, with perhaps a marginal avg performance advantage.
  • smackosaurus - Wednesday, June 29, 2016 - link

    "Radeon RX 480 we’re looking at performance gains anywhere between 45% and 70%" , "RX 480 generally isn’t enough to justify an upgrade"

    Yet if it were an Nvidia card.. with a 10% gain.. it would be the be all end all EVERYONE MUST GET.
  • stardude82 - Wednesday, June 29, 2016 - link

    I'm pretty sure that means an upgrade from a GTX 970 or an R9 390/380X.
  • D. Lister - Wednesday, June 29, 2016 - link

    "Yet if it were an Nvidia card.. with a 10% gain.. it would be the be all end all EVERYONE MUST GET."

    Oh yes, AT's Nvidia bias is legendary. That is exactly why the 1070 got a day-1 FULL review, yet the poor 480 only gets a preview. Damn your bigotry AT, THERE SHALT BE RECKONING!!!
  • BryanC - Thursday, June 30, 2016 - link

    There were 3 Fiji reviews: FuryX, Fury, and Fury Nano. But Anandtech never did a GM206 review, although it was promised several times. And of course we're waiting for a GP104 review still.

    I think Anandtech needs more people doing GPU reviews.
  • Meteor2 - Thursday, June 30, 2016 - link

    Yeah, Ryan's way off there. If you already own a $199 card, this is a very solid upgrade -- the biggest in years.
  • AntDX316 - Thursday, June 30, 2016 - link

    It's not just a 10% gain. The micro stutter is less. The frame draw response time is less. They add new techniques to make the overall video quality better. They also make it more power efficient.

    When people found out a GTX 1080 in SLI can do 4K at 100 fps, it was bound to be sold out for as long as the Wii was.
  • smackosaurus - Wednesday, June 29, 2016 - link

    The 480 will get stupidly better as soon as the 1060 comes out and Nvidia does their traditional crippling of older cards in drivers. Bench it against the 980 Ti then... lol
  • D. Lister - Wednesday, June 29, 2016 - link

    First accusing AT of an Nvidia bias, and now accusing Nvidia of, ahem, "traditional crippling of older cards in drivers"...? Wow, your head-wear just appears shinier and shinier with every post.
  • fanofanand - Thursday, June 30, 2016 - link

    It is a bit blinding, isn't it? :P
  • D. Lister - Wednesday, June 29, 2016 - link

    For the price, it certainly seems like a very decent bit of kit, and could be a boon for the younger people with more limited budgets looking to get in on some affordable ultra-setting 1080p/60FPS action. Let's hope AMD can keep up with the demand.
  • bedscenez - Wednesday, June 29, 2016 - link

    As a GTX 660 owner this upgrade is perfect for me. It just arrived; give it two months for the drivers to mature and be optimized well. The only downside is its power consumption; being 14nm I would expect more, but it consumes about the same as a GTX 1070. I hope they focus more on efficiency when Vega arrives, because I can't imagine the hate AMD would get if it consumes much more wattage than the 1070 and performs worse even when using HBM2.
  • stardude82 - Wednesday, June 29, 2016 - link

    It's a Tonga die shrink rather than a new arch, which is a big part of the power consumption problem. Vega is supposed to be a new "tick."
  • dragonsqrrl - Wednesday, June 29, 2016 - link

    GCN4 is a significant architectural overhaul. It's much more than just a die shrink.
  • Meteor2 - Wednesday, June 29, 2016 - link

    Article suggests otherwise.
  • dragonsqrrl - Wednesday, June 29, 2016 - link

    Relative to what? It's the most significant update since GCN 1.0.
  • Meteor2 - Thursday, June 30, 2016 - link

    We're basically seeing GPUs settle down on known uarchs just as Intel have on Core, with only small upgrades revision to revision. That race is run.
  • dragonsqrrl - Thursday, June 30, 2016 - link

    I think in this context the architectural advancements of a decade ago are less relevant for comparison than more recent trends. If it's the most significant update in over four years and two architectural iterations, I would call that significant. Again it's a lot more than merely a die shrink of GCN 1.2. As far as I can tell the article does not suggest otherwise.
  • Meteor2 - Thursday, June 30, 2016 - link

    Not really, as the article says.
  • xenol - Wednesday, June 29, 2016 - link

    I would like to think the RX 480 is AMD's GeForce GTX 750 Ti. Unfortunately, it seems a little too late to start that approach.
  • Flunk - Wednesday, June 29, 2016 - link

    Also it requires a 6-pin PCI-E connector which defeats the reason many people bought a GTX 750 Ti.
  • AnotherGuy - Wednesday, June 29, 2016 - link

    Great card for its price. If you're not happy with it, prepare to spend double its price and more for a 1070. My money does not grow on trees like yours.
  • xenol - Wednesday, June 29, 2016 - link

    Someone sounds salty.
  • D. Lister - Wednesday, June 29, 2016 - link

    "My money does not grow on trees like yours."

    Maybe there's something wrong with your trees. Get new ones. Y'know, start with ones for small bills and then work your way up. Trust me it will work, and soon you'll have money trees just like everyone else.
  • fanofanand - Thursday, June 30, 2016 - link

    My wife wants to know where your money trees are at, she doesn't believe me that ours isn't producing. :(
  • D. Lister - Thursday, June 30, 2016 - link

    lol, maybe yours aren't at peak production because they're being harvested too often for her shopping sprees. ;)
  • Nickgofly2019 - Wednesday, June 29, 2016 - link

    You could have gotten this level of performance and power consumption for $250-$280 for almost a year now in the GTX 970. Sorry to say, but for AMD to only match a generation-old card for slightly less, two years later, is pathetic.
  • Qasar - Wednesday, June 29, 2016 - link

    Nickgofly2019,
    maybe where you are... but as I said in a previous post, that's not the case everywhere.
  • fanofanand - Thursday, June 30, 2016 - link

    He's full of it. Nobody, and I do mean nobody (without company connections) was buying the 970 for $250 a year ago.
  • modeless - Wednesday, June 29, 2016 - link

    Native FP16! Is it twice the rate of FP32? If so, it might be interesting for deep learning.
  • AntDX316 - Thursday, June 30, 2016 - link

    Still slower and less efficient than NVidia's offerings.

    If only AMD just sold all their assets to Nvidia and Intel, the advancement of technology would be insane. But if the investors/CEOs decided to price gouge insanely and milk money, they would have full control of that. They would be like some of the pharmaceutical companies who charge so much. I think they do that in the beginning so that when other companies make a cheaper version that works, they can lower their price yet keep ALL the gains from the beginning.
  • fanofanand - Thursday, June 30, 2016 - link

    Your economic understanding could use a bit of improvement. Pharmaceutical companies are in the position to price gouge because they are often patent holders of life saving drugs. The day an RX480 saves someone's life (beyond metaphorically speaking) your argument will have merit. Nvidia dropping the price of the 970 as this card was released speaks volumes about the free market and competition being a necessity. Nobody will die without the 480 or 970. AMD selling to Intel and NVidia would essentially leave both of those companies without competition. If you don't see that as a doomsday scenario for the PC Master Race......
  • Ghiacciori - Wednesday, June 29, 2016 - link

    The big marketing trick AMD used in the Polaris release was that their 480 CF could beat the GTX 1080 for $400 (it seems like they were talking about the 4GB version). Seeing that a single 480 can put up a good fight against the 970, it's clear the 480 CF setup could never be up to the task of outperforming the 1080. Now, the real fight would be 480 CF vs GTX 1070, and considering their price tags, the CF setup (I'm talking the 8-gig version here) would not only be more expensive than a single GTX 1070 but would also have twice its power draw... So, it doesn't look like a good deal to me, even assuming multi-GPU gets more support in new titles, which, if experience has something to say, seems unlikely.
    As for whether or not this card in a single-GPU rig is a good choice... it is, since it's cheaper than the 970. However, nVidia still has to release their GTX 1060, and IF it has the same 1070/970 performance and consumption ratio, the GTX 1060 could be a better option than the 480 depending on the price tag.
  • bigjoe980 - Wednesday, June 29, 2016 - link

    hmm, if the Hardwareunboxed crossfire test is accurate, it'll at least have a good leg to stand on...but of course, that's only provided the results weren't fudged at all.... *shrug*
  • Flunk - Wednesday, June 29, 2016 - link

    SLI GTX 970s do actually beat the GTX 1080... in 3DMark and basically nowhere else.
  • Ghiacciori - Wednesday, June 29, 2016 - link

    Just saw the CF results, not just from Hardware Unboxed but from TechPowerUp as well... They're impressive and look very promising. Still, optimization continues to be an issue, with a lot of games not taking advantage of the second GPU.
  • amitp05 - Thursday, June 30, 2016 - link

    They didn't say 480 CF will be $400. They said 2 480s in CF will be <$500 and beat the 1080 ($650).
    A clever statement covering both the 4GB and 8GB cards.
  • FriendlyUser - Wednesday, June 29, 2016 - link

    Thanks for the review! Good job.
    Please post crossfire, trifire and quadfire results. I wonder how it scales.
  • mdriftmeyer - Wednesday, June 29, 2016 - link

    I'll take two of these, enjoy working in Solid Modeling enabled OpenCL apps, FEA/FEM and CFD, while screaming through apps that crunch heavily on OpenCL and can work with the HDR and more. Encoding in H.265 via hardware will be a joy.

    Dual 8GB pipelines for $30 more than a 1070 FE, and you children call it a bust? Nvidia blew their market lead and the next two quarters will show it. AMD's financial statements will back up the volume sales and bring the company back into the black.
  • BrokenCrayons - Wednesday, June 29, 2016 - link

    It's a bit early to forecast the financial performance of the company in the future. We're only seeing a single GPU product's performance at the moment and the competition hasn't yet responded. I'd certainly like to see AMD do better because I benefit as a consumer from them remaining a concern for Intel and NV.

    Besides the monetary aspects, there's a lot to like about the 480. It's got lots of VRAM, performs pretty well, and the price looks reasonable. However, I'm disappointed by the power consumption. My expectations were probably unrealistic given the forces driving the industry, but I was really hoping for more power savings. I miss high end graphics cards that were powered by the bus they were installed in and fit in a single slot.
  • Peichen - Wednesday, June 29, 2016 - link

    Feels like AMD finally caught up with the i5-2500K in price, performance and power, but the game already moved on a long time ago. That's how AMD's CPU division died, isn't it? Always a step or two behind.

    A 14nm card with the performance and power of a 2-year-old 28nm card. If Nvidia is lazy, a die-shrunk 970 on 14nm will match the 480 and be cheaper to make. The new 1060 will probably be even cheaper for Nvidia to make to maintain profitability.

    I am glad there are people still firmly in the AMD camp. I moved to Nvidia over a year ago but we always need someone to make sure Nvidia's mainstream pricing is not over-the-top.
  • Nickgofly2019 - Wednesday, June 29, 2016 - link

    Lmfao, this is amazingly disappointing. I had the feeling when they said "VR ready" that it was just going to be around 970 performance, which is what Nvidia called "VR ready". Newsflash people: 970s have been selling for $260-$300 for about a year now. This "generational" leap to 14nm from AMD was only able to get them to that level of performance, and the power consumption of a generation-old Nvidia 28nm card that's been selling for <$300. Nvidia at this point could just ask sellers to drop their 970 prices to $200 until the 1060/1050 come out, and boom, they'd be "competing" with this shitty, loud, power-hungry-for-the-process card. Nothing's changed, AMD is still largely behind in all aspects; got hopes up too fast. I suspect the 1060 at $300 will have 980 performance and the 1050 will compete with this at $200.
  • DonMiguel85 - Wednesday, June 29, 2016 - link

    Chances are the 1060 will be more or less a 1080 chopped in half with a 192-bit bus, so unless they price it at $250 or less I think the RX 480 will still be a much better buy.
  • AntDX316 - Thursday, June 30, 2016 - link

    There is some law of why NVidia cannot totally shut out AMD. There was news about a problem with Intel and AMD way way back and AMD was fined or sued for like $1.1B. It was some monopoly agreement or something.
  • fanofanand - Thursday, June 30, 2016 - link

    You have it back asswards. Intel was fined for their anti-competitive monopolistic policies.
  • DonMiguel85 - Wednesday, June 29, 2016 - link

    The GTX 1060 is rumored to come in only 3GB and 6GB flavors with a 192-bit bus and 1280 shaders. Even if it can overclock to around ~2GHz, performance will likely be between the 970 and 980. I think this card is still a better buy for the extra 2GB of VRAM. It just consumes more power.
  • none12345 - Wednesday, June 29, 2016 - link

    I was hoping for a little bit more performance, but all in all this card beats anything else in its price range by a large margin. I was expecting it to be a bit slower than a 980, but instead it's a bit faster than a 970 (that bit should grow to a good bit faster in the games just about to be released, or released in the next year, and it will probably be more or less a 980 in those games). If I had $200 to spend, and the 970 dropped another $80 to $200, I'd still get a 4GB 480 over the 970. But as of today the 970's price is way too high to even be a consideration against the 480.

    If I hadn't gotten a graphics card in December, I'd be buying a 480 the second one of the manufacturers releases one with an open-air cooler and an 8-pin power connector (I would wait and not buy the reference card, but I'm patient).

    To those complaining that the 480 is only a bit faster than the 970 that released 2 years ago: well, DUH, but then you are comparing a $330 card to a $240 card, or a $330 card to a $200 card. The 970's price is falling like a rock because of the new cards coming out. It will continue to fall in price until its price/perf matches what is coming out, but again, DUH, that's how the market works... As of right now, checking Newegg, the 970 is priced way too high to even be considered a contender.
  • KoolAidMan1 - Wednesday, June 29, 2016 - link

    One problem is that you really want to spend more on the 8GB model in order to truly compete with the GTX 970. Getting comparable performance to the 970 for $200 is great, except that's not what's actually happening.

    I'm very curious to see what the price and performance on the 1060 end up being.
  • Teknobug - Wednesday, June 29, 2016 - link

    Amazing, a review of an unreleased RX 480 over a month-old GTX 1080?
  • Hrel - Wednesday, June 29, 2016 - link

    Looks like AMD has just become the hardware you buy because you literally can't afford something better.

    I cannot imagine even one scenario in which I wouldn't use Intel and Nvidia to build a computer, henceforth.
  • Meteor2 - Thursday, June 30, 2016 - link

    Well, if you only want to spend $100 on the CPU and $199 on the GPU, I can...
  • Hrel - Thursday, June 30, 2016 - link

    Yeah, I can afford better. Sorry you can't yet, keep working at it!
  • fanofanand - Thursday, June 30, 2016 - link

    Arrogance doesn't play well here Captain GiantWallet. Price is a consideration for 99.9% of consumers, STFU with your one-person use case.
  • praeses - Wednesday, June 29, 2016 - link

    Seems like the RX 480 should have only come with 8Gbps 4GB of RAM, which would have yielded a slight power-efficiency increase and a cost reduction that could have gone toward moving from 6-pin/6-phase to 8-pin and a better cooler. The 6-pin should have been left for the RX 470. I think marketing must have gotten in the way again.
  • fanofanand - Thursday, June 30, 2016 - link

    GDDR5X is still expensive. At $200 some concessions had to be made, nothing to do with marketing.
  • tipoo - Wednesday, June 29, 2016 - link

    Unfortunately for AMD, they're on completely different fabs this time than Nvidia, GloFo vs TSMC. I've wondered if that's part of their efficiency disadvantage. We've seen this with the 6S load testing. That's the thing now: with different fabs, the playing field is not even, and not only does the architecture matter, but the fab process does too when comparing them to Nvidia. Which kind of sucks for AMD.

    With the iPhone it mattered less because it's mostly idle, even with the screen on, but for a high performance GPU it's the full throttle aspect that matters.

    Interesting though that TSMC will still make their high end parts (I don't know if that means just Vega, or the 300 dollar Polaris too), so maybe it's not all lost on the efficiency side if the fabs are to blame.

    I think this is to fulfill the WSA, which makes sense; the higher-end part gets the higher-end fab. The 200 dollar part isn't particularly efficient, but they hit this performance and price.

    They even switched Zen to TSMC after GloFo efficiency concerns.

    So I do have hope that the more expensive TSMC parts will provide them much needed efficiency to go up against Nvidias higher end, and hopefully it doesn't mean Polaris as a whole is just inefficient.
  • T1beriu - Wednesday, June 29, 2016 - link

    1. The $300 Polaris is the AIB RX 480. There are no faster Polaris chips coming, as confirmed by Raja.

    2. Zen will not be built by TSMC. That was a fake rumor. GloFo announced they're working on Zen. Source: http://www.extremetech.com/computing/217664-global...

    3. A couple of months back TSMC released the list of partners building chips on 16nm. AMD wasn't on that list.
  • vladpetric - Wednesday, June 29, 2016 - link

    It seems to me that the leadership of AMD still doesn't get that "drivers matter"... While NVidia does not generally make more computationally powerful cards, they spend a lot of resources on good drivers.

    AMD as we know it today is the marriage of two hardware-first companies (old AMD and ATI). The sad part is that after losing a lot of marketshare, market cap, etc over the last decade, good software is still a second class concern for them.
  • K_Space - Saturday, July 9, 2016 - link

    I'm not sure what world you've been living in, but RTG drivers have been head and shoulders above anything ATI or even 'old AMD Radeon' delivered. Even old GCN cards continue to benefit from them long after their sell-by date.
  • tynopik - Wednesday, June 29, 2016 - link

    pg1: comfortable reach it > comfortably
  • Ryan Smith - Wednesday, June 29, 2016 - link

    Thanks!
  • pavag - Wednesday, June 29, 2016 - link

    Ok. I don't want this card. I want more performance.

    The interesting thing this generation will be the gap between this and the GTX 1070.
  • Bruce Woolloomooloo - Wednesday, June 29, 2016 - link

    It amazes me how many people defend companies as if they're their friend. People need to understand that for $199 it's the best-performing card on the market.
  • AntDX316 - Thursday, June 30, 2016 - link

    And it's #1 in performance per dollar, but performance per watt is what has to remain hidden.
  • akamateau - Wednesday, June 29, 2016 - link

    2 RX 480 8gb for $479.98 Crossfire OUTPERFORMS GTX 1080 for $659.99!!!

    Get em on Amazon AND Newegg.
  • rav55 - Wednesday, June 29, 2016 - link

    RX 480 shines in Crossfire mode.

    2 RX 480 8gb for $479.98 OUTPERFORMS GTX 1080 for $659.99.

    BUY AMERICAN!!
  • vladx - Wednesday, June 29, 2016 - link

    Wow so many AMD shills
  • D. Lister - Thursday, June 30, 2016 - link

    You should make one more fake ID just to make sure someone believes your nonsense.

    Secondly, Nvidia has been an American company from day one, while AMD's GPU division used to be a Canadian company called ATI.

    Thirdly, anyone who bases their purchase decision solely on the nationality of a company is an idiot, period. Especially in this day and age, where everyone and their uncle is outsourcing to China and India.
  • AntDX316 - Thursday, June 30, 2016 - link

    For tools, food, and gas, 100% agreed: buy American.

    But if you took all your electronics and found out you had to pay something like 4x what they're worth, to a body like the IRS, because you insisted on the exact same quality made in America, would you still agree to buy American? I'm just saying that all the money saved by outsourcing would have to be paid back if we pretended everything had to be made here, with all the added cost of the salaries everyone here would need. Would you file for bankruptcy or take out a huge loan?
  • catavalon21 - Wednesday, July 13, 2016 - link

    The review HardOCP did on the 480 in CF mode against the 1080 and 1070 suggests statement missed the mark. Great review they did. If you're serious about the subject, give it a look.
  • catavalon21 - Wednesday, July 13, 2016 - link

    The review HardOCP did on the 480 in CF mode against the 1080 and 1070 suggests YOUR statement missed the mark...if only I could type, or proofread, or something.
  • AbbieHoffman - Wednesday, June 29, 2016 - link

    Well! I was going to buy the RX 480 to replace my GTX 970, but it looks like there is no point! I really thought the 480X was going to perform better than the 980.
  • Meteor2 - Thursday, June 30, 2016 - link

    If you want to replace your 970 you're going to have to buy a 1070.
  • Laststop311 - Thursday, June 30, 2016 - link

    Hopefully this brings the price of the 1070 down to $299.99 for the custom ones.
  • vladx - Thursday, June 30, 2016 - link

    Good luck with that
  • amitp05 - Thursday, June 30, 2016 - link

    AMD needs to push performance UP by 15% and power consumption DOWN by 15% to make this card truly tempting and to match the hype they created.

    But I'll still buy AMD. We need them :(

    AMD: Please don't hype Zen too much. It feels bad when the expectations you created are not met.
  • AntDX316 - Thursday, June 30, 2016 - link

    If you want to support AMD just get XB2 and PS5.
  • D. Lister - Thursday, June 30, 2016 - link

    Nah, the last time my GPU died, I spent several months on a crappy iGPU. AMD, or ANY company for that matter, didn't come to support me. So why should I support any of them?
  • GPU2016follower - Thursday, June 30, 2016 - link

    I don't even know why I come here, maybe just out of curiosity, but I don't trust AnandTech and their biased reviews, always in favor of Nvidia cards. In the majority of other websites' reviews, to name just a few: TechSpot, Forbes, Polygon, Ars Technica, PC Gamer, ... the RX 480 easily dominates the GTX 970 in 95% of gaming benchmarks by an average of 5 to 10 fps, and in a few cases it manages to trail the GTX 980 by only 2 or 3 fps.

    I think I will wait for the custom versions to see if they can offer better performance; maybe we will see Sapphire, ASUS, and XFX RX 480s beefed up with more powerful OC versions to compete against the GTX 980.
  • AntDX316 - Thursday, June 30, 2016 - link

    The microstutter is way higher than on Nvidia's offerings.

    Unfortunately, things like adaptive vsync/G-Sync, microstutter, and frame delivery times aren't commonly discussed. FPS is what most gamers look at and solely chase, I believe because they are too busy with what they were taught before, and/or with whatever else they are doing in life, to keep up to date on what actually matters for the best gaming experience and why. It took a while for people to move past the "you can only see 24/30 fps and no more" idea.
  • Badelhas - Thursday, June 30, 2016 - link

    No Virtual Reality benchmarks? This doesn't sound like AnandTech. That's the main reason I am interested in GPUs nowadays. What a disappointment...
  • T1beriu - Thursday, June 30, 2016 - link

    As you can read in the headline, this was a PREVIEW.
  • Ryan Smith - Friday, July 1, 2016 - link

    The VR benchmarking situation is currently poor. I'll have something for the full review, but we don't have any good VR benchmarks like we do for standard displays. VRMark and VRScore should be released soon.
  • IntoGraphics - Thursday, June 30, 2016 - link

    Any MXGP2 players here?
    What are your thoughts about RX 480 8GB playing MXGP2 at 1440p@60fps?
    Recommended GPUs are GTX 970 and R9 390. But this video shows a GTX 970 in-game already at 100% at 1080p with maxed out settings.
    I'm assembling an X99 based PC. i7-6800K, 32GB 3200MHz DDR4.
  • IntoGraphics - Thursday, June 30, 2016 - link

    Link to the vid : https://youtu.be/e8y3IR-7YIU
  • D. Lister - Thursday, June 30, 2016 - link

    You're "into graphics", yet instead of going for a "mid-end CPU/mid-end GPU" combo, you would rather go for "high-end CPU/Low-end GPU"? If e.g., you went instead with an i5/1070, you would probably be able to down-sample from 3K to 1440p and still have a locked 60FPS.
  • fanofanand - Thursday, June 30, 2016 - link

    He is trying to get "followers" for his youtube channel. Don't even click the link or you will drive up his view count, which will only encourage him to spam comments with more links to his video.
  • D. Lister - Thursday, June 30, 2016 - link

    No worries, I never click on random forum links, unless it is porn, lol. Just kidding.
  • IntoGraphics - Thursday, June 30, 2016 - link

    Are the reference RX 480 cards going up against other reference cards in this comparison?
    Or were some or all of their competitors equipped with aftermarket coolers and factory overclocks?
    What was compared here?
  • sparrowkhx - Thursday, June 30, 2016 - link

    AMD hype again. 2 RX 480s probably beat a single 1070 while consuming twice the power and ultimately costing a similar amount... They definitely aren't going to beat a 1080, and I'm even more glad that I purchased an EVGA 1070 FTW for $429.

    Additionally, I haven't been on AnandTech since the 1080 preview. Seriously, what is the deal with all these previews and missing reviews? I'm more interested in reading the reviews than random articles, and other places are actually doing full reviews, not previews.
  • fanofanand - Thursday, June 30, 2016 - link

    Amazing that you have managed to purchase a card no one has for sale. Not Newegg, not Amazon. Go ahead and tell us all where you were able to procure an unreleased card.
  • catavalon21 - Wednesday, July 13, 2016 - link

    "2 RX 480s probably beat a single 1070 while consuming twice the power"

    Actually, very close to double the power in a particular gaming selection (87% more was their assessment)

    http://www.hardocp.com/article/2016/07/11/amd_rade...
  • slickr - Thursday, June 30, 2016 - link

    The only people who care, or cared, about whether power consumption is 150W or 170W are the paid, bought-for shills who write for Nvidia and who commit fraud and scams against their readers; they are committing crimes.

    Nvidia won because of the shills.
  • D. Lister - Thursday, June 30, 2016 - link

    You get it man, you've got it all figured out! Yours is the level of comprehension that lesser men like myself strive for all their lives, and still fall short by miles. To hell with trivialities like hardware components, a mind like yours should be tackling questions like "the meaning of life" and whatnot.
  • fanofanand - Thursday, June 30, 2016 - link

    A+ on the hyperbole (is your name an homage to Dean Lister, the MMA fighter?) but his original statement isn't entirely wrong. Few desktop users will notice the difference between 150 watts and 170 watts. I don't know about the tinfoil hat stuff, but the original premise isn't completely invalid.
  • D. Lister - Friday, July 1, 2016 - link

    "is your name an homage to Dean Lister, the MMA fighter?"

    Actually it is an homage to "David 'Dave' Lister", from the "Red Dwarf" books. :)

    "Few desktop users will notice the difference between 150 watts and 170 watts. I don't know about the tinfoil hat stuff, but the original premise isn't completely invalid."

    The thing is, no one terribly cares about the 480 slightly overshooting its TDP. What they're complaining about is AMD's choice to stick a 6-pin connector on the reference card instead of an 8-pin one, which forces the card to compensate by over-drawing from its slot. Nonetheless, I was really more concerned about the tinfoil hat stuff.
  • andrewaggb - Thursday, June 30, 2016 - link

    Nah, power consumption is a big deal to some people. I agree that 150W vs. 170W is nothing, but pulling an extra 100W or more for the same performance isn't great. It's heat that ends up in the room with you, and it means bigger heatsinks or faster fans, etc.

    In laptops, power consumption is everything. AMD needs to get it under control, simple as that. I think the RX 480 and 470 will be fine, but if they are already pulling 1070-level power numbers, what will Vega pull? It'll probably use HBM2 and get power savings there, but what about the GPU itself?
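
    (A back-of-the-envelope sketch of the "extra ~100W" point above; the daily hours and electricity price below are assumptions for illustration, not figures from the review.)

        # All inputs are assumptions for illustration, not measured figures.
        extra_watts = 100        # additional draw vs. a more efficient card
        hours_per_day = 3        # assumed daily gaming time
        price_per_kwh = 0.12     # assumed electricity price in USD per kWh

        kwh_per_year = extra_watts / 1000 * hours_per_day * 365
        cost_per_year = kwh_per_year * price_per_kwh
        print(f"~{kwh_per_year:.0f} kWh/year extra, roughly ${cost_per_year:.0f}/year")
        # Every one of those watt-hours also ends up as heat in the room.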
  • jayfang - Thursday, June 30, 2016 - link

    Nice, something that makes sense to replace my HD 6850 with.

    Intel / AMD please step up - still rocking i5-2500K
  • drgigolo - Thursday, June 30, 2016 - link

    Where is the GTX 1080 review? Or 1070 for that matter.
  • cocochanel - Thursday, June 30, 2016 - link

    You didn't read all the comments. It's coming out in a few days.
  • ffleader1 - Thursday, June 30, 2016 - link

    Can you maybe redo the test with the updated driver?
    It seems that the newer driver really makes a HUGE difference.
    https://www.reddit.com/r/Amd/comments/4qiffg/rx_48...
  • Ryan Smith - Friday, July 1, 2016 - link

    This was done with the latest driver to begin with: 16.6.2. No other driver was distributed to the press, AFAIK.
  • Falko83 - Thursday, June 30, 2016 - link

    I hope that in the review you can also address 1080p 144hz gaming.
  • davide445 - Thursday, June 30, 2016 - link

    "Relative to last-generation mainstream cards like the GTX 960 or the Radeon R9 380, with the Radeon RX 480 we’re looking at performance gains anywhere between 45% and 70%, depending on the card, the games, and the memory configuration. As the mainstream market was last refreshed less than 18 months ago, the RX 480 generally isn’t enough to justify an upgrade"
    This is something I didn't understand. A 45-70% performance increase isn't enough for a card that requires less power and costs less?
    Also, where is the problem with the power consumption being above the Nvidia equivalent? At 150W I doubt anyone upgrading needs a new PSU. I suppose that below a certain threshold, power draw isn't part of the equation anymore; otherwise the next request will be for a GPU that produces power rather than needing some...
  • dragonsqrrl - Thursday, June 30, 2016 - link

    Ryan addressed some of this in response to an earlier comment:

    "Although it's not an objective metric, generally I'm looking for an average 65%+ performance increase to justify replacing a video card. Against 380 in particular, 480 doesn't quite reach that mark."

    150W TDP wouldn't be a problem in a vacuum, but unfortunately AMD is competing against Nvidia, not just at the midrange, but the whole product stack including the high-end. That means AMD is going to have more trouble scaling performance this coming generation, not unlike the last. In modern microarchitectures performance and efficiency are basically the same thing because we've hit the TDP ceiling for these form factors. I think Ryan actually explained it quite elegantly:

    "power efficiency and overall performance are two sides of the same coin. There are practical limits for how much power can be dissipated in different card form factors, so the greater the efficiency, the greater the performance at a specific form factor. This aspect is even more important in the notebook space, where GPUs are at the mercy of limited cooling and there is a hard ceiling on heat dissipation."
  • davide445 - Thursday, June 30, 2016 - link

    Maybe I'm asking for too much objectivity. Asking for a +65% performance increase means nothing without a specific reason. The 960 vs. the 760 shows a 6% increase in 3DMark performance, the 760 vs. the 660 a 20% increase, and the 660 vs. the 560 a 55% increase (data from GPUBoss). So why is 45-70% (average ~55%) not enough? It's the same as you'd get across two generations of the Nvidia equivalent.
    About TDP: OK for competition, but what I'm saying is that it's no longer as relevant. It mattered when an AMD GPU was 200-250W hungry and the Nvidia equivalent drew 100W less, but at 150W does maybe 50W really make a difference to anyone? I also didn't understand the point about whole-stack competition; we need to compare at the same price level: the 480 vs. the 380, 390, 960, or 970. The 1070 is available on Newegg for $430 minimum, and the 1060 isn't available at all.
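
    (For reference, the percentage figures being debated here are simple ratios; a quick sketch with placeholder scores, not real benchmark results, shows how a +65% upgrade threshold would be applied.)

        # Placeholder scores, not real benchmark numbers.
        def uplift(new_score, old_score):
            """Percentage performance increase of new_score over old_score."""
            return (new_score / old_score - 1) * 100

        old_card, new_card = 100, 155          # e.g. an average ~55% generational gain
        gain = uplift(new_card, old_card)
        print(f"Uplift: {gain:.0f}% -> clears a 65% bar: {gain >= 65}")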
  • dragonsqrrl - Thursday, June 30, 2016 - link

    It sounds like you're under the impression he recommended upgrading from a 760 to a 960; he did not.

    "Also didn't understand about the whole stack competition"

    Then am I correct in assuming you don't understand the significance of TDP ceilings for a given form factor? If you have a less efficient architecture, you're going to have a less performant GPU once you approach that ceiling. It's not as concrete for desktop discrete GPUs as it is for notebooks, but there are still upper limits, which tend to be around 250-300W. AMD can always price competitively, but the last thing they need right now is a repeat of last gen. They need competitive 'products' in their lineup, and for AMD, in terms of financial and competitive viability, that means a lot more than just price/performance ratios. This has less to do with the RX480 right now and more to do with prospects for the rest of the generation.
  • Ananke - Thursday, June 30, 2016 - link

    Let me simplify it for you: the question is "What can I get for $200?" RX 480, R9 380, GTX 960, GTX 950.
    Very simple choice at the moment.
  • dragonsqrrl - Thursday, June 30, 2016 - link

    @Ananke

    Yes, yes it is. But if you read the last sentence in my previous comment, and the rest of my responses in the thread, you'd hopefully realize that's beside the point.
  • davide445 - Thursday, June 30, 2016 - link

    I suppose it depends what these reviews are meant for. IMHO (and of course this is my personal opinion) they need to be guides useful for an informed purchase, similar to what you'd expect from stock investment advice. So you need to be specific about your assumptions and audience.
    These cards are not for mobile, so mobile is not in the equation. We are discussing a specific card, not a corporate or tech strategy. We are mostly discussing gaming or DX performance, so the audience is using a PC and isn't seriously considering an integrated GPU. Given how the PC market is going, most readers need to decide whether to upgrade, not whether to buy their first card. So the decision this review needs to address is which GPU on the market these people should purchase, given a target price and average PC specs.
  • Hurn - Thursday, June 30, 2016 - link

    The real question here is why the R9 380 beats the pants off the R9 380X in many tests.
    Example: Dirt Rally, 1920x1080 Ultra Quality. The 380 gets 64.3, while the 380X only gets 33.1. Half the speed from a card that's supposed to be faster?? Investigation needed!
  • Ryan Smith - Friday, July 1, 2016 - link

    Thanks.

    It looks like I erred when transcribing the results into the database. I've gone through and corrected the charts.
  • Archie2085 - Friday, July 1, 2016 - link

    @Ryan Any possibility you can cover what leakage is like on this process? Do lower temps lead to lower power draw without reducing clocks, either by changing coolers or by lowering the ambient temperature with cold air?
    I've been seeing posts about a disproportionate increase in temps and power draw.
  • FourEyedGeek - Thursday, June 30, 2016 - link

    I'll wait for the RX 490.
  • Locut0s - Friday, July 1, 2016 - link

    So I guess like the 1080 this will be a "preview" that we will never get an actual review of.
  • pencea - Friday, July 1, 2016 - link

    Yup exactly.

    AnandTech still hasn't done a review of either the GTX 1070 or the 1080, and now the 480. Other major sites have already done both reference and custom reviews, along with SLI testing on the Nvidia cards.

    Unacceptable for a site like this.
  • X-Alt - Friday, July 1, 2016 - link

    Let's look at it this way

    2900XT->7970->R9 290X->Fury X->???
    6970->7950->R9 290->Fury Nano\390X->???
    6950->7870->R9 280X->R9 380X->RX480
    6870->7850->R9 270X->R9 380->???

    All that matters is how the 1060 stacks up
  • dani_dacota - Friday, July 1, 2016 - link

    Ryan, did you have a separate 4GB card to test, or did you switch the VRAM config in the BIOS? The reason I ask is that I'm wondering whether the extra 4GB of VRAM might be pushing the RX 480 to post power consumption figures as high as the 970's, even though the GPU chip itself is more efficient. If the extra 4GB of VRAM consumes 15-20W of power itself, then the power efficiency numbers might improve significantly compared to the 970.
  • FreeKill - Friday, July 1, 2016 - link

    *Wondering what year we'll get the REAL full version of this GPU as a re-badge.* I completely understand chip harvesting to sell the highest percentage of chips out of a fab, but I can't help but be pissed off with AMD for advertising to everyone that this is a fully enabled chip, as I highly doubt it is. I believe it has 40 CUs and 2560 SPs and is currently neutered in order to hit their marketing points (power, price). They've done this with virtually every newly released GPU for years now, so I should be used to it, but time will tell if there's suddenly a 40 CU, GDDR5X Polaris-based card in 12 months badged as a 570. I think Tahiti (7970) and Fiji (Fury X) are the only two GPUs they haven't neutered and misrepresented from day 1.

    Several examples:
    Tonga:
    http://www.kitguru.net/components/graphic-cards/an...
    Hawaii:
    http://forums.guru3d.com/showthread.php?t=385046
  • Tams80 - Saturday, July 2, 2016 - link

    I don't understand your issue with this, other than perhaps a concern about wasting natural resources.

    As long as AMD provides what it claims to, at the price it states, then what is the issue? Sure, they could make something better, but that is not the market they have targeted with this product.
  • D1v1n3D - Friday, July 1, 2016 - link

    I think it is funny: did all these Nvidia people forget where the 970 landed cost-wise, and then the limped-along 3.5GB of RAM? AMD has many more models to come, for instance the GDDR5X models and the HBM2 models; AMD is just bringing in cash flow off a very capable mid-range card. I will be buying one for my mini-ITX build that is currently running a 1GB HD 6950. Talk about an upgrade, at such an amazing value, and it lets me avoid stressing my 450W PSU. You guys and your unrealistic CrossFire: the latency alone would drive me up the wall, too much fluctuation from so many sources (GPU x2 and CPU). Maybe when it's two HBM2 cards crossed, the latency will be nonexistent.

    Currently waiting for an FM2+ refresh; I sure hope they have one more before FM3(+) boards come out.
  • monohouse - Sunday, July 3, 2016 - link

    It's a "pray-view": pray for AMD!
  • monohouse - Sunday, July 3, 2016 - link

    "Wise gamers" ? is such a thing even exists ?, do you know I play Doom 2 on a GTX 780 Ti ? so what ? more graphic resources are never a bad thing, I also play many DX9 games on windows XP so what ? a technologically fast video card benefits not only the bloat-filled spyware-infested windows 10, it's also good for accelerating old software (that is, if you can make it work / are given drivers to make it work)
  • monohouse - Sunday, July 3, 2016 - link

    But now that you mention smart gamers, this is how I see it: there are 2 types of gamers:
    the first type are the ones that don't know what they are going to run on the system; their load is variable and they jump from one game to another, never knowing which game they will run next because they are waiting for its release,

    and the second are the ones that do know ahead of time what they will run, and will not run anything but what they know.

    I classify them as static and dynamic types of gamers. So is the RX 480 good or not good? The question, in my opinion, is more complex than it seems, because I believe there is more to video cards than just performance, watts, heat, noise, and price. You have to look at the bigger picture: at things like OpenCL (and you know that AMD is pretty good at that); at hardware quality, meaning how precise the graphics calculations are (also a department where AMD is usually better, excluding the brilinear filtering of the 9600XT); at the image quality produced by the card(s); and at stuttering (if there is any, and how much). All these aspects are hardware related, but hardware does not exist on its own; it exists together with the driver, so no less important than hardware quality is software quality (a department which is usually pretty bad with AMD): how stable is the driver? Are there any BSODs? Are there any rendering bugs? And of course, how does the driver perform?

    How does this relate to the types of gamers? Consider it: if you know what you are going to run on the card, whether an OpenCL program or some very specific games, then all you need is to find out how the card (with its driver) handles that specific load, and based on that decide whether you want to buy/use it.

    But for the dynamic gamer this is much more complex, because there is no way to predict what the software/load is going to be, so the decision of whether or not to buy is more difficult. So here is my way:

    Static gamer: buy what suits your load. If the RX 480 can run your OpenCL/games well enough, stably and fast enough while looking good, then you have no problem buying it.

    But for the dynamic gamer I have to disagree with all the posts mentioned here, and with all the estimating clichés like "1080p gaming". There is no line you can draw where all games have an equal load at a given resolution, and there is no such thing as "this card will last me 2 years at 1080p". For the dynamic gamer this strategy is flawed, because nobody knows the future (that being said, console generations do have an impact, because modern games are console games), and in addition some games are pre-built for a specific FPS target, and game engines are designed differently. So for the dynamic gamer I would not recommend the RX 480, because its performance already struggles with games that exist in the present; for the dynamic gamer the best bet is the highest-performing card (only under the condition that it runs every already-released game at higher than the FPS you require).
  • K_Space - Saturday, July 9, 2016 - link

    Apologies for reading your comment so late; however, it was not posted under the parent thread:
    1) Why wouldn't wise gamers exist? :) "The gamer self" does not exist in a vacuum; a wise human who plays games is a wise gamer.
    2) Your analysis of the static vs. dynamic gamer is really well put, though your use cases are broader, so "user" is more apt than "gamer" (and really fits with the "gamer does not exist in a vacuum" statement earlier). You have put in plenty of caveats, so we essentially agree: whilst no one can predict the future precisely, one can certainly develop some foresight by looking at trends as well as typical use scenarios to predict future use cases. It's what IT departments do all the time anyway; "wise" gamers are no different :) Just as you'd have predicted, after DX11 was released, that most if not all AAA games would feature the buzzword tessellation, you would rightly predict that with the current generation of consoles future AAA games will be more VRAM-heavy than current ones, multi-core CPUs will be utilised more than they are in current DX11 games, etc. Ditto with the 1080p statement: if I'm planning a future GFX purchase, I'd factor in whether I'm sticking with a native 1080p display or moving to a denser display, and act accordingly. By your definition a dynamic gamer does not know what their future games will demand, but their interests are typically static even if the games change; thus if, hypothetically speaking, I like MOBAs or indie games, I'd hazard a good guess that a 480 and not a 1070 is what I need for future indie/MOBA games. Remember, these "predictions" don't need to be pinpoint accurate, just something that will get me to my next upgrade cycle.
    3) RTG has been quite spectacular with their driver support releases; previous GCN cards have been reaping those benefits to this day. I also have the benefit of hindsight in considering how promptly they dealt with the 480 power-draw fiasco.
  • dmark07 - Sunday, July 3, 2016 - link

    I'm a little confused as to when benchmarks were performed for the NVIDIA 1070 FE card. The only article on AnandTech for this generation of NVIDIA cards only has benchmarks for the 1080. I wish your team had written an article about the 1070 before adding numbers for it to this article. As it stands, this either indicates someone used the 1080 numbers and labeled them as 1070 FE, or it was a biased move with unpublished numbers that could have been pulled out of thin air... This is very uncharacteristic of AnandTech. Don't get me wrong, I've been looking forward to a full breakdown of the 1070, but the AnandTech team used to stick to the standard of performing a write-up on a card prior to using benchmarks for that card in comparisons.
  • Ranger1065 - Tuesday, July 5, 2016 - link

    Speaking as someone who has happily visited this site for many years, it's clear the Anandtech "team" is not what it used to be. I'm tired of visiting here only to NOT find the reviews I'm interested in and so I visit Anandtech less and less frequently. It's a shame because I know the staff are capable of writing really good reviews but my patience and faith in Anandtech is just about exhausted. Other sites may not have the detail that Anandtech does (if and when they actually post a review) but the difference is not huge and at least they do actually post reviews that people care about.
  • Murloc - Tuesday, July 5, 2016 - link

    this is not a scientific publication, they can put whatever "unpublished" numbers they want in there.
    But yes, the issue is that they've become slow at posting and are unable to solve the issue.
  • Keinz - Tuesday, July 5, 2016 - link

    Each and every time I bother reading the comments I'm reminded why free speech for the dumb is a very, very, VERY bad thing.
  • beast6228 - Wednesday, July 6, 2016 - link

    Too bad you didn't add the R9 295X2 to the benchmarks; it would have destroyed most of the cards.
  • carex - Wednesday, July 6, 2016 - link

    Where is the 970 in the Battlefield 4 bench?
  • fxv300 - Thursday, July 7, 2016 - link

    Got mine yesterday and installed it.
    Tried it with an Astra 28.2 Freesat transmission, 4K @ 50 fps:
    https://youtu.be/4r8lUJGZT84
    You can see all the sensors and GPU power usage.
    All good so far.
  • Oxford Guy - Friday, July 8, 2016 - link

    "The $199 price tag means that AMD can’t implement any exotic cooling or noise reduction technologies, though strictly speaking it doesn’t need them"

    1) Dual fan open air coolers are hardly "exotic".

    2) It does need them. 50 dB is awful.
  • k3rast4se - Monday, July 11, 2016 - link

    I've created a petition regarding the lack of reviews on anandtech.com. Basically we are just asking for an explanation of the situation, as there might be a genuine reason for it. Please support this!

    https://twitter.com/k3rast4se/status/7525464697317...
  • SaolDan - Thursday, July 14, 2016 - link

    You must be joking. Neither Ryan nor any of the other writers owes you any explanations at all. Get a life, dude.
  • carex - Tuesday, July 19, 2016 - link

    What cheating! Seriously, the 970 was completely removed from the Battlefield 4 bench against the 480, and they didn't test Vulkan at all. They must have been paid by Nvidia.
  • mikato - Friday, July 29, 2016 - link

    Radeon HD 6950 (unlocked to mostly 6970) here. I think it may be a good time for an upgrade finally :) The price is right!
  • Olegendary - Wednesday, September 7, 2016 - link

    Sorry for the off-topic comment, but I need your help: would anyone suggest the equivalent of the Radeon RX 480, but for CPUs? The RX 480 seems to be a candy at that price, but I don't know of any CPU which is both powerful and cheap (well, $200 isn't cheap, but you know what I mean). Thanks.
