Ashes of the Singularity Revisited: A Beta Look at DirectX 12 & Asynchronous Shading
by Daniel Williams & Ryan Smith on February 24, 2016 1:00 PM EST

The Performance Impact of Asynchronous Shading
Finally, let’s take a look at Ashes’ latest addition to its stable of DX12 headlining features: asynchronous shading/compute. While earlier betas of the game implemented a very limited form of async shading, this latest beta contains a newer, more complex implementation of the technology, inspired in part by Oxide’s experiences with multi-GPU. As a result, async shading will potentially have a greater impact on performance than in earlier betas.
Update 02/24: NVIDIA sent a note over this afternoon letting us know that asynchronous shading is not enabled in their current drivers, hence the performance we are seeing here. Unfortunately they are not providing an ETA for when this feature will be enabled.
Since async shading is turned on by default in Ashes, what we’re essentially doing here is measuring the penalty for turning it off. Not unlike the DirectX 12 vs. DirectX 11 situation – and possibly even contributing to it – what we find depends heavily on the GPU vendor.
All NVIDIA cards suffer a minor regression in performance with async shading turned on. At a maximum of -4% it’s really not enough to justify disabling async shading, but at the same time it means that async shading is not providing NVIDIA with any benefit. With RTG cards on the other hand it’s almost always beneficial, with the benefit increasing with the overall performance of the card. In the case of the Fury X this means a 10% gain at 1440p, and though not plotted here, a similar gain at 4K.
These findings do go hand-in-hand with some of the basic performance goals of async shading, primarily that async shading can improve GPU utilization. At 4096 stream processors the Fury X has the most ALUs out of any card on these charts, and given its performance in other games, the numbers we see here lend credence to the theory that RTG isn’t always able to reach full utilization of those ALUs, particularly on Ashes. If that’s the case, async shading could be a big benefit going forward.
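The utilization argument can be sketched with a toy model (purely illustrative numbers, not Oxide’s actual scheduler): if some fraction of the frame’s compute work can execute concurrently in the ALU bubbles left by graphics work, frame time approaches the longer of the two workloads rather than their sum.

```python
def frame_time_ms(graphics_ms, compute_ms, async_enabled, overlap=1.0):
    """Toy frame-time model. 'overlap' is the fraction of compute work
    that can hide inside idle ALU time while graphics is running."""
    if not async_enabled:
        # Serial submission: compute waits for graphics to finish
        return graphics_ms + compute_ms
    hidden = compute_ms * overlap          # runs concurrently with graphics
    exposed = compute_ms - hidden          # still has to run afterwards
    return max(graphics_ms, hidden) + exposed

# Hypothetical workload: 12 ms of graphics plus 3 ms of compute per frame
serial = frame_time_ms(12.0, 3.0, async_enabled=False)      # 15.0 ms
overlapped = frame_time_ms(12.0, 3.0, async_enabled=True)   # 12.0 ms
print(f"frame time {overlapped} ms vs {serial} ms serial")
```

Under this model a GPU that already keeps its ALUs busy (overlap near zero) gains nothing from async shading, and in practice may even pay a small scheduling cost, which is consistent with the small regressions on the NVIDIA cards above.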
As for the NVIDIA cards, that’s a harder read. Is it that NVIDIA already has good ALU utilization? Or is it that their architectures can’t do enough with asynchronous execution to offset the scheduling penalty for using it? Either way, when it comes to Ashes NVIDIA isn’t gaining anything from async shading at this time.
Meanwhile pushing our fastest GPUs to their limit at Extreme quality only widens the gap. At 4K the Fury X picks up nearly 20% from async shading – though a much smaller 6% at 1440p – while the GTX 980 Ti continues to lose a couple of percent from enabling it. This outcome is somewhat surprising since at 4K we’d already expect the Fury X to be rather taxed, but clearly there’s quite a bit of shader headroom left unused.
153 Comments
Kouin325 - Friday, February 26, 2016 - link
yes indeed they will be patching DX12 into the game, AFTER all the PR damage from the low benchmark scores is done. Nvidia waved some cash at the publisher/dev to make it a GameWorks title, make it DX11, and to lock AMD out of making a day 1 patch. This was done to keep the general gaming public from learning that the Nvidia performance crown will all but disappear or worse under DX12. So they can keep selling their cards like hotcakes for another month or two.
Also, Xbox hasn't been moved over to DX12 proper YET, but the DX11.x that the Xbox One has always used is by far closer to DX12 than DX11 for the PC. I think we'll know for sure what the game was developed for after the patch comes out. If the game gets a big performance increase after the DX12 patch then it was developed for DX12, and NV possibly had a hand in the DX11 for PC release. If the increase is small then it was developed for DX11.
Reason being that getting the true performance of DX12 takes a major refactor of how assets are handled and pretty major changes to the rendering pipeline. Things that CANNOT be done in a month or two or how long this patch is taking to come out after release.
Saying "we support DirectX 12" is fairly easy and only takes changing a few lines of code, but you won't get the performance increases that DX12 can bring.
Madpacket - Friday, February 26, 2016 - link
With the lack of ethics Nvidia has displayed, this wouldn't surprise me in the least. Gameworks is a sham - https://www.youtube.com/watch?v=O7fA_JC_R5s

keeepcool - Monday, February 29, 2016 - link
Finally!..I can't even grasp the concept of how low rez and crappy the graphics look on this thing and everybody is praising this "game" and its benchmarks of dubious accuracy.
It looks BAD, it's choppy and pixelated, there is a simple terrain and small units that look like sprites from Dune 2000, and this thing makes a high-end GPU cry to run at 60 fps??....
hpglow - Wednesday, February 24, 2016 - link
No insults in his post. Sorry you get your butt hurt whenever someone points out the facts. There are few DirectX 12 pieces of software outside of tech demos and canned benchmarks available. Nvidia has better things to do than appease the arm-chair quarterbacks of the comments section. Like optimize for games we are playing right now. Whether Nvidia cards are getting poor or equal performance in DX12 titles to their DX11 counterparts is irrelevant right now. We can talk all we want, but until there is a DX12 title worth putting $60 down on and that title actually gains enough FPS to increase the gameplay quality, the conversation is moot. Your first post was trolling and you know it.
at80eighty - Wednesday, February 24, 2016 - link
there is definitely a disproportion in responses - in the exact inverse you described. Review your own post for more chuckles.
Flunk - Thursday, February 25, 2016 - link
What? How dare you suggest that the fans of the great Nvidia might share some of the blame! Guards, arrest this man for treason!

Mondozai - Thursday, February 25, 2016 - link
"No insults in his post." Yeah, except that one part where he called him a fanboy. Yeah, totally no insults.
Seriously, is the Anandtech comment section devolving into Wccftech now? Is it still possible to have intelligent arguments about tech on the internet without idiots crawling all over the place? Thanks.
Mr Perfect - Thursday, February 25, 2016 - link
Arguments are rarely intelligent.

MattKa - Thursday, February 25, 2016 - link
If fanboy is an insult you are the biggest pussy in the world.

IKeelU - Thursday, February 25, 2016 - link
"Trolling" usually implies deliberate obtuseness in order to annoy. Itchypoot's posts read like a newb's or fanboy's (likely a bit of both) who simply doesn't understand how evidence and logic factor into civilized debate.