Original Link: https://www.anandtech.com/show/2004



Introduction

Picking up from where we left off with the X800 Pro, we are back again to wrap up our Looking Back series on the evolution of driver performance. As we saw previously, ATI posted some significant performance gains for both the R420 architecture and its relatively similar R300 predecessor. Improving performance by over 20% in most games, and in some cases outright doubling it, ATI showed they were capable of wringing a great deal of performance out of their drivers long after the hardware was finalized.

Of course taking apart the Catalyst drivers is just the beginning. NVIDIA, with their even more infamous reputation for driver-based performance increases, can't be ignored, so as promised we are back to take a look at just what NVIDIA managed to do with their ForceWare drivers and their NV40-based 6800 Ultra. With a new architecture behind their products as opposed to ATI's more time-tested R300/R420 architecture, we've certainly had high hopes for what NVIDIA could do with the NV40. As we mentioned previously, ATI managed to set a very high bar with the X800 Pro, so now with the 6800 Ultra taking its turn under the knife, we finally get a chance to answer the burning question: who really got more performance out of their drivers in the last generation, and does it really make a difference?

With that in mind, our overall objective in doing this still has not changed. As a recap from our first article:

When the optimizations, the tweaks, the bug fixes, and the cheats are all said and done, just how much faster has all of this work made a product? Are these driver improvements really all that substantial, or is much of this over-exuberance and distraction over only minor issues? Do we have any way of predicting what future drivers for new products will do?



NV40 & The Test

Coming off of the extremely poor showing of their 5000 series and its associated Detonator drivers, the NV40 represents an interesting design from NVIDIA, born amid the bitter taste of driver scandals and overall inferior performance. NV40 not only brought the introduction of Shader Model 3.0, further blurring the line between a CPU and a GPU with the addition of more high-level programming abilities, but also reintroduced the world to SLI when the NV40 went to PCI Express. Overall, the NV40 is a very different design than anything preceding it, giving NVIDIA a blank slate from which to work out performance improvements.

It should be noted that our driver selection policy is a little different for NVIDIA than it was for ATI, due to NVIDIA's previously inconsistent official driver release schedule, which did not see official drivers released often enough to fill our intended 2-month gap between driver revisions. Because of this, some of the drivers used in this test are NVIDIA driver builds that were not directly released to the public by NVIDIA itself (e.g. releases by OEMs who received multiple NVIDIA builds). However, given the simply enormous number of such drivers, we used only Windows Hardware Quality Labs (WHQL) certified drivers, which means these are drivers NVIDIA was confident enough to release in final form and submit to Microsoft for testing. Overall, these drivers still stick to an approximate 2-month gap, making the results as comparable as possible with our ATI Catalyst results.

What hasn't changed is our primary game list, though we'll be adding a few additional titles with more limited testing (covered later in this article). As with the X800 Pro, we'll be conducting extensive testing with the following games:
  • X2: The Threat
  • Doom 3
  • Half-Life 2
  • Far Cry
  • Battlefield 2
  • 3DMark05
  • D3DAFTester
We have also set aside several games that are console ports, for a comparison of how console development influences performance changes (more on that later). Those games are:
  • Final Fantasy XI, Benchmark 2
  • Need For Speed Underground 2
  • The Chronicles of Riddick
  • Serious Sam 2
Our benchmarking setup for this series also remains unchanged, and is once again the following:

Benchmarking Testbed
Processor: AMD Athlon 64 3400+ (Socket 754)
Motherboard: Abit KV8-MAX3
Memory: 2GB DDR400 RAM (2-2-2 timings)
Hard Drive: 120GB Maxtor DiamondMax Plus 9
Power Supply: Antec TruePower 430W

All tests were done at 1280x1024 unless otherwise noted.



D3DAFTester

As always, our first test is the venerable D3DAFTester, which is a detailed but easy way to spot any global driver changes that would affect anisotropic filtering quality. It won't show us any application-specific changes, but it is still always a good place to start.





Given that this tool is a way of checking "correctness" for anisotropic filtering (and has been for a long time), we seldom expect to actually see any changes with D3DAFTester. It doesn't disappoint us here, producing the exact same graphs for all the versions of the ForceWare drivers tested.



Far Cry

Going back as far as the GeForce FX 5000 series versus ATI's Radeon 9000 series, Far Cry has been a game that performs better on ATI's hardware, making it a traditionally sour note for NVIDIA given its status as the first game to really push the capabilities of SM2.0+. Given the performance gap, however, it's also a title that offers NVIDIA great motivation to optimize, so let's see how they did.

Far Cry
Far Cry HQ


First off, we have dropped the results for the 60.72 drivers, the 6800 series launch drivers, as we encountered numerous rendering errors within Far Cry. This seems to be due to an interaction between our patched version of Far Cry and these drivers; however, as we continue on, we'll note a remarkable pattern of problems with these drivers.



Moving on to the 61.76 drivers, where this problem was corrected, we're a bit surprised to see that, given NVIDIA's traditionally trailing performance in this game, they were unable to improve the 6800 Ultra's performance beyond a modest gain when AA/AF are enabled. We honestly would have expected NVIDIA to have done something here in the past year and a half.


ForceWare 61.76 versus 84.21


As for image quality, there is nothing remarkable to say. The performance of the game hasn't changed much, and neither has the image quality.



X2: The Threat

Our other favorite non-FPS game, the space simulation X2 is a good look at what happens when environments are of secondary importance to a handful of high-quality models. With a great deal of shadowing and pixel shading, X2 can weigh down a system pretty hard at points, even given its age.

X2: The Threat
X2: The Threat HQ


Other than a frame here and a frame there, there is no significant change in performance in X2, in line with what we saw previously on the X800 Pro. At over 70fps even at our HQ settings, though, we're not surprised NVIDIA didn't bother with any optimizations.


ForceWare 60.72 versus 84.21


Once again, the lack of performance optimizations and the lack of a change in image quality go hand-in-hand.



Doom 3

As our main OpenGL benchmark, Doom 3 is NVIDIA's chance to shine with their traditionally superior OpenGL performance. With its emphasis on darkness and a unified lighting system, Doom 3 presents a very different situation than most first-person shooters do, hopefully giving us a different take on performance in such a game.

Doom 3
Doom 3 HQ


While we have kept all the results for Doom 3, our problems with the 60.72 drivers once again reared their head here. As you can see below, there was a small rendering difference along a wall edge between the 60.72 drivers and the other ForceWare drivers, where the edge is a bit underexposed. Had it not been for this specific screenshot we wouldn't have noticed it, but there is obviously a difference in rendering output, and the performance of the 60.72 drivers should be taken with a grain of salt as a result.


ForceWare 61.76 versus 84.21


As for image quality within the game, other than the slight issue with the 60.72 drivers, there are no further changes.



Half-Life 2

Here is another game where NVIDIA has at least started out with a performance disadvantage, as we noted in our Half-Life 2 GPU Roundup back in 2004. With the immense popularity of Half-Life 2 and the Source engine being licensed out for use in other games, this is another game where there is a good deal of pressure on NVIDIA to find a way to improve performance.

Half-Life 2
Half-Life 2 HQ


As you can see here, there is a pattern forming with the 60.72 drivers. We've once again had to drop the 60.72 results due to rendering errors: the image is far too bright with these initial drivers, and there are other problems as well.


ForceWare 61.76 versus 84.21


The 6800 Ultra once again comes up clean in the image quality comparisons. Due to the blowing fog, these two screenshots are not perfectly alike, but looking at the other parts of the screenshots shows them to be essentially identical.



Battlefield 2

As the newest game in our latest investigation, Battlefield 2 is almost too new to include. With only a handful of drivers released in the lifetime of the game, there's a lack of data points to work with to draw a strong conclusion. Given the number of requests to include this game, however, we have tested it with all of the drivers released since its launch. As a primarily multiplayer game with large maps, Battlefield 2 strikes an interesting balance between the desire for high quality graphics and the need to render a large firefight without slowing a system to a crawl. Battlefield 2 is also unique among everything that we've tested because it's the only game here that requires pixel shader support, whereas everything else merely uses the ability if it's there.

Battlefield 2
Battlefield 2 HQ


Given Battlefield 2's rapid rise to popularity upon release, it's of little surprise that NVIDIA took the chance to optimize their drivers here. What is interesting is that in most games the largest performance improvements come with HQ settings as opposed to running with everything turned off, but that is not the case here. Instead, we have a rare inversion where there is a greater improvement when AA/AF are turned off, which further confirms BF2's status as a heavily GPU-dependent game.


ForceWare 77.72 versus 84.21


Our image quality tests continue to be a case of "no news is good news." There was no change in render output even with the change in performance.



3DMark05

As we continually note, 3DMark isn't something we normally use in an article due to its nature as a synthetic benchmark instead of being a real game. That said, it's an excellent diagnostic tool both for its wide customizability and the ability to render specific frames. However, it's also highly prone to being manipulated (both fairly and unfairly) due to the value some groups attach to it, so while it has little worth as a real-world gaming benchmark, it's a great indicator of just what kind of performance improvements a company can wring out of a video card when given the proper motivation.

We should note that because of NVIDIA's unusual driver release mechanisms, only a handful of the drivers tested here are actually Futuremark approved, whereas nearly every Catalyst driver we have tested has been approved. We do not have any reason to believe NVIDIA has been cheating in these cases, however.

3DMark05
3DMark05 HQ


The 60.72 drivers once again created a problem for us with 3DMark05, as attempting to run the benchmark would simply reboot our testbed. At this point we have some significant concerns over the quality of launch drivers in particular, but that's for another article. The 61.76 drivers have also been dropped, due to 3DMark05 not offering the ability to use the SM3.0 profiles with these drivers, and as a result the score would not be comparable with the other drivers.

Past that, we're particularly surprised with the results, as NVIDIA does not appear to have implemented any significant performance optimizations for 3DMark. As we've mentioned before, given 3DMark's synthetic nature, we'd rather see NVIDIA and other GPU manufacturers put engineering resources into improving game performance rather than benchmark performance, and we're glad to see that this is the case.


ForceWare 66.72 versus 84.21


Image quality once again remains unchanged, which is a good thing to see given NVIDIA's history with 3DMark03.



NV40 vs. R420

Now that we have gone through NVIDIA's efforts on a per-driver level, we finally have a chance to answer the burning question of the moment: how do NVIDIA's overall driver-based performance improvements compare to ATI's? As we noted previously, ATI gave us some impressive results with the 6800 series' competitor, the X800 series, and even better numbers on the aging 9700 Pro.

With a brand new architecture, the expectation is that NVIDIA will have more headroom within the NV40 to improve performance, while ATI will have already tapped out much of the R420's headroom since it's a derivative design. NVIDIA also has the reputation of being the more dynamic of the duo with their drivers, so let's see how things go.

As before, the following is a performance summary, using a performance factor between the first and latest drivers for each of our selected games to normalize against the innate differences between the X800 Pro and 6800 Ultra. For the NVIDIA drivers, we have started with the 61.76 instead of the 60.72 drivers, since the latter had problems with all of the selected games.
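To make the normalization concrete, here is a minimal sketch of how such a per-game performance factor can be computed. The game names and FPS figures below are placeholders for illustration only, not our measured results, and this is not the actual tooling used for the charts.

```python
# Illustrative sketch only: derive a per-game "performance factor" from the
# first and latest driver results, so improvements can be compared across
# cards with very different absolute frame rates. All numbers are placeholders.

first_driver_fps = {"Doom 3": 52.0, "Far Cry": 48.0}    # hypothetical launch-driver FPS
latest_driver_fps = {"Doom 3": 61.0, "Far Cry": 52.0}   # hypothetical latest-driver FPS

for game, baseline in first_driver_fps.items():
    factor = latest_driver_fps[game] / baseline          # 1.00 = no change
    print(f"{game}: {factor:.2f}x ({(factor - 1) * 100:+.1f}%)")
```

Because each card is compared only against its own launch-driver results, the factor reflects driver-derived improvement rather than the raw speed difference between the X800 Pro and 6800 Ultra.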

Overall Comparison


With AA/AF disabled, it doesn't come as a surprise that neither the X800 Pro nor the 6800 Ultra posts extremely large performance gains, except in 3DMark, where we have previously seen ATI optimizing heavily. Doom 3 stands out as the only game that posts even a moderate performance improvement with these advanced features off, with the gain favoring the X800 Pro, a reflection of ATI's traditionally weaker OpenGL performance leaving more room for improvement. Battlefield 2 also shows a minor 10-12% improvement, with a slight edge to the 6800 Ultra. Overall, the X800 Pro does a bit better than the 6800 Ultra in total performance improvements, but it's not a very significant difference outside of Doom 3.

Overall HQ Comparison


Looking at the performance difference between the two cards with AA/AF enabled, the results are surprising to say the least. Once again we see ATI's massive performance improvements for 3DMark05 showing up, settling once and for all the question of who optimized for 3DMark more in this previous generation. Turning to the games, we see ATI further open the gap between itself and NVIDIA in Far Cry, with a 33% versus 8% performance improvement in a game that ATI already leads.

Half-Life 2 and Doom 3 basically offset each other, with each side taking a win. As we mentioned previously, Half-Life 2 is a title where ATI has had a natural lead, so such an effort by NVIDIA is to be expected, though in terms of raw performance it has been a large gap to close at times. Conversely, ATI benefits the most under Doom 3, adding 28% versus NVIDIA's 18%. Finally, neither side manages to add more than approximately 8% to their Battlefield 2 scores, showcasing just how graphically punishing the game is compared to everything else tested here, but also a bit surprising in that neither side has found a good way to boost their scores yet. BF2 also requires quite a bit of CPU processing power, and the pixel shader effects definitely put a strain on the GPU core - this can be seen in the slightly higher improvements under the basic settings as opposed to the advanced mode.

Overall then, there is a trend worth noting, however counterintuitive it is. While we have said that it is NVIDIA that traditionally makes the most of its drivers, this was clearly not the case with the previous generation. Whether it's a testament to what the Catalyst team can do versus the ForceWare team, a hardware generational difference, or both, when both the normal and high quality tests are factored in, ATI is the victor for getting the most out of its drivers.



The Wildcard: Console Ports

One area of increasing importance in gaming lately is the relationship between PC games and console games. Once greatly separated in abilities and game types, the PC and the console have been coming together in recent years, with a number of titles being published for both the PC and one or more consoles. Since this pushes game design in a different direction than PC-centric development, we've rounded up 4 games from our usual test sequence that are all console ports, to see if these console influences produce anything substantially different from the normal ebb & flow of game performance.

Because these games aren't designed with the PC as the primary platform, benchmarking these games is a more limited and difficult affair than our other games. As such, we are only going to list the performance of the initial working driver and the latest driver for each game.

Serious Sam 2

First off is Serious Sam 2, Croteam's sequel to their immensely popular run-and-gun FPS. Unlike its predecessor, which was first a PC game and then poorly ported to a console, this title was developed simultaneously for both the Xbox and the PC, which means it shows some of those aforementioned console influences.

Serious Sam 2 (1024x768, FPS)
Driver           0xAA   4xAA
ForceWare 78.01  82.2   56.9
ForceWare 84.21  83.7   58.0
Catalyst 5.09    56.3   23.3
Catalyst 6.04    72.7   41.0


Like most games designed with a console in mind, Serious Sam 2, while still a graphically impressive title, is not terribly hard on our GPUs. However, looking at the ATI results, the performance improvement is very eye-catching, though it also highlights the disappointing initial performance. With NVIDIA providing the underpinnings of the original Xbox, it's hard to tell if this is a case where they have a natural advantage or if ATI simply was unlucky with the game initially, but at any rate it reinforces the impact new drivers can have on freshly-released games.

The Chronicles of Riddick

Here is another game with roots in a console title; in fact, it was originally only a console game. However, with the PC version released some 6 months after the Xbox version, the Developer's Cut offers enough differences from simultaneously developed titles that it can stand apart.

The Chronicles of Riddick (1280x1024, FPS)
Driver           0xAA    4xAA
ForceWare 66.93  36.83   21.26
ForceWare 84.21  37.60   21.50
Catalyst 4.11    20.11    9.11
Catalyst 6.04    23.22   10.88


In spite of the time difference between releases and the more PC-focused nature of the resulting game, we see an interesting pattern similar to what happened in Serious Sam 2: NVIDIA picks up nothing, while ATI picks up over 10% to partially close a fairly wide gap. This could be another case where performance favors NVIDIA due to the Xbox, in which case NVIDIA may be in for a rude awakening in the near future as more Xbox 360 titles make the transition to the PC. (The Xbox 360 uses an ATI graphics chip, whereas the original Xbox used an NVIDIA graphics chip.) Of course, with the PS3 also using NVIDIA hardware, it could be that Xbox 360 ports will run better on ATI while PS3 ports will run better on NVIDIA; that's something we'll watch for over the coming year(s).

Need For Speed: Underground 2

Moving away from Xbox-only titles, Need For Speed: Underground 2 differs in that it was created for multiple consoles instead of being an Xbox exclusive. With a focus on vehicular mayhem, NFS offers a nice contrast to our other games. Here, a super high frame rate isn't quite as important as in FPS games, since fast changes in direction aren't quite as frequent. It will also be interesting to see how a "sim" compares to the more common FPS benchmarks.

Need For Speed: Underground 2 (1024x768, FPS)
Driver           0xAA   4xAA
ForceWare 66.93  62     49
ForceWare 84.21  62     49
Catalyst 4.11    58     38
Catalyst 6.04    59     39


Need For Speed: Underground 2 is the only multi-platform console port on our list of games, and it's very notable that NVIDIA does not have a massive performance lead at any point here, nor does ATI need to close any large gaps. While the framerate hovers around 60 FPS, the title is not locked to 60Hz internally; this is merely a coincidence. Although the lack of a performance increase is a bit disappointing, the implications of the data aren't: as PC games become increasingly tied to console games, who's under the hood of the primary console may have a fairly large impact on PC performance.

Final Fantasy XI

Rounding out our look at console games, we take a look at Square Enix's MMORPG, originally released for the PlayStation 2. With an emphasis on the number of characters in a scene over individual character detail and environmental detail, FFXI can be fairly punishing even with its now dated graphics.

Final Fantasy XI, Benchmark 2 (1024x768, 0xAA)
Driver           Score
ForceWare 60.72   6472
ForceWare 84.21   6562
Catalyst 4.05     6336
Catalyst 6.04     6212


As neither ATI nor NVIDIA supplies the graphics underpinnings of the PS2, it's not surprising to see both start out on equal footing. In fact, neither card deviates much from its initial score, indicating that the limiting factor is not the GPU in the first place, and hence there's little that either company can do to improve performance. However, as ATI and NVIDIA will be supplying the GPUs in all 3 next-generation consoles, such scenarios are likely to be few and far between in the near future.



Conclusion

So now that we have seen what both ATI and NVIDIA could do with their respective drivers in the previous generation, and how this is apparently influenced by console development, let's first talk a bit about NVIDIA specifically on the subject of PC-native games.

Overall, the performance improvements with the 6800 Ultra are a bit disappointing. With a new architecture, we had hoped NVIDIA would be able to pull out some of their famous performance increases, and this did not happen. This is not to discount the 6800 Ultra, as the entire NV40-based family are all quite strong cards. However, part of competing in this market is working out performance improvements over a card's lifetime, winning the mindshare of buyers by showing them that they're going to get more out of their purchases over time, and thus creating better positioning for future products. With only a handful of significant (>10%) performance improvements, it's hard to say NVIDIA has done this well.

Accordingly, NVIDIA doesn't stick to all of the trends we previously saw ATI follow. In general, NVIDIA was able to provide a small general performance improvement with each of their drivers instead of relying solely on one-time performance boosts, but the most promising time period for a performance boost is still shortly after a game comes out. In fact, the biggest performance gains NVIDIA experienced appear to have come from debugging and optimizing the drivers early in the life of the hardware. As with ATI's Catalyst drivers, there seems to be little reason to do a ForceWare driver upgrade unless there is a specific change in a driver you are interested in, and this likely isn't going to change with the NV40-derived G70 series.

On the positive side, we reiterate how impressed we are with NVIDIA's lack of performance improvements in 3DMark. Whether this is truly because they didn't devote resources to doing so, or because they simply couldn't pull it off, is something we may never know, but it's nice to see that their greatest improvements were in real games, not synthetic benchmarks. Similarly, we are glad to see over the last year and a half that they haven't made the same mistake as ATI with regard to shoehorning users into a bloated toolset for manipulating their cards. The NVIDIA control panel is lean and mean, and that's the way we like it.

As for the burning question of performance improvements with drivers, ATI has clearly shown that they can work magic with their drivers if they need to. This in turn reiterates just how much of an impact drivers can have on overall performance, and ultimately on purchasing choices. Even though these large performance changes would likely not have altered our initial purchasing recommendations, they come close to that threshold.

It is clear that just looking at a card once at its introduction is not enough; the performance improvements and corrections offered by later drivers are just too much to ignore. Yet at the same time, short of re-reviewing a dozen cards every time a new driver is released - and even then we can't very well tell the future - we have no way of really measuring this impact other than with hindsight, which is by its very nature too late. We can offer guesses at what kind of performance improvements might be in the pipeline from NVIDIA and ATI, and certainly they will make every effort to tell the world how proud they are whenever they do manage to come up with such an improvement. The reality is that many of the proclaimed improvements are for uncommon settings - e.g. running a budget card at high resolutions with AA/AF enabled - and judging by the results of our regression testing, buyers should be wary of the marketing hype that's going on.

At the same time, the increase in games being ported from or developed alongside console versions paints a very different picture. With the current generation of consoles, NVIDIA has benefited from a nice performance lead from what we've seen today, as Xbox-only titles have been much harder on ATI than on NVIDIA. Since NVIDIA is effectively the de facto leader in console GPUs for the current generation (ATI only has a chip in the GameCube, from which few titles are ported over to the PC), there appears to be little reason to expect major performance improvements out of NVIDIA's drivers on Xbox ports, while ATI appears to be able to work out (and need) performance improvements for Xbox-ported games.

That said, the number of games that will still be ported from current generation consoles to the PC is dwindling, which brings us to the discussion of the next generation. ATI powers the Xbox 360 with an R500-like chip and will power the Wii with what we believe to be an R300-like successor to the GameCube GPU. Meanwhile, NVIDIA will be powering the PlayStation 3 with a straight-up G70-generation chip. If our results from the Xbox ports are any indication, it seems there will be a natural favoritism in the case of games that are exclusive to one console or another. Ultimately this may mean that the home team for a game will not be improving driver performance for those titles due to an inherent lead, while the visiting team will need to make some optimizations to catch up. This is in stark contrast to the way PC-native games work, and cross-platform titles are still up in the air entirely. This facet of performance will be an interesting subject to watch over the next few years as all the next-generation consoles come out.

In the meantime, getting back to ATI, NVIDIA, and the PC, what have we learned today? We still can't predict the future of PC games, drivers, or hardware, but after looking at these results and the two companies' track records, it certainly seems that, with the introduction of the X1900 and 7900 series, ATI may be in a better position to offer more substantial performance improvements than NVIDIA. As we've seen, NVIDIA hasn't been able to work a great deal of improvement out of the NV40/G70 architecture in our tests. With the R500 being quite different from the venerable R300, we're interested in seeing what sort of improvements ATI can get from their new architecture. If ATI stays ahead in the driver game, they may very well have what it takes to beat out NVIDIA in the majority of games running on the current generation of architectures.
