‘Watch Dogs’ PC Benchmark Results — Multiple AMD And Nvidia Cards Tested
Over the weekend I had the opportunity to test Ubisoft’s PC version of Watch Dogs, a highly anticipated open-world game that takes advantage of Nvidia’s GameWorks toolset. In addition to observing the dramatic gulf in performance between AMD and Nvidia graphics cards, I was able to benchmark a wide range of newer GPUs from each respective company. The conclusion? Rendering Watch Dogs’ near-future version of Chicago is going to require a ton of horsepower.
The benchmark routine itself was a custom 75-second run involving a short cutscene in an outdoor environment, followed by running a set path along a busy road with Aiden’s profiler active. As such, this benchmark incorporated things like ambient lighting, reflections, smoke, and tessellation. It did not reflect more intensive situations like explosions and car chases, as those are difficult to replicate with any consistency. You should treat these benchmark results as best-case scenarios, with the understanding that your performance will vary based on CPU, graphics card manufacturer, and other factors.
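For the curious, turning each run into a minimum and average figure is simple post-processing of per-frame data. Here’s a minimal sketch, assuming a plain per-frame log of frame times in milliseconds (one value per line); the file name and format are illustrative assumptions, not the exact capture tooling used for these tests:

```python
# Minimal sketch: turn a per-frame frame-time log (milliseconds, one value per
# line) into minimum and average FPS. The file name and layout are assumptions
# for illustration, not the exact capture tooling used for these benchmarks.
import csv

def summarize(path):
    with open(path, newline="") as f:
        times_ms = [float(row[0]) for row in csv.reader(f) if row]
    times_ms = [t for t in times_ms if t > 0]            # discard bad rows
    avg_fps = len(times_ms) / (sum(times_ms) / 1000.0)   # total frames / total seconds
    min_fps = 1000.0 / max(times_ms)                      # slowest frame = lowest instantaneous fps
    return min_fps, avg_fps

if __name__ == "__main__":
    low, avg = summarize("frametimes.csv")
    print(f"min {low:.0f}fps / avg {avg:.0f}fps")
```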
As always, here are the core specs for my testing rig:
- Intel Core i7-4770K at stock 3.50GHz
- 16GB Corsair Vengeance LP RAM at 1866MHz
- Windows 8.1
- All tests used Nvidia’s Game Ready 337.88 driver and AMD’s upcoming 14.6 Beta driver.
The hardware I tested was of course limited to stock (and time!) on hand, but I tried to capture a sample representative of midrange to very high-end setups. Please note that for most of these tests, I’ve used the lowest shared method of anti-aliasing available to both Nvidia and AMD owners. (TXAA, for example, is a proprietary form of AA only available on Nvidia GPUs.)
If you’re not after extreme eye candy, lower-end cards like Nvidia’s $149 GTX 750 Ti and AMD’s $119 Radeon 260X should have you covered at the Medium quality level, although a smoother 60fps will be out of reach with any amount of anti-aliasing activated.
What’s particularly fascinating to me is the nearly nonexistent performance scaling in the high-end GPU space: a mere 6 frames per second gained from the GTX 770 to the GTX 780 Ti, and zero gain at all between the Radeon 280X and 290X.
Running the benchmark with quality and textures on High replicated one of those anomalies and introduced a disturbing new one. Note the same nonexistent performance gain between the Radeon 280X and 290X, although thankfully the 780 Ti is flexing its muscle more effectively here.
Efficiency, however, flies right out the window for SLI and CrossFire users. Got another $550 290X? That will net you a gain of -1fps! Drop another $700 GTX 780 Ti into the equation and you’ll see an underwhelming 8% performance gain.
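To be clear about what those percentages mean, the “gain” I’m quoting is just the percent change in average fps from adding the second card. A quick sketch of that math, with purely hypothetical figures in the example call:

```python
# Sketch of how the multi-GPU "gain" percentages above are computed.
def scaling_gain(single_card_avg_fps, dual_card_avg_fps):
    """Percent change in average fps from adding a second card."""
    return (dual_card_avg_fps / single_card_avg_fps - 1.0) * 100.0

# Hypothetical numbers, purely to show the math: 60fps -> 65fps is about +8.3%.
print(f"{scaling_gain(60.0, 65.0):+.1f}%")
```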
When we push the graphics quality settings and textures to Ultra, that pattern persists. Look at the results for the 280X, 290X, and CrossFire 290X setups. Note the same marginal performance gain when ramping up from one 780 Ti to two. Signs of a horribly optimized game are clearly emerging. If it’s performing this poorly with high-end hardware at 1080p, 1440p and 4K users are going to be wildly disappointed…
1440p & 4K Testing
My 1440p and 4K testing wasn’t as exhaustive, but it proved one thing: Both SLI and CrossFire users are completely out of luck. I’m confident all parties involved will continue to improve driver support at this level, but as of launch day with release-day drivers, it’s a dire situation. Nvidia’s 780 Ti turns into an unplayable, artifacting mess at 1440p using any level of MSAA. And dual 290Xs in CrossFire only turn in about 36fps on High quality. Obviously both cards are capable of so much more.
Curiously, using Nvidia’s proprietary TXAA, my 780 Ti SLI configuration yielded a paltry 13fps average, with a minimum of zero frames per second. The unusually high video RAM requirements of Watch Dogs bring any card with 3GB of VRAM to its knees at 1440p and higher.
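If you want to check whether VRAM is the bottleneck on your own rig, one rough approach is to poll nvidia-smi while the game runs. This is only a sketch: it assumes an Nvidia card with nvidia-smi available on the PATH, and AMD owners would need a different tool:

```python
# Rough sketch: poll nvidia-smi once per second and print per-GPU VRAM usage,
# to see whether Watch Dogs is saturating a 3GB card at 1440p and above.
# Assumes an Nvidia GPU with nvidia-smi on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=memory.used,memory.total",
    "--format=csv,noheader,nounits",
]

while True:
    output = subprocess.check_output(QUERY).decode().strip()
    for index, line in enumerate(output.splitlines()):  # one line per GPU (SLI shows two)
        used, total = line.split(", ")
        print(f"GPU {index}: {used} MiB / {total} MiB VRAM in use")
    time.sleep(1)
```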
The only bright spot in my 4K testing centered on High quality and textures with FXAA turned on, as that seems to be the only level of anti-aliasing that won’t cripple Nvidia at higher resolutions. Using the 780 Ti in 2-way SLI, I nabbed a minimum of 28fps; my test with dual Radeon 290X cards in CrossFire turned in the exact same result.
I haven’t been able to wrap my head around the technical reasons why the game is performing so poorly, and oftentimes so erratically, but I’m certain it isn’t properly optimized even for Nvidia hardware, which is surprising given that Nvidia is the official Watch Dogs partner. This game looks fantastic, but it isn’t on the same level as something like Metro: Last Light, which turns in drastically better numbers on the same hardware.
In the end, Ubisoft, AMD, and Nvidia all need to work diligently post-launch to deliver the kind of performance PC users expect out of their hardware — though that process may be more difficult for AMD.