Extra test: 8GB versus 4GB, there is some benefit
Based on the tables on the previous pages, one could conclude that the 8GB Radeon R9 290X is overkill and offers no real benefit. While that holds in most cases, there are scenarios where the extra memory does make a difference.
The open world of Watch Dogs is one of the most striking examples of a texture-heavy game where 4GB becomes a bottleneck. Because the game is open-world and the path you take is not set in stone, the engine may suddenly need textures that have not yet been loaded into the GPU's memory. This is where the extra memory can pay off.
In our default Watch Dogs benchmark (shown on the previous page), we walk the same small part of the city every time, at a leisurely pace. As an extra test, we got on a motorcycle and drove through the city at high speed. This inevitably created situations in which the game suddenly needed a lot more textures.
For this extra test we recorded the render time of each individual frame. See the graphs below for the results:
[Graph: frame times at 1920x1080, Ultra, 4x AA]
[Graph: frame times at 3840x2160, Ultra, 4x AA]
As you can see, there is hardly any difference at Full HD resolution: the 4GB version averages 44 fps, the 8GB version 42 fps. The 99th percentile frame times are virtually identical at 52 ms.
At Ultra HD we do see a bigger difference, however: the 4GB version averages 9 fps, while the 8GB version averages 16 fps. The 99th percentile frame time is 373 ms on the 4GB card versus 139 ms on the 8GB card. Let's be honest, though: both are virtually unplayable.
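For readers who want to do this kind of analysis on their own frame-time logs, the sketch below shows one way to derive the two figures we report: average fps and the 99th percentile frame time. It is a minimal illustration, not the tooling we use; the sample numbers are made up and merely chosen to roughly mirror the Full HD result above, and the exact percentile convention may differ from what benchmark tools produce.

```python
# Minimal sketch: derive average fps and the 99th percentile frame time
# from a list of per-frame render times in milliseconds.
# The sample data below is hypothetical, for illustration only.

def frame_time_stats(frame_times_ms):
    """Return (average fps, 99th percentile frame time in ms)."""
    if not frame_times_ms:
        raise ValueError("no frame times recorded")

    # Average fps = number of frames / total time in seconds.
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds

    # 99th percentile frame time: 99% of frames rendered at least this fast.
    ordered = sorted(frame_times_ms)
    index = min(len(ordered) - 1, int(round(0.99 * (len(ordered) - 1))))
    p99_ms = ordered[index]

    return avg_fps, p99_ms

if __name__ == "__main__":
    # Hypothetical run: mostly smooth frames with a handful of slow ones,
    # as happens when textures suddenly have to be streamed in.
    sample = [22.0] * 980 + [52.0] * 20
    fps, p99 = frame_time_stats(sample)
    print(f"average: {fps:.1f} fps, 99th percentile frame time: {p99:.0f} ms")
```

The point of the 99th percentile figure is that a handful of very slow frames barely moves the average, but shows up clearly here, which is exactly the kind of stutter that texture streaming causes.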