Fable Legends Early Preview: DirectX 12 Benchmark Analysis
by Ryan Smith, Ian Cutress & Daniel Williams on September 24, 2015 9:00 AM EST

Discussing Percentiles and Minimum Frame Rates
Continuing from the previous page, we performed a similar analysis on AMD's Fury X graphics card. The same rules apply: all three resolution/setting combinations across all three system configurations. Results are given as frame rate profiles showing the full percentile range, with the 90th, 95th and 99th percentile values picked out to give an indication of minimum frame rates.
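For readers who want to run this style of analysis on their own frame-time logs, here is a minimal sketch of the idea. It assumes the input is simply a list of per-frame render times in milliseconds (the actual output format of the Fable Legends benchmark tool is not shown here, and the sample numbers are made up for illustration):

```python
import numpy as np

def frame_rate_percentiles(frame_times_ms, percentiles=(90, 95, 99)):
    """Convert per-frame render times into percentile-based frame rates.

    The Nth percentile of frame *time* is the time that N% of frames come in
    under; converting it to FPS gives the kind of 'minimum frame rate'
    figure used in the charts on this page.
    """
    times = np.asarray(frame_times_ms, dtype=float)
    results = {}
    for p in percentiles:
        slow_frame_ms = np.percentile(times, p)  # e.g. 99th percentile frame time
        results[p] = 1000.0 / slow_frame_ms      # milliseconds per frame -> FPS
    return results

# Hypothetical usage with made-up frame times (in ms):
sample = [16.7, 17.1, 16.9, 25.0, 18.2, 33.4, 17.0]
print(frame_rate_percentiles(sample))
```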
Moving on to the Fury X at 4K, we see all three processor lineups performing similarly, indicating that we are more GPU limited here. The Core i7 shows a slight quirk, though: it gives slightly lower frame rates in easier scenes but better frame rates when the going gets tough beyond the 95th percentile.
For 1080p, the results take a twist. It almost seems as if we have some form of reverse scaling, whereby having more cores does more damage to the results. If we look at the breakdown provided by the in-game benchmark (given in milliseconds, so lower is better):
Three areas stand out as benefitting from fewer cores: Transparency and Effects, GBuffer Rendering and Dynamic Lighting. All three are related to illumination and how that illumination interacts with its surroundings. One reason springs to mind here: with large core counts, too many threads are issuing work to the graphics card, causing thread contention in the cache or giving the thread scheduler a hard time depending on what comes in as high priority.
Nevertheless, the situation changes when we move down again to 720p:
Here the Core i3 takes a nose dive as we become CPU limited in pushing out frames.
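As a rough illustration of the reasoning behind the CPU-limited call (not a method used in the article, and with purely illustrative numbers), a common rule of thumb is that a GPU-bound system gains frame rate as resolution drops, while a CPU-bound one barely moves:

```python
def likely_cpu_limited(fps_high_res, fps_low_res, tolerance=0.10):
    """Return True if dropping resolution gained less than `tolerance` (fractional) FPS."""
    return (fps_low_res - fps_high_res) / fps_high_res < tolerance

# Illustrative numbers only (not taken from the article's data):
print(likely_cpu_limited(fps_high_res=60.0, fps_low_res=63.0))  # True  -> likely CPU bound
print(likely_cpu_limited(fps_high_res=45.0, fps_low_res=90.0))  # False -> likely GPU bound
```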
141 Comments
tackle70 - Thursday, September 24, 2015
Nice article. Maybe tech forums can now stop with the "AMD will be vastly superior to Nvidia in DX12" nonsense.

cmdrdredd - Thursday, September 24, 2015
Leads me to believe more and more that Stardock is up to shenanigans just a bit, or that not every game will use certain features that DX12 can perform and Nvidia is not held back in those games.

Jtaylor1986 - Thursday, September 24, 2015
I'd say Ashes is a far more representative benchmark. What is the point of doing a landscape simulator benchmark? This demo isn't even trying to replicate real world performance.

cmdrdredd - Thursday, September 24, 2015
Are you nuts or what? This is a benchmark of the game engine used for Fable Legends. It's as good a benchmark as any when trying to determine performance in a specific game engine.

Jtaylor1986 - Thursday, September 24, 2015
Except it's completely unrepresentative of actual gameplay, unless this is a grass growing simulator.

Jtaylor1986 - Thursday, September 24, 2015
"The benchmark provided is more of a graphics showpiece than a representation of the gameplay, in order to show off the capabilities of the engine and the DX12 implementation. Unfortunately we didn't get to see any gameplay in this benchmark as a result, which would seem to focus more on combat."LukaP - Thursday, September 24, 2015 - link
You don't need gameplay in a benchmark. You need the benchmark to display common geometry, lighting, effects and physics of an engine/backend that drives certain games. And this benchmark does that. If you want to see gameplay, there are many terrific youtubers who focus on that, namely Markiplier, NerdCubed, TotalBiscuit and others.

Mr Perfect - Thursday, September 24, 2015
Actual gameplay is still important in benchmarking, mainly because that's when framerates usually tank. An empty level can get fantastic FPS, but drop a dozen players having an intense fight into that level and performance goes to hell pretty fast. That's the situation where we hope to see DX12 outshine DX11.

Stuka87 - Thursday, September 24, 2015
Wrong, a benchmark without gameplay is worthless. Look at Battlefield 4 as an example. Its built-in benchmarks are worthless. Once you join a 64 player server, everything changes. This benchmark shows how a raw engine runs, but is not indicative of how the game will run at all.

Plus it's super early in development with drivers that still need work; as the article states, AMD's driver arrived too late.
inighthawki - Thursday, September 24, 2015
Yes, but when the goal is to show improvements in rendering performance, throwing someone into a 64 player match completely skews the results. The CPU overhead of handling a 64 player multiplayer match will far outweigh the small changes in CPU overhead from a new rendering API.