Early Xbox One X benchmarks provide a peek at its potential

The release of the Xbox One X this fall will herald the arrival of the most powerful console on the market, packing 6 teraflops of GPU power. We'll have to wait for a comprehensive look at what that means for the likes of 4K gaming and increased fidelity, but a new report from Digital Foundry at Eurogamer offers a tantalizing glimpse at the console's potential through some early benchmarks.
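That 6-teraflop headline number is straightforward arithmetic on the GPU's shader count and clock speed: each shader can retire one fused multiply-add (two floating-point operations) per cycle. A quick sketch using the widely reported Scorpio specs (2,560 shaders at 1,172MHz; the function name is my own, not from the report):

```python
def tflops(shader_cores: int, clock_ghz: float) -> float:
    """Peak single-precision teraflops: each shader core can retire
    one fused multiply-add (2 FLOPs) per clock cycle."""
    return shader_cores * clock_ghz * 2 / 1000

# Widely reported specs (assumed here, not stated in this article):
print(round(tflops(2560, 1.172), 1))  # Xbox One X: ~6.0 TF
print(round(tflops(768, 0.853), 1))   # original Xbox One: ~1.3 TF
```

Peak FLOPS is only a theoretical ceiling; as the benchmarks below show, real game performance depends heavily on memory bandwidth and optimization.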

As explained by Digital Foundry's Richard Leadbetter, the benchmarks cited in the report, sourced from developer contacts, are early and represent rudimentary ports of existing games to the Scorpio engine, pushed up to 4K with no optimizations in place to take advantage of any new hardware features. And while the nine games in the benchmark examples aren't identified by name, Leadbetter was able to make educated guesses about a few based on the details provided (engine type, target resolution, genre, etc.).

Despite the lack of optimizations for the new hardware, the early benchmarks show some interesting results. The title most likely to be Forza Motorsport 7 (Title B), for example, was able to render 4K at around 91 fps, meaning that at the game's 60 fps cap it would use only 65.9 percent of the available GPU power. That leaves plenty of headroom for further visual enhancements.

Xbox One X Benchmarks

Slightly less impressive is what is assumed to be Gears of War 4 (Title C), which achieves 38.5 fps at 4K, translating to 78.1 percent of available GPU power at a cap of 30 fps. As Leadbetter points out, however, this could be an indication that these early, basic benchmarks underestimate final results. Gears of War 4 on Xbox One X, for example, is expected to feature higher-resolution textures and a number of other visual enhancements.
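The utilization figures in these examples follow from simple arithmetic: a game that renders faster than its frame-rate cap only needs a fraction of the GPU to hold the cap. A minimal sketch (the function name and the 38.4 fps note are my own inferences, not from the report):

```python
def gpu_utilization(capped_fps: float, uncapped_fps: float) -> float:
    """Fraction of GPU time needed to sustain the frame-rate cap,
    given the uncapped (benchmark) frame rate."""
    return capped_fps / uncapped_fps

# Title B (likely Forza 7): ~91 fps uncapped, 60 fps cap
print(round(gpu_utilization(60, 91) * 100, 1))    # ~65.9 percent

# Title C (likely Gears of War 4): cap of 30 fps.
# Note: 30 / 38.5 is ~77.9 percent; the report's 78.1 percent figure
# implies a measured rate closer to 38.4 fps.
print(round(gpu_utilization(30, 38.4) * 100, 1))  # ~78.1 percent
```

The unused fraction is the "headroom" the article refers to, available for higher-resolution textures and other enhancements.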

Digital Foundry also provided a bit more insight into the architecture of the console's GPU:

There's more too, based on the documentation we've seen. The fundamental architecture of the Xbox One X GPU is a confirmed match for the original machine (believed to be the case for PS4 Pro too) with additional enhancements. There are other features, including AMD's delta colour compression, which sees performance increases of seven to nine per cent in two titles Microsoft tested. DCC is actually a feature exclusive to the DX12 API. In fact, DX11 moves into 'maintenance mode' on Xbox One X, suggesting that Microsoft is keen for developers to move on. There are benefits for both Xboxes in doing so - and there may be implications here for the PC versions too. We could really use improved DX12 support on PC, after all.
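As a rough intuition for why delta colour compression saves bandwidth: neighbouring framebuffer pixels are usually similar, so storing a base value plus small per-pixel differences is cheaper than storing raw values. A toy sketch of the idea (illustrative only; AMD's DCC is a lossless hardware scheme operating on render-target tiles, not anything resembling this Python):

```python
def delta_encode(row):
    """Toy delta encoding of a row of pixel values: keep the first
    value, then store per-pixel differences, which are typically
    small and compress well."""
    base = row[0]
    deltas = [b - a for a, b in zip(row, row[1:])]
    return base, deltas

def delta_decode(base, deltas):
    """Reverse the toy encoding by accumulating the differences."""
    out = [base]
    for d in deltas:
        out.append(out[-1] + d)
    return out

row = [200, 201, 201, 203, 202]      # similar neighbouring pixels
base, deltas = delta_encode(row)
print(deltas)                        # small values: [1, 0, 2, -1]
assert delta_decode(base, deltas) == row
```

The scheme is lossless, which is why it can be applied transparently by the GPU for a free bandwidth win of the kind Microsoft measured.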

Some titles in the benchmarks displayed performance issues that would likely be rectified by engine-level optimizations for the Xbox One X's hardware features. Considering the results are based on the most basic of ports, they are fairly promising (with the exception of a single outlier on the chart above) for Xbox gamers looking to make the jump to 4K.

For more, Eurogamer's full, in-depth report is definitely worth a read if you're considering moving to the new console. The Xbox One X will launch on November 7 starting at $499.


Dan Thorp-Lancaster is the former Editor-in-Chief of Windows Central. He began working with Windows Central, Android Central, and iMore as a news writer in 2014 and is obsessed with tech of all sorts. You can follow Dan on Twitter @DthorpL and Instagram @heyitsdtl

67 Comments
  • Anyone who 'still' thinks that the One X won't take on the high-end visual performance of a PC running a 1080 Ti with 8GB of RAM is kidding themselves. The design of this console is astonishing. And I still don't know how they achieved 12GB of RAM at 326GB/sec!!!!!!!!!! Seriously, at £449 that's downright unbelievable. Especially with a 4K Blu-ray player in there as well.
  • Are you really that blind or do you make yourself dumb on purpose?
    The One X has little more computing power than an RX 480. A computer with a normal 1080 crushes it to smithereens, let alone the 1080 Ti.
  • Talking about stupidity... He referred to visual performance, and you are talking about raw computing power. See what's wrong here? The only thing that a GTX 1080 crushes is your credit card.
  • "Crushes is your credit card" Good one!
  • lol.
  • Crushes? Let's use Gears of War 4 as an example, since it's out and benchmarks have been done on it. With a 1080 at 4K it averages 45fps, and 57 with the 1080 Ti. If we assume 38.5fps is correct for the Xbox One X, 6.5fps is hardly crushing it. Especially when they state there is 20% of the GPU left to tap into.
  • And further still, those benchmarks for the One X, as DF explains, are straight Xbox One code ports, with absolutely no optimization and using none of the new hardware improvements.
  • That's exactly what I alluded to. I can only use numbers that exist, since I don't have a machine to benchmark the X1X versions. Using the numbers provided here, a GTX 1080 in a PC barely does anything more than the early X1X benchmarks for Gears 4 and still has more room to spare. I'm willing to bet the 1080 Ti will still beat the X1X, but just barely, and certainly not in the bang-for-your-buck category.
  • You do know the quality settings on Xbox One were rated as roughly equivalent to a PC running at about medium, right?
  • Yep! Everyone seems to overlook that.
  • wtf https://www.youtube.com/watch?v=ICXaYogZH6M  
  • I'm not even gonna get into it with you. All I'm gonna say is YouTube is not a reliable source, nothing on there proves his settings, and finally the 1070 isn't even a part of my comment.
  • Ohh its you again (pic) Never mind      
  • It's not my fault you can't source information or form an argument.
  • Add to your playlist
    https://www.youtube.com/watch?v=hiRacdl02w4
  • That's actually multiplayer. I believe single player is 30fps on the original xbox one and 60 in multiplayer.
  • Bruh, this is unoptimized. Just a simple uprez. Optimizations are what make it sing. So what's your point? This is literally just changing the res from 1080p to 4K and nothing else. OF COURSE THERE WILL BE PERFORMANCE ISSUES.
  • Read my comment again. I simply pointed out to the person I replied to that the 1080 is not 'crushing' anything. I said nothing about performance issues, or anything that needs to be done. If you even bother to read the end of my comment I point out how there is still optimization to be done since there is a percentage of the GPU left to use...bruh
  • Clearly you haven't got a single clue how GPUs work. A GeForce GTX 1070 has 8GB of VRAM running at 250GB/sec bandwidth, and has 6.49 teraflops. The One X has 6.12 teraflops with 9GB of dedicated RAM running at 326GB/sec bandwidth. Now add the fact that PC GPUs do NOT ever run at maximum output like a console. You're more than likely (you would have to do exact research) getting about 5TF of performance out of a GTX 1070 in a PC. On a 1080 Ti with 8TF you're likely getting around 6TF. This is common knowledge to ANYONE who builds gaming PCs. There's even a developer on record suggesting the One X's performance is more akin to that of a PC running a GTX 1070 with 16GB of VRAM. Here is what a developer working on One X and PC said last week: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499". Read more: http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ra...
  • yes- just like gtx1070:)))))))))))))))))))))))))))))))))))))))))))))))))) https://www.youtube.com/watch?v=ICXaYogZH6M  
  • The numbers and specs are clear for EVERYONE to read. Posting a random YouTube video does NOT change the fact that on paper the One X's specs beat the GTX 1070. And that's before you factor in the overhead of the PC running an OS, etc. MS's One X specs outpace Nvidia's own specs for the 8GB GTX 1070. The memory is too underpowered in comparison to the One X for it to better it. That's a fact. Unless you're claiming Nvidia is lying about the GTX 1070's specs?
  • It's not even worth trying with him. His argument is based on YouTube and Ad Hominem
  • And yours is based on Phil Spencer and Satya Nadella...
  • Actually it's based on Nvidia's own specs vs the One X's official specs. Nothing more, nothing less. All confirmed by Digital Foundry.
  • If you're going to represent the (Microsoft) PC master race, do it with common sense please. They have already subsidized the cost on a grand scale, something that a PC can't do, and now they are tapping on the front door of a 1080 setup...
  • Again, you're posting multiplayer videos. This is single player which is in the 40fps range. https://www.youtube.com/watch?v=8I3F_r3h5H0  
  • Depends on the settings. They're running it on Ultra with everything on (even v-sync enabled). Just lower shadows to Very High/High and lower the anti-aliasing (which isn't needed at 4K) and you will get 60fps avg.
  • No, the original game is running at 60fps on the original xbox one in multiplayer so it's obviously not as intensive as single player and these benchmarks presented here are for single player. Nice try though.
  • learn http://lifehacker.com/5985304/get-the-most-from-your-games-a-beginners-g...  
  • No, you need to learn. You are the one that stated the only way to tell is with actual bench tests: "The long and short of it is that you can only really tell if one card is better than the other by actually using a real game to compare whether one card performs better and even then that is highly dependent on how well optimized the game is for either card." Obviously you don't have the rig to test yourself or you would post your own stats, so you go to people on YouTube (and you can't even choose videos that properly support your statement of 60fps). Here are some Gears 4 bench tests: http://www.pcworld.com/article/3128346/software-games/tested-gears-of-wa... And some more: http://www.legitreviews.com/gears-war-4-pc-performance-benchmarks_186780/3 And even more: http://www.guru3d.com/articles_pages/gears_of_war_4_pc_vga_graphics_perf... Nothing in there shows that a 1070 can hit 60fps at 4K, yet an unoptimized X1X version can pull about the same frames. So stop with the "turn this off, turn this on and you'll get 60" unless you can provide stats from the actual game with the settings you claim.
  • Are you still alive? Just checking.
  • Axmantim and Richard loveridge are from the xbox team (microsoft studio?)...ignore the downvotes....
  • Specs don't lie. Why would anyone disagree with factual information???
  • Graphics card performance figures are a boggy minefield of confused and conflicting information. On the 6870, the memory clock figures are probably equivalent, in that the AMD is listed as a 1GHz real clock, while the Nvidia is a 4GHz "effective clock" with a 1GHz real clock that is "quad pumped". Chances are the AMD is quad pumped too, so the actual memory bandwidth figures are exactly the same. The other problem comparing them is, as you say, the FLOPS figure, which again is pretty pointless for comparing AMD (Xbox) to Nvidia as they are very different architectures and perform differently. The FLOPS figure is highly dependent upon performing very particular tasks which are not necessarily the same tasks used to generate screen graphics. They also do not take into account the benefits of specific architectural enhancements that may make one card do specific operations faster than the other; again, which card does what better is nearly impossible to quantify. The long and short of it is that you can only really tell if one card is better than the other by actually using a real game to compare whether one card performs better, and even then that is highly dependent on how well optimized the game is for either card.
  • Actually that's not true. If you're talking about PC CPUs, you are correct. However, with GPUs the purpose is only visuals. Clock speeds and APUs are there to achieve floating-point operation peaks. Various GPUs use different methods to achieve as high a FLOP performance as possible, which is why clock speeds are not what you look at as far as GPUs are concerned. It's the FLOP performance that gives you the ballpark. But that's only half the story. An RX 580 has a good FLOP figure at 5.8, but its memory is terrible. Hence why that card cannot do 4K anywhere close to a GTX 1070/1080 Ti or the One X. That's why comparing a One X to an RX 480/580 is plain laughable and wrong, and either shows a complete lack of understanding or is simply Sony fans throwing around misinformation to make themselves feel better.
  • The GTX 1070 is 15-30% faster than the AMD Fury X. AMD Fury X spec: https://www.google.co.il/search?q=FURY+X+SPEC&safe=off&client=firefox-b&...
  • I'm not quite sure you know what you're doing, or you're just blatantly being ignorant. Why are you comparing the Fury X to the GTX 1070???? The Fury X has a tiny 4GB of VRAM; the GTX 1070 has double the VRAM. It's completely obvious the GTX 1070 is going to beat a card with half the available memory.
  • First, the Fury X = HBM memory. The GTX 1070 is also faster at 1080p (around 2GB of mem). "First, there's the fact that out of the fifteen games we tested, only four could be forced to consume more than the 4GB of RAM. In every case, we had to use high-end settings at 4K to accomplish this. Of the three games profiled in this article, two of them were GameWorks titles with heavy, Nvidia-specific optimizations and did not run well on AMD hardware, even at resolutions where RAM was no issue" https://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fur...
  • I know what type of memory the Fury X has. You seem to read something and not understand it. That article you posted is from 2015!!! And as a result, APIs come into play, and the GTX 1070 isn't even available for comparison. Also note that Vulkan was not available to the Fury X. This is further the problem with PC gaming as opposed to console, and further proves the point as to why the One X handily outclasses an 8GB GTX 1070. https://tech4gamers.com/radeon-r9-fury-x-25-faster-than-geforce-gtx-1070...
  • Tweaktown http://www.tweaktown.com/tweakipedia/116/fury-vs-gtx-1070-battlefield-dx... https://www.youtube.com/watch?v=-t549QciGAM and also 3dmark https://videocardz.com/60265/nvidia-geforce-gtx-1070-3dmark-firestrike-b...
  • Battlefield uses more than 4GB of RAM, and we are back in a circle. The GTX 1070 will always outdo the Fury X at 4K, for example, because 4GB is not enough for 4K. Any game that has Vulkan available will run better at 1440p or lower on a Fury X than on a GTX 1070. The One X is a 4K system; as a result, the One X will run all games better than a GTX 1070 at high settings in 4K. Fact. It's really not that difficult to grasp.
  • http://comicbook.com/gaming/2017/07/05/xbox-one-x-benchmarks-revealed-ho... Can you guess what the X1X's main limitation will be? I'll give you a hint: it's the same thing holding back the PS4 Pro, and the thing that has typically been holding consoles back ever since we started trying to target 1080p 60fps performance. That's right: the CPU. The Xbox One X will still see some formidable bottlenecks. It's one thing to throw around a 6 TFLOP performance potential, guess at which games will run at 4K, and marvel at how great they look in our imaginations, but we have to keep in mind that there will be a lot of humongous, open-world, third-party games that aren't running on an engine tailored to take advantage of everything the X1X has to offer, and frame rate will always be CPU-bound. Do not expect games like The Witcher 3, Assassin's Creed Origins, or Anthem to run at native 4K 60fps. In fact, I'm not even sure we can count on Assassin's Creed running at 2160p at a stable 30fps without some tricks of the trade. The Digital Foundry video mentions Assassin's Creed Origins specifically, revealing that the game will utilize dynamic resolution and checkerboarding in order to reach its 2160p target resolution. This isn't such a bad thing, mind you, but it's worth keeping in mind so we don't get inflated expectations. https://www.gamespot.com/articles/xbox-one-x-performance-detailed-native... One thing is proven for certain: it's incredibly difficult for games to run 4K at 60fps. As a consequence, we're likely to see graphically demanding games use dynamic resolution or checkerboarding techniques to achieve higher resolutions, as is the case with the PlayStation 4 Pro. Not every game will run full native 4K, and fewer games will do so at 60fps. Xbox (Polaris-based GPU & CPU) <<<< Fury X << GTX 1070 <<<<<<<<<<< Vega/Volta
  • Exactly this.
  • Lol you won
  • No...
  • The Xbox One X's CPU is 40% more capable than the Xbox One's. It's been customized by MS. The 2013 Xbox One actually had a better CPU than the original PS4 as well, something not many people know. These benchmarks show that with no work whatsoever the Xbox One X runs any existing Xbox One game in 4K without any optimization AT ALL. Developers who are optimizing are getting amazing results. For example, Ark Survival running on the One X exactly the same as on a PC with a 1080 Ti. F1 2017 running 4K with the highest PC graphics settings. Titanfall 2 actually running 6K @ 60fps. For comparison, the PS4 Pro can only run Ark Survival at 720p 60fps and medium PC settings. That's the difference here. The One X has been designed as a powerhouse, matching $1500 PCs. You can make baseless claims off of YouTube videos if you want; the specs are there to see. Developers are now reporting just how powerful the box is, and we are seeing that with their games matching high-end PCs as they hit 4K with ultra PC settings in a variety of genres: open-world multiplayer, racing, and first-person shooters. Bethesda also announced that they are upgrading all their games to 4K and high PC settings on the One X, and they have come out and praised the One X as well. I don't know how much clearer it can be. Using Ubisoft after they made it clear they are using PS4 Pro code for Creed on the One X is poor from you. I honestly hope you don't work in the industry, because you don't know what you're talking about. At least DF will prove you wrong week in, week out, every time a new game hits the One X and matches high-end PC.
  • Lol... you changed your initial position from "it's not a 1070" to "it won't run 4K", and something about a 6870...
  • 6870 this is just an example
  • It's not what you've got, it's what you do with it. At the end of the day the battle will be won by whoever optimizes best.
  • That's pretty much irrelevant, because in actual games a console performs much better than a similarly specced PC. Not sure if you make yourself dumb on purpose.
  • A little more than an RX 480? Ahahahaha, you're a clueless clown, bro, as the Xbox One X destroys even two RX 480 cards in Crossfire; they're not even close. A REGULAR GTX 1080 doesn't compare or compete with the Xbox One X, as it's nowhere near as powerful, because a console will always provide double the performance of identical paper-spec PC hardware, and so the Xbox One X's 6.2 teraflops of GPU compute power are the equivalent of 12.4 teraflops of PC GPU compute power; that's just the way it works. So yeah, like I said, a regular GTX 1080 doesn't stand a chance against the Xbox One X, as this console's direct competition is the GTX 1080 Ti or Titan Xp; that's just the way it is, and I'll be glad to back that up with facts and an overall explanation of how what I've said is the truth. Check out the YouTube link for the Ark Survival developer interview, which explains how the Xbox One X can run 60fps at PC Epic settings while a PS4 Pro can only manage 30fps at medium settings; the vastly superior Xbox One X CPU is a big part of why 60fps gaming is possible on Xbox One even at the ultra-highest PC Epic settings, which not even an ultra-high-end Intel CPU with a GTX 1080 Ti is able to achieve, as that ultra-high-end PC setup averages about 45fps at 1080p with PC Epic settings; that's not a joke, that's the absolute truth.
THE XBOX ONE X GPU has a theoretical 6.2 teraflops of compute power, which like I said is the equivalent of roughly 12.4 teraflops of PC compute power, based on how consoles allow devs 100% unrestricted, to-the-metal, ultra-low-level access to the power of a console chipset within a closed-loop system. A PC, for example one with an ultra-high-end GTX 1080 Ti, might have 11.3 teraflops of power, but PC developers can only access or utilize roughly 60% of that power, which means they're only getting 6.7 teraflops of compute power from an 11.3-teraflop GTX 1080 Ti, because PC development doesn't allow ultra-low-level to-the-metal access to 100% of the chipset's power. When you take all this into consideration, you're able to finally understand that when devs compare the Xbox One X to the equivalent of a $1500 to $1800 high-end PC, they're totally correct; they're not bullshitting anyone, and the truth is being leaked out every day. BUT YA GOTTA REMEMBER THIS IS REALLY ******* OFF THE PC MASTER RACE COMMUNITY, BECAUSE 90% of them all have medium to medium-high-end hardware which will get absolutely wrecked by the true power and potential of the Xbox One X.
  • The difference is that we are talking about "today's" PC, not "tomorrow's upgraded PC". Consoles have always taken on more expensive PCs at the time of launch, but they date quickly. The new and heavily subsidized Xbox compares well now, but in the future it will lag far behind, because PC gamers constantly upgrade their hardware while console gamers are limited to software patches. It was the same with the PS4. It killed at release but lagged after that because it wasn't upgradeable. This is just the way of the world.
  • Actually that's where you're wrong again, because unlike PC hardware, whose performance depreciates over time, it's the exact opposite for a game console, AND ESPECIALLY ONE LIKE THE XBOX ONE X WITH A FULLY CUSTOM, highly modified, and insanely over-engineered chipset. This console won't even hit its true potential for 18 to 24 months after launch, as that's when developers will really get accustomed to the fully custom hardware and fully custom architecture and be able to take full advantage of its true capabilities. GAME CONSOLES ACTUALLY GET FASTER AND INCREASE IN PERFORMANCE OVER TIME, as opposed to PC hardware, which only decreases over time. So yeah, I disagree with what you've said, because the Xbox One X's true power and potential hasn't even been tapped by any of the currently announced E3 2017 titles such as Anthem or Assassin's Creed, and that's because those games have long been in development, well before the Xbox One X was even an idea, and so their engines were built and designed around the current-gen standard consoles, which are extremely limited in terms of power. That's why people gotta understand that all the games launching over the next 18 months have already been developed on the standard consoles and then ported across to the Xbox One X. But wait until after the system launches November 7th; that's when the Xbox One X will become the lead development platform, where games of the future will be built from the ground up to take full advantage of the hardware, and that's when we'll start to see mass-scale third-party native 4K 60fps games for the system, because they'll be developed using the Xbox One X as the lead platform and then scaled back across as ports to the standard Xbox One. That's when we'll start seeing just how crazy powerful this system really is, as it's gonna challenge at the top of the PC hierarchy for years to come. And no worries, THERE WILL NOT BE ANY NEW PS4 or NEW XBOX any earlier than holiday 2021 at the absolute earliest; that's just the way it is.
  • I couldn't care less about it if it's under 30fps -_- 30fps is too slow as it is, in ways.
  • Truly the best value in gaming. I'd get one this fall if I could afford one. I'll have to wait a couple of years to buy it.
  • And let the continuing stupid comparisons to a PC commence. There is no way to make an apples-to-apples comparison. The silicon in the Xbox One X is so custom and optimized for one purpose (gaming) that comparing it to a PC is silly.
  • Although developers who work on these machines are actually coming forward: "If you think about it, it's kind of equivalent to a GTX 1070 maybe and the Xbox One X actually has 12GB of GDDR5 memory. It's kind of like having a pretty high-end PC minus a lot of overhead due to the operating system on PC. So I would say it's equivalent to a 16GB 1070 PC, and that's a pretty good deal for $499". Read more: http://www.tweaktown.com/news/58011/ark-dev-xbox-one-pc-gtx-1070-16gb-ra...
  • Actually, it's like having a high-end graphics card in a PC; the actual CPU part of it is pretty ho-hum. But that matters a lot less here.
  • The CPU isn't ho-hum at all, dude. The single biggest damage-control statement from these pissed-off PC master race jerks is "oh, but it's the same as the PlayStation 4 Pro because it's got the same CPU", but that's totally not true. The PlayStation 4 Pro's CPU is absolutely bottlenecked for sure, but the XBOX ONE X CPU is totally different, dude. OK, seriously, I'm not sure why people keep trying to say it's the same CPU, because it's absolutely not; the new Xbox One X CPU is a revamped, fully custom-engineered, evolved form of Jaguar which also has attributes of Ryzen baked into the CPU at the hardware level, and it's also got the added benefit of full DX12 baked in at the hardware level, not just some software version but rather a hardware implementation. On the CPU side, there's been much conjecture that Scorpio would feature AMD's new Ryzen technology - something we thought unlikely, owing to manufacturing timelines, not to mention Microsoft telling us last year that the new console would feature eight CPU cores. All signs point to the upclocked Jaguar cores we find in Xbox One, and Scorpio's CPU set-up is indeed an evolution of that tech, but subject to extensive customisation and the offloading of key tasks to dedicated hardware. "So, eight cores, organised