Intel XeSS steps into the ring alongside AMD FSR and NVIDIA DLSS, new details revealed

XeSS 4K screenshot (Image credit: Intel)

What you need to know

  • NVIDIA has DLSS 2.0, AMD has FSR, and soon, Intel will have XeSS.
  • All these acronyms represent upscaling solutions that use technological sorcery to maintain a game's visual luster while boosting framerates beyond what would otherwise be possible.
  • Intel's XeSS doesn't have a firm release window yet, but the more information Intel gives us about the tech, the closer we likely are to knowing when it will be available.

The 2022 Game Developers Conference (GDC) is going on right now and lots of big tech companies are providing details about their latest projects. Among the participants is Intel, which is showing off new XeSS content and sharing fresh factoids.

Among the nuggets of new info: XeSS aims to upscale from 1080p to 4K and to work cross-platform on a generic basis, so it doesn't need to be specially optimized for specific titles. Its SDK will be open source as well.
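Temporal upscalers in this class typically consume a game's low-resolution color buffer along with depth, motion vectors, and the camera jitter applied that frame, then hand back a higher-resolution image. As a rough illustration only (Intel's XeSS SDK wasn't public at the time of writing, so every name below is hypothetical rather than Intel's actual API), a generic engine-side integration might look something like this:

    # Hypothetical sketch of a generic temporal-upscaler integration.
    # None of these names come from Intel's SDK; they only illustrate the
    # per-frame inputs an upscaler like XeSS, DLSS 2.0, or FSR 2.0 consumes.
    from dataclasses import dataclass

    @dataclass
    class FrameInputs:
        color: object           # low-resolution rendered frame
        depth: object           # depth buffer at render resolution
        motion_vectors: object  # per-pixel motion relative to the last frame
        jitter: tuple           # sub-pixel camera offset applied this frame

    class GenericUpscaler:
        def __init__(self, out_width: int, out_height: int, quality: str):
            # The engine renders internally at (output / scale factor) and
            # the upscaler reconstructs the full-resolution image over time.
            self.output = (out_width, out_height)
            self.quality = quality

        def evaluate(self, frame: FrameInputs) -> object:
            # Stand-in for the vendor's upscale pass, which accumulates
            # samples across frames to rebuild detail at output resolution.
            raise NotImplementedError("vendor-specific upscale pass")

Because those inputs are buffers nearly every modern engine already produces, a "generic" integration along these lines is what lets an upscaler work across titles without per-game tuning.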

The upscaler will come in five flavors: ultra performance, performance, balanced, quality, and ultra quality. Ultra performance has the maximum scaling factor of 3.0, while ultra quality sits all the way down at 1.3. Intel says XeSS will be able to top out at higher scaling ratios than DLSS and FSR. Naturally, NVIDIA and AMD still have advantages of their own.
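For a sense of what those factors mean in practice: the scaling factor divides the output resolution on each axis to get the internal render resolution (assuming a per-axis factor, as with DLSS and FSR; only the 3.0 and 1.3 endpoints were quoted here). A quick back-of-the-envelope check:

    # Back-of-the-envelope render resolutions for a 4K output target.
    # Assumes the scaling factor applies per axis, as with DLSS and FSR.
    # Only the 3.0 and 1.3 endpoints were quoted; the intermediate modes'
    # factors weren't given, so they're omitted here.
    TARGET_W, TARGET_H = 3840, 2160  # 4K output

    factors = {"ultra performance": 3.0, "ultra quality": 1.3}

    for mode, f in factors.items():
        w, h = round(TARGET_W / f), round(TARGET_H / f)
        rendered = 100 / (f * f)  # share of output pixels actually rendered
        print(f"{mode}: renders at {w}x{h} (~{rendered:.0f}% of the pixels)")

At the 3.0 factor, the GPU renders roughly one output pixel in nine (a 1280x720 internal frame for a 4K target), which is where the big framerate headroom comes from; at 1.3, it renders about 59% of them.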

XeSS (Image credit: Intel)

You can check out the nitty-gritty technical details of XeSS over at Intel's site, which has plenty of content exploring the ins and outs of the tech.

If you're wondering what NVIDIA Deep Learning Super Sampling and AMD FidelityFX Super Resolution are (since they're a big part of the discussion around Intel XeSS and what it brings to the table), they're similar framerate-boosting solutions with their own unique designs, perks, and drawbacks. You can learn more at our AMD FSR vs NVIDIA DLSS breakdown. And note that AMD FSR just got a showcase for its FSR 2.0 upgrade that'll fully support Xbox Series X & S, so get ready for the supersampling competition to heat up.

Robert Carnevale is the News Editor for Windows Central. He's a big fan of Kinect (it lives on in his heart), Sonic the Hedgehog, and the legendary intersection of those two titans, Sonic Free Riders. He is the author of Cold War 2395. Have a useful tip? Send it to robert.carnevale@futurenet.com.

6 Comments
  • The OP doesn't mention which GPUs can support XeSS upscaling, so here goes:
    Intel Arc Alchemist (Xe-HPG)
    Intel Xe-LP integrated graphics (11th-generation mobile)
    AMD RX 6000 (RDNA 2)
    NVIDIA RTX 30-series (Ampere)
    NVIDIA RTX 20-series (Turing)
    NVIDIA GTX 10-series (Pascal)
    Of particular interest: RDNA 2, which means Xbox for sure and PS5 maybe.
    Intel cards will have specific machine learning hardware to support XeSS, and the Xbox Series X|S have (different?) ML hardware. The next question is thus whether XeSS on Xbox can use that ML hardware. That's particularly relevant to the Series S. Now to see who else comes up with their own upscaling solution. Epic? Microsoft? Sony? MS might add theirs to DirectX.
    Hmmm...
  • FSR 2.0 has been added to the Xbox GDK. Like 1.0, it's open source under the MIT license. Supposedly it takes studios less than three days to implement in titles that already have DLSS working. I doubt Sony has much capital left to spend on R&D for an upscaler given the amount they just spent on Bungie. It would be diminishing returns for them anyway, as Sony only has the PS5 and PS4 that could take advantage of any upscaling technology they develop. They sure as heck won't be spending much on older consoles lol. Epic? I don't think they'd get into the fray with a standalone upscaler; if anything, they'd rather focus on improving Unreal Engine. Plus, FSR 2.0 can be added via a plugin. I do think Microsoft could create a derivative of XeSS and FSR, if only to create some sort of standard for Xbox and Windows PCs. They could also leverage Azure for ML if they wanted to add that to the mix. I do think Apple will want to get in the fray. Google? They don't have a good track record when it comes to products and services (long term).
  • Google probably has an upscaler buried somewhere in Stadia, not that it helps anybody. XeSS is supposed to be open source along with FSR 2.0 and NVIDIA's second-best offering.
    With all those algorithms out in the open, cooking up in-house options is a lot easier, so it might tempt some of the bigger players. Just speculation. But there's a lot of "not invented here" syndrome out there, which is why I wonder about Sony. (And they're hardly broke. They started the year with $10B for investments and acquisitions, so they still have a few bucks to spend.) They practically invented NIH syndrome. Likewise Apple, who could use it in the APPLE TV, at a minimum. Pretty much anybody with a gaming engine or SDK would benefit from an embedded software upscaler or two. Anything that frees up cycles is worth looking into, so the approach is going to get used all over and not just in gaming. Specifically in CGI. I'm reminded of how BABYLON 5 saved money by using AMIGAs for its CGI. So I should probably rephrase and say anybody with image rendering software can use these kinds of upscaling algorithms. (If they're not already.) Desktop rendering is a thing, after all. This could get interesting.
  • Good points. One other candidate I think will be Valve, as they have a lot of pull with studios, and an embedded upscaler could be part of a Steam Deck GDK. Lowering the processing per frame would drastically improve battery life. That also enables them to branch back into living room 'consoles' should they wish. Indeed, this could get very interesting.
  • So this is a third-party software solution, standalone from specific GPU driver software? Interesting! I never could make NVIDIA's scaling solution work, so I might try this one. But is it only for upscaling 1080p to 4K?
  • No.
    It is variable.
    HD to 4K is just the most likely use case.
    And yes, it is *mostly* software based, but it can also use dedicated ML hardware on Intel cards, making them more useful than other GPUs. Intel isn't doing this out of kindness. 😉