Radeon will support next-gen HDR displays in 2016

AMD R7 265 (Image credit: Windows Central)

AMD has announced that High Dynamic Range (HDR) displays will be fully supported in 2016, both by select graphics cards already on the market and by those released next year. The company is banking on the technology to ramp up the visual experience for gamers.

HDR displays will be the next big advancement, AMD employees explained in a recently released video that touches on how the company plans to improve the quality and standard of display devices. According to AMD, current displays can reproduce only a fraction of the luminance range the human eye can perceive.

It's not all about brightness, however. With HDR comes a new color standard, Rec. 2020, which covers 75.8% of the CIE chromaticity diagram, compared with the 35.9% covered by today's sRGB. That dramatically increases the range of visible colors a screen can reproduce.
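
For a rough sense of what that gamut jump means, here's a minimal back-of-the-envelope sketch (not from AMD's video) comparing the two gamut triangles using their standard published primaries in CIE 1931 xy coordinates. The article's 75.8% and 35.9% figures describe coverage of the full chromaticity diagram, and the exact percentages depend on how that diagram's area is measured; this only illustrates the relative size of the two triangles.

    # Compare the sRGB (Rec. 709) and Rec. 2020 gamut triangles in CIE 1931
    # xy chromaticity coordinates, using the standard published primaries.
    def triangle_area(p1, p2, p3):
        """Area of a triangle from three (x, y) points via the shoelace formula."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

    SRGB    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
    REC2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

    srgb_area    = triangle_area(*SRGB)      # roughly 0.112
    rec2020_area = triangle_area(*REC2020)   # roughly 0.212

    print(f"Rec. 2020 triangle is ~{rec2020_area / srgb_area:.1f}x the area of sRGB's")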

AMD expects these displays to hit the market in 2016, and the company is making sure its GPUs are ready for the richer visuals in games and movies. As well as future cards, there are a select number of options in the R9 300 series that are HDR compatible.

Source: Twitter

Rich Edmonds
Senior Editor, PC Build

Rich Edmonds is Senior Editor of PC hardware at Windows Central, covering everything related to PC components and NAS. He's been involved in technology for more than a decade and knows a thing or two about the magic inside a PC chassis. You can follow him over on Twitter at @RichEdmonds.

32 Comments
  • Time to throw out my old TV
  • I don't really see what's wrong with the TV/monitor I have. If I'm watching a movie, skin tones look accurate, the sky looks accurate, and so does everything else.
  • We'll find out. People didn't necessarily see an issue with CRT televisions. I survive just fine on a 2008 720p TV, but I can still tell when I look at a good 1080p display that I'm due for an upgrade. This tech could also be something that REALLY shows its benefits on monitors, when you're much closer to the screen. Maybe TVs use something similar already. The only way to find out is have the stuff released.
  • Heh, my parents with their 4k Smart TV, 6 HDMI ports, no apps installed, DSL WiFi, connected to a cable box and watching mostly SD channels stretched to fit.
  • It was different with CRT because the colours and contrast on CRT actually were a **** of a lot better than LCD and plasma; it's only OLED and IPS displays that are significantly better. I imagine it will be better. I guess it'll kind of be like listening to uncompressed audio through decent headphones: you don't realise what's missing until it's there.
  • The reason you don't feel like you're missing much is because part of mastering photos and films is taking into account the fact that you're limited in luminance. A lot of times when I'm grading a shot for television I'm taking the top half of the luminance and just rolling it off/smushing it together. So the highlight ping off of a car windshield is "white", the highlight on someone's face is also "white", and everything in between those two is just "white" as well. However, in the real world you can effortlessly tell the difference between the glint on a car windshield, which can be blinding, and the highlight on someone's cheek, which is just "white". When you get TVs up to 500, 600+ nits, then that ping off of the car windshield is noticeably brighter while the face and skin tones still look good. But yes, just the new modern LCDs which can at least reproduce Rec. 709 well are already a huge improvement over the early ****** LCDs.
  • But if the color space changes, doesn't everything else have to change too? Or is it just converted in software from RGB to HDR?
  • It'd be nice to list which cards will or do support it as of now.
  • "As well as future cards, there are a select number of options in the R9 300 series that are HDR compatible." That's what they've said right now.
  • I just bought the R9 390x. I suppose it will be supported.
  • Why is AMD cheaper than Intel?
  • Because Intel wants more profit.
  • Because Intel produce better products.
  • Better GPUs? Yeah ******* right hahaha
  • Well they usually do 2 updates a year to CPU arch. 1 in spring/summer & a revision in the fall
  • Even when AMD was the king of the hill they were cheaper because they were more into volume dealings, and generally ran a tighter ship than Intel did... These days AMD is far from the best parts maker out there (aside from their ATI graphics division that they purchased, which still puts out decent products), and even at a discount they can't sell product. Heck, except for the cheapest of processors (which won't do much anyway), Intel is often CHEAPER than AMD on a price/performance basis. It is really quite frustrating.
  • You mean Nvidia, right? I never look at AMD cards simply because of their extreme power requirements.
  • Intel doesn't produce GPUs. AMD CPUs are cheaper than Intel because they are old and slow. AMD hasn't released a true desktop processor off of a new architecture iteration since 2012 or 2013. They skipped it for their last 2 hardware revisions, only pushing HTPC/budget stuff on Piledriver and Excavator.
  • Among other things, Intel had better power usage, better thermals, and overall better performance over power consumption and heat radiation. Basically Intel is more environmentally friendly in the grand scheme of things. Visit anandtech.com and read an article comparing the two companies. There are plenty there, and most give the same message, though in recent times, AMD has improved. Hasn't quite caught up to Intel though.
  • "As well as future cards, there are a select number of options in the R9 300 series that are HDR compatible." --I take it this also includes the high-end Fury stuff, which came out with the 300 series, but isn't branded as part of it, right?
  • Assuming so.
  • Who knew my being color blind could save me money! :)
  • My TV is just fine. My Panasonic Viera may be a 2007-2009 TV, but it has 1080p.
  • Resolution, of course, is not the only thing in the world.
  • Xbox one support?
  • Almost definitely not.
  • Just another way for TV manufacturers to sell more TVs that we don't need, like 3D and 4K and curved TVs. Gimme a 1080p OLED flatscreen TV and I'll be set for the next 10 years.
  • I have a 28" 1200p monitor that would beg to disagree!  1080p looks plenty crisp until you hit ~20-24" at a normal viewing distance.  After that 1080p is fine for movie watching and games (anything with motion), but it gets hard on the eyes for still images and text.
  • This is pretty exciting! I am looking to replace my current monitor with a TV with a wide color gamut (not quite Rec. 2020), but the issue has been getting Windows to support wide color, and cards to support it... at least this will knock out one of the two! Also, quick note: Rec. 2020 is supposed to be slightly beyond the scope of what the human eye is capable of seeing. More likely they are doing one of the other 'next gen' standards that are not quite there, but are supported by 4K Blu-rays and much easier to process.
  • I'm still using my 2007 Hyundai 21-inch CRT TV.
  • While I welcome all improvements in image quality, are we still going to do computing on what is essentially a super laggy experience with 60Hz input and output? In contrast, a 120Hz refresh rate is such a flawless, smooth experience; once you see it, you'll never go back. I'd love to see 120Hz become the new standard on ALL panels and display manufacturers go crazy on the marketing.
  • Actually, a lot of 4K TVs can do that in 1080p mode, just maybe not with the response time desired.