
300Hz gaming-laptop displays are coming ... but we're not sure why

This year at IFA 2019, companies like Acer (Predator Triton 500) and ASUS (ROG Zephyrus S GX701) are pushing new displays that can hit 300Hz. That sounds crazy, especially since we just started getting laptops with 240Hz screens – something I called dubious in my recent Razer Blade 15 Advanced review.

Wait ... 300Hz? But why?

This HP OMEN Obelisk will crush your puny gaming laptop, and it still can't hit 200 FPS.

But for the life of me, I'm not yet getting why a 300Hz display is necessarily a good thing. Sure, hitting 250 frames per second (FPS) or higher in your favorite game sounds incredible, but you need a GPU powerful enough to make that happen. Unfortunately, we don't have that technology today (and won't anytime soon).

For example, my current home rig is an HP OMEN Obelisk desktop. It's a massive beast with an Intel Core i9-9900K, liquid cooling, an NVIDIA RTX 2080 Ti, 64GB HyperX Fury DDR4-2666, and a RAID 0 PCIe NVMe setup for storage. It's also paired with HP's new 240Hz display. Guess what? In most first-person shooters I can't get it over 160 FPS. Even turning down graphics quality to low doesn't approach 200 FPS, let alone 240 FPS.

That's all on a $3,000 desktop PC with basically the highest-end hardware you can get in 2019. What's a laptop got that will make a 300Hz display worth it? Is there any evidence that you can even perceive a difference between 240Hz and 300Hz, even if it were possible to game at that level? I'm unconvinced.

Companies like Razer are offering a 240Hz display for just $100 more – and that includes a processor bump, making the actual cost of the display nearly nil. So really, there's no downside to getting a fast display. But at the same time, it all seems gimmicky. The idea that these laptops are positioned for esports is also doubtful. While I'm sure some pro gamers use laptops, it doesn't seem like a widespread practice, nor a very practical one.

So where'd this even come from?

The Razer Blade 15 Advanced with its 240Hz full HD panel can't really game at 200+ FPS. (Image credit: Windows Central)

How this all came to be, though, is something I can explain. Companies like ASUS and Acer don't make displays; they buy them from those who do, like Samsung, Sharp, or LG (to name the big ones). Those display companies are competing for bulk orders, so they must come up with something "hot." In late 2019, it's 300Hz displays. It's the same reason why "suddenly" everyone has 15-inch 4K OLED screens this year – Samsung made a boatload, and laptop companies bought them to have something new. (It's also the same reason why there are no 13-inch OLED displays yet. Samsung just hasn't made any.)

If users don't have to pay a premium for a 300Hz display, it's hard to get mad at the technology. It's just a cool thing to have, with few real downsides. But go into buying your next 300Hz-display laptop with some reality on your side. You won't be gaming anywhere near 200 FPS, let alone 300, which makes 300Hz just another fancy sticker.

Don't fall for the hype.

Daniel Rubino

Daniel Rubino is the Executive Editor of Windows Central, head reviewer, podcast co-host, and analyst. He has been covering Microsoft here since 2007, back when this site was called WMExperts (and later Windows Phone Central). His interests include Windows, Microsoft Surface, laptops, next-gen computing, and arguing with people on the internet.

25 Comments
  • I would think trying to hit 300Hz does have a downside with battery life. You see a downside in 4K with battery life, but with 4K, I think the trade-off is worth it. Movies set their frame rate at 24 FPS. Going to 300Hz might have the benefit of repeating the 24 frames more evenly across the refresh cycle to prevent judder effects. By the way, your gaming computer is completely sick! That is basically a super-computer.
  • Good point about battery life. It does have some effect, though not as huge as I thought. But on gaming laptops, battery life is rarely good as is so every bit counts. And thanks, that HP OMEN is a ton of fun to use. It'll see a lot of Gears 5 action in the coming weeks.
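To put numbers on the 24 FPS judder point above, here's a quick, purely illustrative Python check of how evenly film cadence maps onto common refresh rates (an integer refreshes-per-frame ratio means even cadence; a fractional one means uneven pulldown judder):

    # How evenly does 24 FPS film content map onto common refresh rates?
    # An integer refreshes-per-frame ratio means judder-free cadence; a
    # fractional ratio means uneven frame repetition, like the classic
    # 3:2 pulldown at 60 Hz.
    FILM_FPS = 24

    for hz in (60, 120, 144, 240, 300):
        ratio = hz / FILM_FPS
        cadence = "even cadence" if ratio.is_integer() else "uneven pulldown"
        print(f"{hz:>3} Hz: {ratio:5.2f} refreshes per film frame -> {cadence}")

Running it shows that 120Hz, 144Hz, and 240Hz divide 24 evenly, while 60Hz and 300Hz do not, so any judder benefit depends on the exact refresh rate rather than raw speed.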
  • I want your job, enough cash to buy that rig and enough time to use it... So far I haven't found a way to reconcile both 😂
  • Yeah, I don't feel so bad about using adblockers anymore, having just read that. :)
  • You know what's just as stupid? Samsung's 8K QLED and the Galaxy foldable phone that's been cancelled. It's stupid for companies to take a huge plunge on technology just to be first, when the latest thing is still in its infancy. 300Hz ... yeah, why?
  • Some things do not provide any benefit. 8K is not perceptibly different from 4K on a small display, for instance, as the pixels are already imperceptibly small on the 4K display and the human eye doesn't have the resolution to tell the difference. It would only be a downside to do it on a phone or laptop, as you'd impact battery life and increase processing power requirements. It does make sense on very large displays, though, especially if you're going to get close to them. So it has a purpose at least. And you'll note, the only Samsung 8K displays are very large. 300Hz is one of those things. There are other aspects of the panels that are maybe worth it, but there won't be any appreciable difference between a 240Hz and a 300Hz display as far as the frame rate is concerned. The human eye and the visual cortex cannot process information anywhere near that quickly. But the Galaxy Fold is not one of those things. Someone has to go first. The only way you move something out of infancy is by making stuff, learning from it, and improving the processes.
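The acuity argument above can be sanity-checked with rough angular-resolution arithmetic. A common rule of thumb puts 20/20 acuity at around 60 pixels per degree; the panel sizes and viewing distances below are illustrative assumptions, not measurements. A quick Python sketch:

    import math

    # Rough pixels-per-degree estimate for a display of a given diagonal,
    # resolution, and viewing distance (16:9 panels assumed).
    def pixels_per_degree(diag_in, h_px, v_px, dist_in):
        width_in = diag_in * h_px / math.hypot(h_px, v_px)
        px_per_in = h_px / width_in
        # width in pixels of one degree of visual angle at this distance
        return 2 * dist_in * math.tan(math.radians(0.5)) * px_per_in

    for name, diag, hpx, vpx, dist in [
        ('15.6" 4K laptop at 20"', 15.6, 3840, 2160, 20),
        ('15.6" 8K laptop at 20"', 15.6, 7680, 4320, 20),
        ('65" 4K TV at 30"', 65, 3840, 2160, 30),
        ('65" 8K TV at 30"', 65, 7680, 4320, 30),
    ]:
        print(f"{name}: {pixels_per_degree(diag, hpx, vpx, dist):.0f} px/deg")

On these assumptions, the 4K laptop panel already lands near ~100 px/deg, well past the ~60 px/deg rule of thumb, while the big TV viewed up close is where 8K actually pushes past it, consistent with the comment above.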
  • Hello, we have actual hands-on experience with 480Hz -- it makes browser scrolling 4x clearer than 120Hz, and 8x clearer than 60Hz. There are more noticeable benefits going from 240Hz to 480Hz than from 4K to 8K. Here's some photographic proof: https://i.imgur.com/PxvWkcK.jpg
  • We have actual hands-on experience with 480Hz -- in non-gaming contexts too -- and it makes browser scrolling 4x clearer than 120Hz, and 8x clearer than 60Hz. There are more noticeable benefits going from 240Hz to 480Hz than from 4K to 8K. Google "480Hz" for some photographic proof of high-refresh-rate benefits. It does a disservice to the industry to discourage high Hz, as virtual reality scientists have determined that the vanishing point of the diminishing-returns curve lies far beyond 1000Hz. The human eye can't see the refresh rate directly, but there are other benefits like reduced motion blur as well as reduced stroboscopic effects -- though you have to double the refresh rate to see significant improvements, e.g. 60Hz -> 120Hz -> 240Hz -> 480Hz -> 1000Hz.
  • Galaxy Fold wasn't canceled. It is being released today actually.
  • Meanwhile, I see almost no improvement past 60Hz.
  • You start running into the limitations of the human visual system at around 90Hz. Anything over that is really more about providing stability, making sure you don't dip down into something noticeable, than providing any intrinsic benefit.
  • The diminishing-returns curve definitely exists, but its vanishing point doesn't disappear until far beyond 1000Hz -- in fact, a lighting study showed some humans could detect PWM stroboscopic effects up to roughly 10,000Hz. That's why they recommended electronic ballasts for fluorescent fixtures go to 20,000Hz, so that practically 100% of the population sees no side effects. It was also discovered that the same problem happens with displays: the refresh rate produces a bottleneck that makes it impossible to solve motion blur and stroboscopics at the same time. (A) If you keep the refresh rate low but strobe instead, you get flicker and/or stroboscopic/phantom-array effects like the one pictured here: https://i.imgur.com/PxvWkcK.jpg This can be full-screen in games, e.g. staring at the crosshairs and seeing the world step-step past, rather than a continuous analog blur. You can test it yourself at www.testufo.com/mousearrow .... (B) If you add artificial GPU motion-blur effects to the frame rate, it solves the wagonwheel/phantom-array effects, but now you've got extra motion blur above and beyond human vision limits. That motion blur still exists even if you turn on motion-blur reduction (e.g. LightBoost, ULMB, etc.). It is not possible to solve (A) and (B) simultaneously without dramatically raising the frame rate and refresh rate. That is a big area of research nowadays.
    Look at how 1080p and 4K are no more expensive than 1024x768 once was. Tomorrow, true 480Hz (non-faked) may be a cheap add-on, say 20 or 50 years from now, and one can also appreciate better-looking browser scrolling (8x sharper than 60Hz) and smoother movements (mouse arrow, panning, dragging, turning, playing, etc.) with fewer steppy effects. And heck, you could even software-adjust the blur if you preferred blur (since adding blur is easier than removing blur without degrading quality). Why dismiss future progress completely?
    Also, a strobe backlight mode (e.g. LightBoost, ULMB) has some major disadvantages: it adds lag, adds flicker, adds stroboscopics, reduces contrast ratio, degrades color quality, and amplifies the visibility of microstutter (the lack of motion blur makes microstutters easier to see on a fixed-Hz display). Even 479 FPS at 480Hz still produces human-visible stutters in tests -- for example, 2,000 pixels/second is about 4 pixels per frame at 480 FPS, and one framedrop causes an 8-pixel jump. It still shows up as a minor tic. VRR would solve that, as FreeSync/G-SYNC are good at eliminating framedrop stutter since frames are not delayed a full refresh cycle on variable-refresh displays. But one magical thing we've found is that ultra-high Hz actually makes VRR obsolete: the framedrop granularity is so tiny (1ms) that any frame rate on an ultra-high-Hz display looks like a variable-refresh-rate display. And the bonus is that ultra-high frame rates (e.g. 1000 FPS) look just like ULMB, except without strobing.
    Even 0ms-GtG OLED has motion blur, because fast GtG doesn't mean fast MPRT. GtG and MPRT are not the same thing -- OLEDs always have slow MPRT response unless strobed, even if they have fast GtG pixel response. That's why all VR OLED displays have to strobe (flicker) in order to eliminate OLED motion blur. Strobeless ULMB is possible with ultra-high Hz, assuming pixel response is removed as a bottleneck.
    ULMB flashes refresh cycles for 1ms, but you can produce exactly the same zero-blur (CRT-clarity) effect with a thousand unique 1ms frames with no black periods in between; so "strobeless ULMB" (aka 1000 FPS at 1000Hz) looks absolutely magical. You get essentially zero blur, full brightness, full color, full HDR, no flicker, and far fewer stroboscopic effects -- basically blurless sample-and-hold, CRT clarity without the use of any impulsing technology. So, to many of us, ultra-high FPS at ultra-high Hz is kind of a big holy grail in display technology. Several technological paths (including GPU solutions) have finally emerged, and the refresh rate race is a slow one (e.g. Hz doubling every 10 years). While not all use cases warrant 1000Hz, the bottom line is that human-visible benefits exist. Just as 4K and 8K were once five- or six-figure expensive, and now 4K is a few hundred dollars at Walmart, there comes an era where true, genuine (non-faked) ultra-high Hz can be a cheap incremental add-on. We've retina'd resolution, but we are still far from retina refresh rates, and retina refresh rates may someday be cheap this century -- the technological path is now lab-tested to show benefits and will progress.
    Dismissing 1000Hz is just like dismissing yesterday's 30 FPS vs. 60 FPS debate without understanding the details. The gaming sites, VR sites, etc. all know elements of this and don't dismiss ultra-Hz anymore, but writers at mainstream sites (like this one) understandably sometimes still need to catch up on recent (post-2015) research material. Many glance at a display at a convention and don't see the appropriate demonstrations showing the actual benefits, like the see-for-yourself tests among the 30 selectable at the top of TestUFO, and which real-world situations they apply to (e.g. browser scrolling, heavy-foliage action in a game, fast RTS panning, etc.). My goal is that within 5 years, no writer writes "THIS IS DUMB" taglines (like what the author wrote above) for topics like these without a single paragraph on the scientific research into the actual benefits of ultra-Hz, to let readers judge for themselves. It's better to be nuanced: instead of "300Hz is worthless", say something like "Pushing 300Hz in a laptop may have limited benefits for the majority, even though researchers have confirmed the diminishing-returns curve vanishes far beyond 1000Hz given appropriate variables, as seen in [link] and [link]" -- balanced statements that respect the science rather than leading readers to assume the ultra-Hz benefits are nonexistent. It's like saying 4K is completely useless in 1990, when today 4K is mass-market and we recognize more benefits than expected (super-crisp text, retina displays, nice Netflix video, etc.) now that we can afford it. Reputable writers at well-regarded, popular sites need to avoid spreading fake science as much as possible in today's world, and while it is sometimes a losing battle, Blur Busters plays a very important (and successful) advocacy role in some quarters as bleeding-edge early birds with actual scientific laboratory experience.
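For anyone who wants to reproduce the per-frame step and framedrop numbers in the comment above, here's a small illustrative Python sketch (the 2,000 px/s motion speed is the example used there):

    # Per-frame motion step on a sample-and-hold display, and the jump
    # size when a single frame is dropped (the step doubles).
    def frame_step_px(speed_px_per_s: float, hz: float) -> float:
        return speed_px_per_s / hz

    SPEED = 2000  # pixels per second, as in the comment above
    for hz in (60, 120, 240, 480, 1000):
        step = frame_step_px(SPEED, hz)
        print(f"{hz:>4} Hz: {step:5.2f} px per frame, "
              f"{2 * step:5.2f} px jump on one framedrop")

At 480Hz that works out to about 4.2 px per frame and an 8.3 px jump on a dropped frame, matching the comment's figures; at 1000Hz a dropped frame is only a ~4 px jump, which is the "framedrop granularity" argument for ultra-high Hz.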
  • There are other advantages to higher-Hz displays, such as anti-ghosting, pixel response, etc., as well as reduced latency. If it can pump out a frame that much sooner, it's all an advantage at the end of the day, especially in games like CS:GO. Don't forget, too, that higher refresh rates also help reduce things like screen tearing for non-adaptive-sync screens, or for those who prefer not to use it. Just saying, there's a lot more innovation that comes with these displays than just the frame rate. Plus, eventually GPUs will catch up, and you'll likely own a display longer than a GPU. I understand this doesn't apply to laptops, but laptops generally do come first with displays. Laptops were around long before people started moving from CRTs.
  • But the point is the human eye and brain simply cannot process visual information at that speed. Like I said in my standalone comment, I can understand the need for input response to be as fast as possible from keyboards, mice, etc., but when it comes to what you see on the screen itself, I doubt anyone but a minuscule percentage of people would see even the slightest difference above 120Hz. It's vaporware: technology that has no meaningful purpose to exist.
  • It helps to prevent tearing: with a higher refresh rate, the screen is more likely to match each frame of movement with a moment when the screen refreshes.
  • Actually, Lee, the human eye does not need to process information that fast. There are side effects that are generated, such as phantom-array effects and wagonwheel effects. Photographic proof of how easy it is to tell 240Hz vs 480Hz apart when you pay attention to motion:
    https://i.imgur.com/PxvWkcK.jpg We have readers who get painful effects from things like DLP color rainbows (~360Hz to ~1440Hz), and we have readers who get motion-blur eyestrain. Tests of true ultra-high Hz show really excellent ergonomic benefits -- kind of like a blurless, flickerless, strobeless, full-brightness CRT with HDR. Motion-blur reduction on LCDs currently requires flicker (to emulate a CRT), but solving display motion blur without any flicker/impulsing/strobing at all requires emulating analog motion. The only way to emulate analog motion accurately is to avoid the humankind invention of using a series of static images to represent moving images. But since analog-motion (framerateless) displays are impossible, the closest we can get is ultra-high frame rates at ultra-high refresh rates, which turn out to be an excellent facsimile of analog motion -- assuming other weak links are removed. Many VR scientists, people at big companies such as NVIDIA, and other parties are working on ultra-high Hz. It's early -- like talking about 4K in 1990 -- while everyone is only doing 120Hz or 240Hz, but many researchers are now finally at the actual show-and-tell stage (real laboratory ultra-high-Hz displays) that stops people from laughing: writing research papers, science, and studies, and understanding the weak links, etc. For more reading, see https://www.blurbusters.com/1000hz-journey
  • Why not use the newer plasma tech that Samsung and Philips or Toshiba had, which was 600Hz for TVs with better color? Technology has obviously advanced in the 10 years it's been gone; it could be amazing now!
  • Would be awesome for 3D back in the day
  • The 600Hz was the field rate (essentially temporally dithered dots -- that "plasma noise") -- not full 24-bit color per Hz. So those were the exaggerated-Hz days. Today, the ASUS laptop is 300 genuine Hz, although my opinion is that it will need faster than 3ms pixel response time to fully milk the benefits of 300Hz. That technological progress is coming, as 1ms IPS and 0.3ms TN panels are now on the market. Pixel-response measurements are a bit of a nebulous matter, though, since many readers accuse manufacturers of exaggerating. The VESA GtG measurement is from the 10% to 90% transition (e.g. for a black-to-white transition, that's the stopwatched transition from very dark gray to very light gray) -- as explained at https://www.blurbusters.com/gtg-vs-mprt Tomorrow's true 600Hz will be vastly superior to yesterday's 600 fake Hz, since it's full 24-bit refresh cycles per Hz on an LCD. The problem with getting LCDs to 600Hz is that pixel response will need to be roughly ~0.5ms fairly uniformly across the entire GtG matrix (all color-transition combos) to prevent diminishing benefits at 600Hz. Motion blur follows "1 pixel of motion blur per millisecond of persistence, per 1000 pixels/second of motion." Mathematically, 240Hz still creates about 4 pixels of motion blur per 1000 pixels/sec (1/240 sec = 4.2ms of motion blur for a sample-and-hold display). As displays get higher resolution and closer to retina, getting CRT clarity gets harder and harder without going to unobtainium frame rates and refresh rates. That pretty much guarantees a long march toward retina refresh rates this decade. Either way, understanding how pixel response bottlenecks a refresh rate is something of an education matter.
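The persistence math in this comment is easy to check: on a flickerless (sample-and-hold) display each frame is held for 1/Hz seconds, so eye-tracked motion smears by roughly speed times persistence. A quick illustrative Python check:

    # Sample-and-hold persistence and the resulting motion blur at an
    # eye-tracking speed of 1000 pixels/second ("1 px of blur per ms").
    SPEED = 1000  # pixels per second
    for hz in (60, 120, 240, 300, 480, 1000):
        persistence_ms = 1000 / hz
        blur_px = SPEED * persistence_ms / 1000
        print(f"{hz:>4} Hz: {persistence_ms:5.2f} ms persistence, "
              f"~{blur_px:4.1f} px of blur at {SPEED} px/s")

240Hz comes out to ~4.2ms of persistence and ~4 px of blur per 1000 px/s, exactly the figures cited above; even 1000Hz still leaves about 1 px of blur at that speed.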
  • Is there any physical, scientific, actual reason for displays to be any higher than 120Hz? Keyboard/mouse/input response I can understand but not display. It really does seem like this is a total gimmick with no real world benefits whatsoever... ...cue that 1 pro-g4m0r who will post saying he can see things in 600Hz.
  • So you can understand a keyboard.... but you can't understand how the thing you actually look at might need a high refresh rate? 120 isn't high enough. 240 is getting closer. 300 is truly perfect and the human eye at that point wouldn't be able to tell beyond this.
  • A great explanation of the visual benefits beyond 120Hz can be found in an article titled "Blur Busters Law: The Amazing Journey To Future 1000Hz Displays":
    https://www.blurbusters.com/1000hz-journey There's also photographic proof of human-visible differences in tests, though you do have to get pixel response under control (a tiny fraction of a refresh cycle) and keep doubling refresh rates to get quite noticeable visual benefits. While flicker is gone beyond 120Hz, there are indirect effects such as
    -- reduced stroboscopic effects, wagonwheel effects
    -- reduced motion blur without the need for strobing
    (double the frame rate & Hz = half the motion blur on flickerless displays). VR scientists, including those at NVIDIA, have confirmed that the vanishing point of the diminishing-returns curve lies far beyond 1000Hz.
  • "Wait ... 300Hz? But why?" Because we can!
  • I ask why not?