How to overclock your PC monitor — why and what that means
Did you know you can overclock your PC monitor? No? Well, you can! Here's how — and why.

Overclocking your monitor serves essentially the same purpose as overclocking your processor: to get a little more performance. It involves increasing the refresh rate beyond the stock rating, meaning the display can draw more frames on the screen per second.
For most people, most of the time, it's not an essential thing to do, but that doesn't make it any less of a neat trick. Here's what you need to know.
What is refresh rate?
By definition, just this:
The number of times per second that an image displayed on a screen needs to be regenerated to prevent flicker when viewed by the human eye.
So simply put, a 60Hz monitor will refresh the image 60 times every second, a 75Hz monitor will do it 75 times a second, and a 144Hz monitor will refresh 144 times per second. Generally speaking, especially in computing and PC gaming, higher is better. The human eye and brain are often said to see at around 24fps, but we're capable of perceiving far higher refresh rates than that.
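To make those numbers concrete, the refresh rate maps directly onto how long each frame stays on screen. A quick bit of arithmetic (plain Python, nothing monitor-specific):

```python
# Frame time is the reciprocal of the refresh rate.
for hz in (60, 75, 144):
    frame_time_ms = 1000 / hz
    print(f"{hz}Hz -> a new frame every {frame_time_ms:.2f}ms")

# 60Hz  -> a new frame every 16.67ms
# 75Hz  -> a new frame every 13.33ms
# 144Hz -> a new frame every 6.94ms
```

Even a modest bump from 60Hz to 75Hz shaves over 3ms off every frame time, which is where the slightly smoother feel comes from.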
Can any monitor do it?
Potentially. Whether you can increase the refresh rate depends on your specific panel. Even in otherwise identical monitors, the display panels inside are not all created equal. You may read stories of significant increases on your particular model, but your own might not be so lucky.
For example, I use a BenQ RL2455HM and there are many successful reports of folks overclocking it from 60Hz to 75Hz. My own monitor can only go up to 70Hz.
Update: As pointed out in the comments, take every step to check your panel specs first. Not only are panels not all created equal, but some manufacturers may have applied a factory overclock already. In that case, the risks are much higher if you attempt to push the limits even further.
It's very much one of those your mileage may vary situations.
How to overclock your monitor
It's actually a very straightforward process. You can use either a third-party tool called CRU (Custom Resolution Utility) or the software from AMD, NVIDIA, or Intel. All are free, so we'll look at each.
CRU - Custom Resolution Utility
This is one of the older methods, and it may not be compatible with all GPUs or with integrated Intel graphics. It does, however, seem to work very well with AMD graphics. You can download it here, and once it's installed, changing your refresh rate is a straightforward process.
- Open CRU
- You'll see two boxes: one for detailed resolutions and one for standard resolutions.
- Under detailed resolutions, click add.
- Click on timing and change it to LCD standard.
- Change the refresh rate to something above the standard value; an increment of 5Hz is a good start.
- Click OK.
- Reboot your PC.
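Before or after making changes in CRU, it can be handy to see exactly which modes your graphics driver is exposing to Windows. Here's a minimal sketch using the third-party pywin32 package (my own suggestion, not part of CRU), which lists every mode reported for the primary display:

```python
# List every display mode Windows reports for the primary monitor.
# Requires the third-party pywin32 package: pip install pywin32
import win32api

modes = set()
i = 0
while True:
    try:
        dm = win32api.EnumDisplaySettings(None, i)
    except Exception:
        break  # pywin32 raises once there are no more modes to enumerate
    if dm is None:
        break
    modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print(f"{width}x{height} @ {hz}Hz")
```

If your new CRU entry shows up in this list after the reboot, the driver has accepted it and it should be selectable in Windows.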
Next you'll need to change the refresh rate in Windows 10; these steps apply to whichever overclocking method you use.
- Right-click on the desktop and select display settings.
- Click on advanced display settings.
- Scroll down and select display adapter properties.
- On the monitor tab, select the desired refresh rate from the drop-down box.
Put simply, if it worked, the monitor won't go black. If you went too high, the screen won't display anything and will revert to the old settings after 15 seconds.
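You can also double-check what Windows thinks the current mode is from a script. A small sketch, again assuming the third-party pywin32 package is installed:

```python
# Print the resolution and refresh rate of the active display mode.
# Requires pywin32: pip install pywin32
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"Current mode: {dm.PelsWidth}x{dm.PelsHeight} @ {dm.DisplayFrequency}Hz")
```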
Using AMD Radeon settings
If you're using an AMD GPU, you can achieve similar results using the AMD Radeon Settings application. Follow these steps:
- Right-click on the desktop and select AMD Radeon settings.
- Click on the display tab.
- Next to custom resolutions, click create.
- Change the refresh rate to your desired level.
- Click save.
Reboot, then use the steps above to switch to your custom refresh rate.
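As an aside, if you'd rather flip between refresh rates from a script than dig through dialogs each time, the same Windows API the settings panels use is scriptable. A rough sketch with pywin32 (my assumption; this isn't part of Radeon Settings), applying the change for the current session only:

```python
# Switch the primary display's refresh rate for the current session.
# Requires pywin32: pip install pywin32
import win32api
import win32con

TARGET_HZ = 75  # hypothetical target; use a rate your panel accepts

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
dm.DisplayFrequency = TARGET_HZ
dm.Fields = win32con.DM_DISPLAYFREQUENCY

# Flags of 0 apply the mode dynamically without writing it to the
# registry, so a reboot quietly restores your old settings.
result = win32api.ChangeDisplaySettings(dm, 0)
if result == win32con.DISP_CHANGE_SUCCESSFUL:
    print(f"Now running at {TARGET_HZ}Hz")
else:
    print(f"Driver rejected {TARGET_HZ}Hz (code {result})")
```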
Using NVIDIA control panel
The steps for NVIDIA users are broadly similar to AMD's, the main difference being that NVIDIA's controls look a bit more utilitarian!
- Right-click on the desktop and select NVIDIA control panel.
- Expand the display menu.
- Click change resolution and then create custom resolution.
With the NVIDIA control panel you can test your created settings before applying them. Once you're happy, reboot and follow the steps above to switch over.
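That test-before-apply idea exists at the Windows API level too: the CDS_TEST flag asks the driver whether a mode would be accepted without actually switching to it. A minimal sketch with pywin32 (bear in mind it only validates modes the driver already knows about, so it's a sanity check rather than a true overclock test):

```python
# Ask the driver whether a refresh rate would be accepted, without
# actually changing the mode. Requires pywin32: pip install pywin32
import win32api
import win32con

def mode_would_work(target_hz):
    dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    dm.DisplayFrequency = target_hz
    dm.Fields = win32con.DM_DISPLAYFREQUENCY
    # CDS_TEST validates the mode without applying it.
    result = win32api.ChangeDisplaySettings(dm, win32con.CDS_TEST)
    return result == win32con.DISP_CHANGE_SUCCESSFUL

print(mode_would_work(75))  # True if the driver would accept 75Hz
```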
Using Intel graphics
Intel's own graphics control panel will also let you create custom resolutions and refresh rates fairly easily.
- Open Intel HD graphics control panel.
- Select display.
- On the left click on custom resolutions.
- Enter your width, height and desired refresh rate.
- Click add.
If your display can't go any higher, you'll be prompted and will have to either quit or try again. If it locks in successfully, reboot and follow the steps above to make sure you've got the new mode selected.
How to verify your overclock
To test and make sure that your new refresh rate is working as it's supposed to, there's a great online test that you can run. Visit http://www.testufo.com/#test=frameskipping in your browser and follow the steps on screen. It will recognize the refresh rate you've got selected for your monitor at that time.
What you basically do is take a photo of the moving graphic with a slow shutter speed. If everything is working as it should, the shaded boxes in your (probably poor quality) photo will form a single unbroken line, meaning you've been successful. If the boxes are separated, you're getting skipped frames.
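The photo trick is just arithmetic: while the shutter is open, the display draws shutter-time times refresh-rate frames, and each frame lights the next box in the row. A quick sketch of the expectation (the 1/25s shutter speed is an assumed example):

```python
# Rough expectation for the TestUFO frame-skipping photo: the number of
# consecutive lit boxes should equal exposure time * refresh rate,
# with no gaps, if no frames are being skipped.
refresh_hz = 75        # the rate you're testing
exposure_s = 1 / 25    # assumed camera shutter speed

expected_boxes = exposure_s * refresh_hz
print(f"Expect roughly {expected_boxes:.0f} consecutive lit boxes")  # ~3
```

A gap in that run means the monitor was sent a frame it never drew, i.e. a skipped frame.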
Not bulletproof and at your own risk
Just like overclocking a processor, all of this is done at your own risk. You should be able to experiment without blowing up your PC monitor, but there are never any guarantees, so be careful with your gear. Also bear in mind that the connection between your PC and monitor can have an effect, as can the resolution. You might be able to overclock at 720p, for example, but not at 1080p.
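The resolution point comes down to bandwidth. A display mode's pixel clock scales with width times height times refresh rate, plus blanking overhead, and the cable standard caps that clock; single-link DVI, for instance, tops out around a 165MHz pixel clock. A back-of-the-envelope sketch (the flat ~20% blanking overhead is an assumption; real timings vary):

```python
# Rough pixel clock estimate: active pixels * refresh rate * blanking
# overhead. This is why an overclock that fits at 720p can be rejected
# at 1080p over the same cable.
BLANKING_OVERHEAD = 1.20     # assumed ~20% extra for blanking intervals
DVI_SINGLE_LINK_MHZ = 165    # single-link DVI pixel clock limit

def pixel_clock_mhz(width, height, hz):
    return width * height * hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1280, 720, 75), (1920, 1080, 60), (1920, 1080, 75)]:
    clock = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clock <= DVI_SINGLE_LINK_MHZ else "exceeds"
    print(f"{w}x{h} @ {hz}Hz ~ {clock:.0f}MHz ({verdict} single-link DVI)")
```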
And as already mentioned, none of this is an exact science and your mileage may vary. It's hard to call this an essential thing to try, but if you like to tinker, there's nothing stopping you.
If you're an old hand at doing this, be sure to drop any tips or tricks into the comments below!
Richard Devine is a Managing Editor at Windows Central with over a decade of experience. A former Project Manager and long-term tech addict, he joined Mobile Nations in 2011 and has been found on Android Central and iMore as well as Windows Central. Currently, you'll find him steering the site's coverage of all manner of PC hardware and reviews. Find him on Mastodon at mstdn.social/@richdevine
-
I'm using a TV as my monitor. Can this still be done?
-
Yes, but many TVs have poor refresh rates compared to computer monitors. I usually only do this when there is an issue with the monitor. I used to have a 3-monitor VGA splitter and the refresh rate and resolution were not right, so I changed them manually and it worked much better. However, the splitter died after 6 months. I can't say that was the reason, but I would not do this to a pricey monitor.
-
Mine is capable of 240Hz in Game mode. Games are smooth and awesome at 55".
-
No, it's not. The max it will do is 165Hz, and probably only 60Hz if it's a telly. Anything higher is just interpolated.
-
First, CHECK YOUR PANEL SPECS! Many monitors that run at 144Hz in fact use 120Hz panels that are factory overclocked. I would be very careful about adding a further user overclock on top. Check your panel specs, not your monitor specs. This should REALLY be mentioned in the article. Please change it to avoid people causing damage.
-
Good point, well made. Even though we said clearly enough "at your own risk" :)
-
You know he is right, and "at your own risk" means nothing unless you can explain the risk and how to avoid it. AndyCalling's one comment explained it fine.
-
Don't do it. 1. It's not worth it. 2. In most cases the monitor will shut itself down or the electronics will blow up.
-
This was so much easier in Windows 7. On my age-old CRT screen I could actually boost the resolution above its supposed maximum of 1600 x 1200 by unchecking "hide modes this monitor cannot do".
-
Overclocking your monitor won't increase the resolution. All that will happen is it will be downscaled back to the maximum resolution of the screen. In effect it's just a type of antialiasing or oversampling.
-
True for "modern" monitors where the resoution is fixed, but not the case on old-school CRTs, which, in certain regards, are still better than the best new monitors.
-
Yep, and it was great too, because that slight bump meant I could get two Word pages on screen at once (it was a weird res, I think 1788x1341 from memory, or something like that), which was great for proofreading my school assignments. No games seemed to support the res though, so they were all in 1600x1200. The screen could also handle the res at 60Hz too. 21" Sony Trinitron CRT, I miss that monitor.
-
I don't miss the back of a CRT monitor... A 21" PC CRT was like 18-20" deep... such a monster when you look at modern monitors...
-
Waste of time, and very likely to blow the screen. Just pay the extra for a better screen in the first place; it's cheaper than having to replace a blown one!
-
But if I blow up my monitor then that's a great excuse to buy a new higher res screen!!!! Haha.
-
@Zeroplanetz, I think that when I tinker, subconsciously, I'm trying to break something so I have a "valid" excuse to upgrade. When I was a little kid, some time when my mother wasn't looking, I poured a glass of apple juice down the back of our old TV that took 30 seconds for the picture to appear and another 5 minutes to warm up to the point where it filled the screen (must have been one of the first color screens). Quick blue flash and that was that. "Oops. Sorry, mom. Guess we need a new TV. Did you know they now come in stereo?" I think I now do that to myself with computer components.
-
Haha, that's awesome. But I can't honestly say I've blown up any of my PC stuff yet, or a TV, from tinkering.
-
>But if I blow up my monitor then that's a great excuse to buy a new higher res screen!!!! Haha.
Or you live in a case like mine... my wife believes in "If it ain't broken, you can't replace it"... So find a way to "break it!"
-
I did that about two years ago when I was using a CRT monitor, but if you increase the refresh rate you can't increase the resolution. If you do, your screen flickers for around 15-30 seconds and then goes back to the normal resolution. By the way, I'm curious whether you can increase the refresh rate on a laptop screen.
-
Wow, I haven't thought of doing this since switching from a CRT to a LCD. I just tried it on my Philips 200P, connected via VGA, changing from the default 1600x1200x60Hz to 75Hz. I checked an old tech spec which said it could handle it. It worked, somewhat. It still displayed but was blurry. So I let it do the 15 second timeout back to the old setting, and will stick on 60Hz. Maybe someone will find something useful in doing this. And knowing that it is possible, I'll probably try it again. But I don't see much use out of it.
-
I'm still very happy with the 1440p Korean monitor I bought a couple of years ago for ridiculously little money. It has a Samsung panel and it overclocks up to 120Hz at 1440p. It only loses some brightness when pushed above 90Hz, so I usually compromise and keep it at that frequency. Thanks to dual-link DVI, input lag is really low too.
-
"The human eye and brain only see around 24fps" The human eye and brain do not see in fps. This is a myth that has been debunked many times. 24fps was chosen as the standard for movies as the minimum fps that they could get away with and this was done to save cost in film. The max our brain can process is still unknown. However many combat pilots have been tested to see how quickly they can identify a target and some were able to identify a target (make and model) at even 1/240th of a second when flashed on a screen.
-
Around 24fps is when the human eye starts to perceive fluid(ish) motion.
-
I can't believe I saw such a bullshit claim at a site like this.
-
There are a few programs that can overclock your video card... I remember on my last system (about 6 months ago), I was having a little issue with frame rate in one game: when a lot of bad guys were on the screen, it would slow down a hair (not major, but noticeable). I used one of those third-party programs and overclocked it by 10%, and it went through the game without even a single slowdown...
-
Check your facts, Richard. The human eye sees more than 24 frames per second.
-
True, but Richard is not really wrong either: the main reason that higher than 60fps looks better is because of tearing and the perception of jittered motion. That is, we don't actually see more than 25-35 images in a second, but our peripheral vision detects motion and is sensitive to big frame deltas at up to about 75-100fps. Above that, anything we see is effectively a strobe effect or a result of slightly stuttering graphics.

For example, if your graphics card is pushing a variable frame rate between 80 and 140fps and your panel is running at 120Hz, and the frame rates are not locked between them, you can get mismatched frames that yield an effectively MUCH lower frame rate for a few frames. You can't see exactly what happened, but your brain knows it didn't look good. The higher the refresh rate on the monitor, the higher that least common denominator, and so the better those low points look. Of course, the better solution is the newer monitors and graphics cards that support G-Sync and FreeSync technologies to lock the frame rates between monitor and graphics card, completely eliminating this problem.

Back in the CRT days, you really wanted them running at least at 75Hz, because below that people with good eyes could get headaches from actually seeing the flicker of the screen redrawing, especially out of the corner of the eye (peripheral vision is more sensitive to motion than the center of your field of view, presumably evolutionarily to pull your eye to the predator or prey moving at the corner of your perception). But LCD and LED monitors are persistent (the image doesn't fade between drawings), so that particular problem is no longer an issue.
-
Well, I have yet to see a valid investigation that backs up your claims. We keep our screens in our central field of view, and yet the difference between 24fps and 48fps is so damn obvious, so I don't think you can throw in the peripheral vision argument here. 24fps uses extensive motion blur trickery to be a comfortable minimum, because filmmaking is expensive as hell, and because frame rendering is expensive resource-wise for both CGI and videogames.
-
@Nekroido, what monitor do you have that goes that low? Standard is 60Hz. Some of the early 4k monitors over older HDMI would need to drop down to 30Hz to accommodate bandwidth limitations of the source video. But even 1 Hz would be fine for a fully static image on an LCD -- it's only motion that reveals the stuttering of too low a frame rate. I think we may be talking about different things...
-
My monitor is 120Hz, thank you very much. And you are grasping at straws, buddy, because I'm saying FPS, read FRAMES PER SECOND. I believe it is you who is talking about different things in one single conversation just to make a point. Uncool.
-
@Nekroido, I think you read a tone in my response that I didn't intend. I meant it literally, no sarcasm intended. Not trying to offend or even argue. We're all tech friends here. Yeah, 120Hz is much more common for a modern monitor. I was curious because you said people didn't like 24fps. Unless you're sitting in a movie theater (where film projectors do run at 24 fps), I don't believe you will find a monitor running that low.

The article is talking about boosting the frequency your monitor runs at (Hz) to increase the number of frames per second it can handle. A 60Hz monitor can display up to 60 frames per second. If your monitor is running at 120Hz, then the only time the FPS you get falls below that is when the computer/graphics card isn't able to output that. That's a very common problem -- most people focus on increasing the graphics power of their computer rather than the monitor, because above 60Hz the monitor's extra capability is not readily visible, except (as I noted earlier) in certain situations or to the peripheral vision of someone with really good eyes.

But when a game is running at, say, 100fps, the main reason it looks better than a game running at 60fps is not the extra frames in every second (you'd be hard pressed to distinguish them side by side), but because that's the average, and there are often brief periods where it drops lower. When those drop down to 15-20fps, that is VERY visible, especially if there is rapid motion. For just reading a Word document, you'd probably not notice. So, Richard isn't wrong in the specific words he used, but you're also right that it is good to have a faster refresh rate to support a higher overall FPS.
-
> movie theater (where film projectors do run at 24 fps)
They are NOT. You are really uneducated about the topic, so please stop, because I refuse to read this malarkey any further.
-
I stopped reading at "the human eye sees at 24fps"... SMH
-
Me too.
-
@PeterFnet and @Terepin, it depends what you're measuring. If you go to a movie theater, movies run at 24fps, and people don't generally complain about the low frame rate at the cinema. The main reason we think of 24fps as being miserably low in gaming is because it's a symptom of the game stuttering, and because if the monitor is running at, say, 60Hz, the least common timing denominator can be even less than 24 without G-Sync or FreeSync to keep them locked together (30fps on a 60Hz monitor means each frame from the computer gets exactly 2 cycles on the monitor, but at 24 some get 2 and some get 3, which is visibly shaky). But gaming FPS is also an average over a piece of a second, and that means there are fractions of seconds that are doing much worse. That's what you really think of as the problem. So yeah, we definitely want more than 24fps in video output from our computers, but Richard is not wrong that the human eye doesn't see much more than that.

It also depends some on the person (younger people and better eyes can see more) and where on your eye you're seeing it -- the center of your eye is filled with cones, which are excellent at resolving detail and color but poor at seeing fast motion; your peripheral vision is mostly rods, which are excellent at seeing motion and are more light sensitive, but poor at seeing color or detail (also why things go gray in low light at night).
-
This is interesting. :D
-
You can apply all these settings, but most monitors will give an error like "Source out of range" or something of the sort. The problem is that if you saved the setting, you may need to reboot into safe mode or something similar to get your display back.