Do you sacrifice PC game enjoyment to chase higher frame rates?

There's a general consensus within the gaming community (particularly on PC) that 60 frames per second (FPS) is the absolute minimum one should aim for to ensure a stable and enjoyable experience. Tools like Fraps can help gamers fine-tune their hardware and software to achieve such a result in benchmarks and test runs. But while chasing a stable stream of frames to our connected display(s), do we risk forgetting why we play games in the first place?
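As a rough illustration of what those benchmark numbers actually summarize, here's a minimal Python sketch that turns a list of per-frame render times into an average FPS and a "1% low" FPS. The frame-time data and the log format are invented for this example; this is not Fraps' actual output format.

```python
# Sketch: summarize per-frame render times (in milliseconds), the kind
# of data benchmarking tools log during a test run. Data is made up.

def summarize(frame_times_ms):
    """Return (average FPS, '1% low' FPS) for a benchmark run."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    # The '1% low': average FPS over the slowest 1% of frames,
    # a common way to quantify stutter that a plain average hides.
    worst = sorted(frame_times_ms, reverse=True)
    slice_len = max(1, len(worst) // 100)
    slow_avg_ms = sum(worst[:slice_len]) / slice_len
    return 1000.0 / avg_ms, 1000.0 / slow_avg_ms

# A mostly 16.7 ms (60 FPS) run with a couple of 40 ms stutters.
times = [16.7] * 98 + [40.0, 40.0]
avg_fps, low_fps = summarize(times)
print(f"average: {avg_fps:.0f} FPS, 1% low: {low_fps:.0f} FPS")
```

The point of the two numbers: a run can average close to 60 FPS while the occasional stutter drags the 1% low down to 25, which is exactly the kind of dip an FPS counter tempts you to keep staring at.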

Hardware ain't cheap

We feel for consumers chasing the latest iteration of processors (CPU) and dedicated graphics processing units (GPU), as both are seriously expensive, especially if you're seeking to game comfortably at a resolution of 1440p or 3840p (4K). A GTX 1080 (or two) can handle such resolutions at high to maximum settings, but you'll need to shell out a good few hundred for the card alone. Then you need to throw in a CPU that won't be a bottleneck, which is another few hundred.

GTX 1070

ZOTAC GTX 1070 AMP EXTREME (Image credit: Rich Edmonds / Windows Central)

The end cost quickly escalates, and it's easy to find one's budget eaten up by these two components alone, and that's before we've even considered a solid motherboard, 16GB/32GB of RAM, and SSDs. Investing such an amount in hardware makes any PC owner almost demand maximum performance, which is exactly what companies promise in their marketing and the slew of performance numbers they share. It's therefore understandable that you'd want to keep glancing at an FPS counter to check for dips.

Chasing the dream

The Witcher 3

When looking at the counter, however, are we at risk of becoming obsessed with obtaining a solid number of frames rendered each and every second? Until computers are generally so powerful (even mid-range builds) that they can output high-quality content in 4K at 60 FPS or higher, there will be those who continuously glance at this number to see whether their overclock needs another bump.

Jumping up and down between 30 and 60 FPS is noticeable for many, including me. That said, in titles like The Witcher 3 I never once felt short-changed when my PC fell below the glorious 60 FPS marker. The game looks so gorgeous and is so good that I'm too lost in the fantasy world to care. The same goes for something like Euro Truck Simulator 2, which is an excellent game (no, seriously), where frames vanish into another realm when playing online with thousands of others. Again, it doesn't bother me, since the game is incredibly demanding.

So, I'll admit that I chase 60 FPS performance levels, but if I'm already taxing my system to a degree I don't wish to push further, I'll turn off the counter and simply enjoy what I'm playing. While I agree that the higher this number, the better the overall visual experience (even if the difference becomes harder to perceive at higher rates), if I notice slight stuttering I'll turn down settings a notch and call it a day.

Are you part of the hunt?

I'm intrigued to learn what level of performance you chase, be it 60 FPS or even higher, and how determined you are to achieve it. Most importantly, do you find yourself sacrificing the experience offered by the title in question? Sound off in the comments.

Rich Edmonds
Senior Editor, PC Build

Rich Edmonds is Senior Editor of PC hardware at Windows Central, covering everything related to PC components and NAS. He's been involved in technology for more than a decade and knows a thing or two about the magic inside a PC chassis. You can follow him over on Twitter at @RichEdmonds.

  • I'm trying to get a 144Hz monitor, and I've been told that if frames drop around there, it's less noticeable than dropping from 60.
  • Not necessarily, it depends how much it drops by. It will also depend on the game you're playing (you're more likely to notice it in an FPS than in a turn-based strategy game, for example). Technologies such as G-Sync and FreeSync also help a lot.
  • Get a G-Sync/FreeSync 144Hz monitor and you will find framerate drops are acceptable. Of course, you don't want to drop under 60 much if you want it to look good, but the trick to making drops look less bad is syncing with the refresh rate.
  • Should the Xbox One X eliminate this problem for the masses?
  • Nope, it will still be a mainly 30fps console.
  • It should, but it probably won't.
  • Consoles are not PCs (where there are billions of combinations of hardware, drivers, user-installed services and software, etc.); devs just have to make sure they don't go over the CPU and GPU budget. If the reports of Titanfall 2 running at 6K on the XB1X are true, it should be fairly easy with most current-gen games. Games in the next two to three years will probably still need to run on the PS4 and XB1, so you won't need to worry about the XB1X's CPU. On the GPU side, you'll probably have two sets of resources, assets, physics, effects, etc. for the older console and the XB1X. It's up to the dev whether they go over the GPU budget and aim for better visuals over FPS.
  • Probably not, but not because of the console itself.
    You might have noticed it before: most TVs have awful input lag (the time it takes an image to actually show up on screen after being sent to the TV), even when they claim to be 100 Hz or something like that, and quite a few of them don't even really do 1920x1080 (try connecting a PC to one and you'll notice the edges of the image are cut off).
    What you end up with is a TV that claims to have, say, a 5 ms response time, but then has an input lag of 30 ms, which adds nearly two frames of delay at 60 FPS.
    That's why many newer TVs have a "game" or "PC" mode, which is specifically designed to reduce input lag. But that won't help if the TV doesn't turn it on automatically and you don't do it either. And it won't help if you connect everything through one of those home cinema or AV receiver thingies.
    So no, the Xbox One X won't help with that.
  • Apparently, the trick is to turn down the anti-aliasing settings, as this is hardware intensive but adds little to the game, since it's difficult to see the difference in fast-paced games where FPS matters the most.
  • Frame timing > FPS > at least 1080p resolution > graphics settings. That's how I do it. And never use ultra settings on newer games, even if it would be no problem at all. Ultra settings add mostly nothing to the game except great screenshots at 200% zoom.
  • I used to find myself worrying over frame drops because of my low-end hardware, which disrupted my experience. Now that I've made upgrades, I can maintain 60fps (and above in some games), so I no longer have to worry about that and can focus on the game and enjoy it :D
  • I've had a **** pc for my entire life, so I'm always having the worst experience - 10 to 25 fps on lowest settings. As long as it's playable, I'm ok.
  • I got over the entire speeds & feeds thing a LONG time ago.  I no longer care about chasing that.
  • My 4-year-old GTX 780 still does a great job running everything I want to play. Some examples: CS:GO on ultra settings @ 150fps+, GTA:V on high settings at 60-100fps. Basically anything that's less graphically intensive than GTA I can play at ultra with a shitload of fps. No need to "sacrifice game enjoyment to chase high fps". Are you playing on a potato?
  • Enjoyment = High Framerates.
  • It depends on the type of game
  • This discussion needs to be brought up more in gaming circles. Being a long-time PC gamer, as I've grown older I constantly oscillate between the "sit down and play" console world and the "never-ending tweak fest" that is PC gaming. The rather poor Xbox One performance this generation somewhat forced me back into PC gaming (that plus Steam deals and the backlog), but I think Xbox One X might be enough of an upgrade to finally swing me back over to being a primary console gamer again. Chasing 60fps (or higher) is a ridiculous game, and doesn't matter as much as is touted. I agree that when you go from playing 60FPS content back down to 30, it's jarring, but you adjust after about 10 minutes of play. For games where it really matters (FPS generally) 60 is the minimum, but for the majority of other genres, it's no big deal. Sure, I would love 60FPS across most every game, but I've realized the power required to guarantee that, especially in demanding titles, is very hard to achieve without constant fiddling with graphical dials. TBH, frame rate is taking a backseat to advanced image rendering techniques and resolution. I hate jaggies, and really like some of the advanced shadow (PCSS) and ambient occlusion techniques in the wild today. I'm looking forward to the pin-prick sharpness of the Xbox One X on my KS8000 (combined with HDR). So I'll be in the minority of vocal online communities, but I believe I represent the informed "majority" of people who just want to game. I would argue frame rate is largely ignored by 95% of the gaming community. Eye candy sells more than frame rate, and we already know that devs who want 60 FPS will find it for the games it's really needed in.
  • 60 is enough for me. If I can't get a game to run at 60, it's because it's a badly ported game, so I play it the way it is. I only "waste" a few minutes on graphics options at the start; when everything is set the way I like, I never return there, and I also turn off the frame counter. That last one can pull your attention from the game at the smallest frame drop.
  • Batman: Arkham Knight lol 🤢
  • Enjoyment is higher FPS
  • It's not like most people play in 4K, so a GTX 1070 or even a 1050 should be enough for gaming at 60 fps below that resolution.
  • Depends on the game... I tend to crank up the settings, then dial back until the minimum is around 40 with an average of around 60. Some older games will hit the full 144Hz, but between that and 100 the difference is indistinguishable. 100+ is great for driving games.
  • 60fps lock rules.
  • I love my 60fps, but I can't deal with variable frame pacing, so I usually try to keep frame drops to a minimum, whether it's 30, 45 or 60. Plus, I use a higher refresh rate monitor with FreeSync-like tech to help smooth things out.
  • Just thought someone should point out that 4K is 2160p, not 3840p. The number used is the vertical resolution, not the horizontal. 😊 3840p would actually be closer to the 8K standard of 4320p.
  • I'm happy with a 34" Ultrawide 3440x1440 - Everything in Windows @ 100% is the same size as it is on a 21-23" 1080P screen so it genuinely feels like you have a ton of space!
  • Reading this article, I don't think the issue is 60fps; it's trying to push 4K. It's a far better "experience" getting 60 frames at 1080p than 30 frames at 4K. As long as people aren't trying to push the limits of their system and instead just focus on what makes the game run well, your enjoyment should be sufficient.
  • I agree with this, and I would love the option for 1080P @ 60FPS over 4K @ 30FPS on Xbox One X. It will all depend on the developer though; obviously 4K @ 60FPS is the best, but it's not achievable in every single game. The new Assassin's Creed looks great, but it still dropped below 30FPS in the Xbox One X demo at E3 😭
  • Yeah the frame rate dips in that video were VERY noticeable, but I assume they'll be sorted before release.
  • I hope so; at least on the One X it should be rock solid... maybe not so much on the current Xbox One, based on the past few AC games. I'll hopefully be playing it on an X anyway.
  • Yeah, same, although I should probably play Syndicate beforehand, and finish Unity.
  • I recently bought a 1440p 144Hz monitor. I wasn't getting the FPS I wanted with my 980 Ti in some games (more or less expected, though), so I upgraded to a GTX 1080 when I found one on sale. When it comes to PC gaming I want an uncompromised experience, meaning getting the FPS I want while playing at max graphics settings/resolution. If that means I need to upgrade something, then I will (if I can afford to at the time). I'm willing to live with a lower FPS (i.e. not consistently hitting my monitor's refresh rate), but if it starts to affect gameplay too much, then I'll start to compromise on the graphics settings.
  • 2160p (4K) FTFY
  • I prefer max visuals at 30 over lower visuals at 60. If I get something like 40-50 fps, I won't compromise. Of course, some games, like CoD, play weird, so I need 60 there. It's one of the reasons I won't get a 4K TV/monitor anytime soon. I'm fine with 1080p.
  • As others have said, higher framerates = enjoyment. Research has already been done on this: people who play at 60FPS enjoy the game more. TBH, I have a hard time telling the difference, but my brother can tell instantly. Maybe the more I play at 60, the more likely I am to notice the difference going back to 30.
    Max details and 60+ fps
  • It depends on the game... I usually game at 4K res with close to 60 FPS, but I might bump down the res to 1440p or tweak the settings if I'm playing an intense FPS or something where framerate is important.
  • I run a GTX 550 Ti... I think that kinda answers the question of this article :D
  • Pong @ 60 FPS FTW
  • Higher FPS means less input lag in most first-person shooters. You may want to check the Battlenonsense YouTube channel and see how it impacts your gaming experience.
  • I've always been on the higher end of GPU hardware due to obtaining a 2560x1600 u3011 years ago; the power required to push solid framerates at that res mandated two high end cards (until recently). I'm waiting until the 4k 120hz IPS panels land in Q3 this year before upgrading again, but know I'll be chasing performance once again. Unfortunately SLI and Crossfire are broken by nature (unless the title uses AFR, like Blops3) and DX12 native multi adapter support is very niche. The second 1080Ti usually sits doing nothing. In 12 months I'm hoping that console ports will see PC players benefit from multi GPU support built into the game engine (PS4 pro uses two GPUs).
  • I used to, but my R9 290X is doing a pretty decent job playing games at 1080p on ultra settings with 60-plus frame rates. Granted, the games I play aren't that demanding, but I prefer gameplay over graphics. That said, they come hand in hand, as graphics help with immersive gameplay. Lately I started playing some Fallout 4 with the hi-res DLC. I've only run into issues because it wouldn't be a Bethesda game if it weren't a bug-ridden mess, and some of the gameplay mechanics make no sense whatsoever. This is where the PC modding community comes into its own. For example, the vanilla settlement mechanics vs. the Sim Settlement mechanics (a mod): it completely changes the gameplay. Then there's the true weather mod, another gameplay-changing mod (I have the storm-following ghouls set to 100% and 30 per spawn for the challenge lol) that provides an insane depth of realism (best on ultra graphics).
  • Nah, I tend to prefer the aesthetic qualities of a game. In other words, the more photorealistic it is, the better it is for me. When it comes to the overall appearance of a game, I've always prioritized graphical fidelity and physics over pure frame rates. Ideally, I think the sweet spot is at least 60 FPS for smooth gameplay. Beyond that, I couldn't really care less unless I'm wearing VR goggles (which I never have), in which case 90 is probably what you'd want at the very least. But I don't really care for the benchmarks that are all about FPS (that is, CPU benchmarks). If I'm gaming at 1080p and my hardware can maintain 60 FPS at max settings regardless of scene, then that's good enough for me.
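Several comments above weigh frame timing and refresh rates against raw FPS. For readers curious about the arithmetic behind those numbers, here's a tiny sketch converting common FPS targets into the per-frame time budget the renderer has to hit; the chosen targets are just illustrative examples.

```python
# Per-frame time budget for common FPS targets. Note how dropping from
# 60 to 30 FPS doubles the frame time (16.7 ms -> 33.3 ms), whereas a
# dip from 144 toward 100 moves it by only a few milliseconds, which is
# part of why drops at high refresh rates feel less jarring.
for fps in (30, 60, 144):
    budget_ms = 1000 / fps  # milliseconds available per frame
    print(f"{fps:>3} FPS -> {budget_ms:.1f} ms per frame")
```

Running this prints 33.3 ms for 30 FPS, 16.7 ms for 60 FPS, and 6.9 ms for 144 FPS.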