FreeSync vs G-Sync: Which is best for you?

PC monitors (Image credit: Windows Central)

The decision boils down to one key choice: AMD or NVIDIA. The best graphics cards from both companies will let you play the latest games at varying levels of detail and resolution, but at their core, they all do the same thing.

When it comes to taking it up a gear, you have FreeSync, and you have G-Sync. Again, similar features, but the former is supported by AMD and the latter by NVIDIA.

If you already made your graphics card choice and you don't plan on switching, the decision is probably already made. But if you're still in the buying process, let's break down a few key factors that can help you decide.

What are FreeSync and G-Sync?

HP Omen 32

FreeSync allows the graphics card and connected monitor to communicate with one another, adjusting the monitor's refresh rate to match whatever the graphics card is currently outputting. The result is a stable, super smooth experience with a variable refresh rate. It's not supported by every AMD graphics card, though newer ones are fine.

G-Sync synchronizes the monitor and graphics card thanks to an on-board module inside the display. The pair will only run as fast as the slower of the two can handle, depending on where the bottleneck happens to be. The net benefit is that the monitor displays every single frame produced by the GPU, resulting in a super smooth experience.

Ultimately, both deliver a very similar result: silky smooth gameplay. The two companies approach it a little differently, but the outcome is much the same.
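If you're curious what "matching the refresh rate to the GPU" actually buys you, here's a toy Python sketch. Everything in it is illustrative: the function name, frame times, and 60Hz interval are made up for this example and are not measurements from real hardware.

```python
import math

def frames_shown(frame_times_ms, refresh_interval_ms=None):
    """Return how long each rendered frame stays on screen.

    With a fixed refresh interval (e.g. ~16.67 ms at 60 Hz) and vsync on,
    a frame that misses the refresh deadline waits for the next tick, so
    some frames linger for two refreshes (perceived as stutter).
    With adaptive sync (refresh_interval_ms=None) the display refreshes
    when the frame is ready, so on-screen time matches render time.
    """
    if refresh_interval_ms is None:
        # Adaptive sync: each frame is shown for exactly its render time.
        return list(frame_times_ms)
    # Fixed refresh: round each frame up to the next refresh tick.
    return [math.ceil(t / refresh_interval_ms) * refresh_interval_ms
            for t in frame_times_ms]

# Frame render times wobbling around 18 ms (~55 fps) on a 60 Hz panel.
times = [18, 17, 19, 16, 18]
print(frames_shown(times, refresh_interval_ms=16.67))  # uneven pacing: stutter
print(frames_shown(times))                             # even pacing: smooth
```

Running it shows the problem: on the fixed 60Hz panel most frames get held for two refreshes while one slips through in a single refresh, so on-screen timing jumps around even though the GPU's output is fairly steady. With adaptive sync, the display simply follows the GPU's cadence.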

The big difference between the two is that NVIDIA's G-Sync requires monitors to embed a proprietary module, which raises the retail price. We'll look at that more below.

Cost differences: Graphics cards

GTX 1070

There's an assumption that AMD cards are better for gamers on a budget, and while that's not entirely wrong, you can't just take it as fact and leave it there. At the time of writing, NVIDIA and AMD are targeting different markets. AMD is firmly toward the budget end, with the RX 460 through to the RX 480, while NVIDIA covers a wider range, from the entry-level GTX 1050 all the way up to the new GTX 1080 Ti.

That doesn't mean AMD's cards are bad; far from it. The RX 480 is very capable, as is the RX 470. And if you shop around, you'll find an 8GB RX 480 for around, or not much over, $200.

By contrast, the highest-end NVIDIA card starts at $699, though it's far more powerful than anything AMD has to offer right now. At a similar price to the AMD cards, you'll find the GTX 1050 Ti and the GTX 1060, both excellent choices for 1080p gaming at mid-to-high graphical settings.

AMD has its RX Vega cards coming later in 2017, but right now its highest-end card sells for the price of a mid-range NVIDIA one. So if price is important, going with AMD will save you some dollars.

More: The best graphics cards for gamers

Cost differences: PC monitors

AOC Agon G-Sync

AOC Agon G-Sync (Image credit: Rich Edmonds / Windows Central)

When it comes to choosing a PC monitor, there's no ambiguity like there is with graphics cards: G-Sync panels cost more. There's no way around that, since NVIDIA requires its proprietary module to be present in every G-Sync display.

FreeSync, by contrast, has a clue in its name: it doesn't generally add to the cost of the monitor. Manufacturers could slip in a small premium, but compared to a G-Sync equivalent, the saving will be significant.

For example, you can get FreeSync in even a cheap gaming monitor from the likes of Viewsonic for $140. For G-Sync, you're looking at around $400 for something like a 24-inch monitor from Dell. It's the same across the board. The HP Omen 32 we recently reviewed supports FreeSync on a 32-inch 1440p panel for $400; there's no way it would be priced so low if it had G-Sync.

The bottom line

On the whole, the biggest benefit to enjoying FreeSync gaming is cost. As things stand, you'll spend less on a current generation graphics card and less on a PC monitor to use it with. You may sacrifice some ultra power and things like 4K gaming, but you'll have silky smooth, crisp graphics and money in your pocket.

If your gaming tastes are for the bleeding edge, the latest, greatest, most powerful of everything, then you're going to get an NVIDIA card. You're already going to spend more on that, and then you'll spend more on a G-Sync monitor to go with it.

The end experience between the two systems is very similar, but how much you've spent to get there could be very different.

Richard Devine
Managing Editor - Tech, Reviews

Richard Devine is a Managing Editor at Windows Central with over a decade of experience. A former Project Manager and long-term tech addict, he joined Mobile Nations in 2011 and has been found on Android Central and iMore as well as Windows Central. Currently, you'll find him steering the site's coverage of all manner of PC hardware and reviews.

  • I find nVidia's approach of requiring their chip to be used a good step toward ensuring a consistent experience for G-Sync. If you leave it to the monitor manufacturers to decide on the hardware used for FreeSync connectivity, there's a greater chance of substandard components being used.
  • On the other hand, Nvidia's proprietary tech drives up the cost. FreeSync is an optional part of the DisplayPort standards, so implementing it means it still has to follow the guidelines to meet the specs. As far as I recall, and correct me if I'm wrong, but FreeSync requires little or no extra hardware. I've seen no evidence that FreeSync is in any way subpar compared to G-Sync.
  • Really? You do realise that the Freesync monitors have the same scaler chip in them too, right? Just that Nvidia put their own in to make sure it meets the required standard whereas AMD leave that to the monitor manufacturer. This means some of them choose to save money by putting in a poor scaler chip and the result is a bad Freesync range and experience. Freesync is great, so long as you buy a decent Freesync monitor which will be a comparable cost to a Gsync monitor. Just because you CAN buy a budget Freesync monitor, doesn't mean you should. You will get what you pay for.
  • I never recommended going right for a budget monitor. Regardless of quality, a comparable FreeSync monitor will almost always be lower in price due to Nvidia licensing fees and proprietary scaler. You're paying more just to go team green, in most cases by several hundred dollars.
  • NVidia have already stated there are no licencing fees, that's a myth. The only difference in the price is the cost of the scaler chip. Freesync or Gsync is beside the point. The scaler chip is crucial, so buy a monitor with a good one. Just make sure you check carefully with Freesync monitors because the cheap ones everyone mentions as an 'advantage' are not where Freesync shines. Don't get me wrong, I love my AMD kit, but only expensive Freesync = Gsync whilst cheap Freesync just = cheap.
  • You keep talking about cheap FreeSync. But AMD FreeSync has strict standards like Nvidia. Those cheap ones are just adaptive refresh rate monitors, without the official FreeSync approval. I have noticed that some of them will still throw out the FreeSync name without AMD's certification, but that doesn't make them official. There is a list of official FreeSync monitors on AMD's site. Adaptive refresh rate is a new open standard that FreeSync is built on, I think mostly just requiring a certain quality to get AMD's stamp of approval, but not 100% sure. Also, AMD cards support regular adaptive refresh rate technology too, even if it's not AMD-certified FreeSync. Another bonus for AMD in my book, not closing doors just cause their name doesn't get on something...
  • Without being a gamer, and having no idea what either of these were, I figured they were some kind of cloud file sync software.
  • Interesting, I hadn't considered the name play could suggest that. I guess when you're in on something you might easily assume others are in the know, also. Had a bit of a chuckle, thanks!
  • Just another note: you can CrossFire two RX 480s for $400 and get really good performance in 1080p and 4K.
  • This works in theory but the experience of multi-gpu generally brings other issues, like micro-stutter and game compatibility. It's usually better to spend a comparable amount on a better single gpu.
  • You look too happy in that picture Richard. I definitely need to pick up that Omen display you got.