
Upgrading from NVIDIA's GTX 1070 GPU to GTX 1080 Ti: Is it worth it?

NVIDIA's GTX 1070 is one of the best graphics card options, but its GTX 1080 Ti is an absolute beast. Or that's what we're led to believe, anyway. For less than the cost of a Titan Xp you get performance that comes very close, and on paper at least, the GTX 1080 Ti is a sizeable jump forward from the GTX 1070.

I just upgraded from the 1070 to the 1080 Ti, so I was interested to see just how big a leap forward it was.

See GTX 1080 Ti at Amazon

From GTX 1070 GPU to GTX 1080 Ti

The particular card I'm using is a fairly standard affair. It's not the Founders Edition, but it is a similar reference design model from Zotac with the blower-style cooler. It has no factory overclock or any of that fanciness. So you get 3,584 CUDA cores, 11GB of GDDR5X memory and a 352-bit memory bus.
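For a sense of what that 352-bit bus means in practice, here's a minimal sketch of the peak memory bandwidth calculation, assuming the stock 11Gb/s effective data rate of the card's GDDR5X:

```python
# Peak memory bandwidth from the bus width and the effective GDDR5X data rate.
# The 11 Gb/s per-pin figure is the stock spec assumed here.
BUS_WIDTH_BITS = 352
EFFECTIVE_RATE_GBPS = 11  # effective data rate per pin, in Gb/s

bandwidth_gbs = BUS_WIDTH_BITS * EFFECTIVE_RATE_GBPS / 8  # bits -> bytes
print(f"Peak memory bandwidth: ~{bandwidth_gbs:.0f} GB/s")  # ~484 GB/s
```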

Full NVIDIA GTX 1080 Ti specs

I needed to upgrade the power supply in my Alienware Aurora R5 first, since Dell only provided a 450W unit. A 600W minimum is recommended for the GTX 1080 Ti, but I went a step further with the EVGA Supernova G3 750W.
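As a rough sanity check on those wattages, here's a minimal power-budget sketch. The GPU and CPU figures are the nominal rated board power and TDP for the parts mentioned; the allowance for the rest of the system is an assumption for illustration, not a measurement.

```python
# Rough PSU headroom estimate using nominal ratings, not measured draw.
GPU_BOARD_POWER_W = 250   # NVIDIA's rated board power for the GTX 1080 Ti
CPU_TDP_W = 65            # Intel Core i7-6700 TDP
REST_OF_SYSTEM_W = 100    # motherboard, RAM, drives, fans (assumed allowance)

estimated_load_w = GPU_BOARD_POWER_W + CPU_TDP_W + REST_OF_SYSTEM_W

for psu_w in (450, 600, 750):
    headroom_w = psu_w - estimated_load_w
    print(f"{psu_w}W PSU: ~{headroom_w}W headroom over an estimated {estimated_load_w}W load")
```

On those rough numbers the stock 450W unit leaves very little margin, which is why the supply had to go first.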

So, on to some benchmarks. Comparisons were all made at 1440p on maxed-out settings in each of these games. The GTX 1070 was never pushed as a card for 4K gaming, and I use a 1440p monitor with my PC, so it's the fairest comparison I could make. The rest of the PC contains an Intel Core i7-6700 processor and 32GB of RAM.

Game | GTX 1070 | GTX 1080 Ti
Gears of War 4 | 67.2 FPS avg | 109.1 FPS avg
Rise of the Tomb Raider | 71.2 FPS avg | 113.1 FPS avg
GTA V | 62.7 FPS avg | 73.7 FPS avg
PUBG | 41.6 FPS avg | 68.9 FPS avg
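To put the table into relative terms, here's a quick sketch that works out the percentage uplift per game from the averages above:

```python
# Percentage uplift per game, computed from the average FPS figures in the table.
benchmarks = {
    "Gears of War 4": (67.2, 109.1),
    "Rise of the Tomb Raider": (71.2, 113.1),
    "GTA V": (62.7, 73.7),
    "PUBG": (41.6, 68.9),
}

for game, (gtx1070_fps, gtx1080ti_fps) in benchmarks.items():
    uplift_pct = (gtx1080ti_fps / gtx1070_fps - 1) * 100
    print(f"{game}: {gtx1070_fps} -> {gtx1080ti_fps} FPS avg (+{uplift_pct:.0f}%)")
```

That works out to roughly a 60 to 66 percent uplift in three of the four titles, and around 18 percent in GTA V.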

When it comes to synthetic benchmarks, I ran both cards through 3DMark's Time Spy for DX12 and Fire Strike Ultra.

The numbers speak for themselves: both in the games and in the synthetic benchmarks I'm seeing a huge increase in performance. It varies as to just how much, but it's never a case of being "just a bit better."

It's a lot better in most cases. The smallest performance increase comes in GTA V, which seems to be fairly CPU intensive anyway.

PlayerUnknown's Battlegrounds offered the biggest test so far at maxed-out settings, but the optimization on that game is still way off. I play it, as do more than 10 million other people, so I was curious to see how much more I could get out of the GTX 1080 Ti.

Gears of War 4 performance figures on the GTX 1080 Ti.

Bottom line

Whether it was a necessary upgrade isn't the question, because I already knew it wasn't, especially not when this particular card costs around $800. The GTX 1070 has served me well and never really left me wanting for more. The GTX 1080 Ti, though, is in a different ballpark. Part of the reason to upgrade was in preparation for a high-end VR future, but even in gaming you're getting insane performance from one of these cards without it even breaking a sweat.

If you do make the upgrade, you'll know where the money went.

See GTX 1080 Ti at Amazon

Richard Devine is an Editor at Windows Central. A former Project Manager and long-term tech addict, he joined Mobile Nations in 2011 and has been found on Android Central and iMore as well as Windows Central. Currently you'll find him covering all manner of PC hardware and gaming, and you can follow him on Twitter and Instagram.

25 Comments
  • Hell, I went from a GTX 1080 to a 1080 Ti and the difference was worth it to me.
  • It doesn't make any sense to get a card in preparation for a VR future. Let the VR future (if there is such a thing) arrive in 2-6 years and THEN buy a GPU that will smoke the 1080Ti. Preparation purchases are really, really bad decisions. Reactionary purchases, aimed at addressing a well understood need, are better because you can then buy exactly what you need versus an unknown. For all you know the 1080Ti will lack the juice for the second gen VR sets, or maybe there will be no such thing because VR games suck.
  • Not really. VR is already here. By the future I mean the stuff I'm going to buy probably in the next 12 months. Project Cars 2 in VR...oh yes. And you know, why get a 1080 when a 1080 Ti is here.
  • Exactly, people that can't afford something will say anything to justify not buying it lol. I still have a 1070, but if Fallout 4 VR struggles with it I will snag a Ti myself. ^^
  • 100% agree. VR is quickly becoming the next "3D TV". I have friends that spent gobs of money on this stuff and now it's collecting dust. The games are simplistic and once you get over the VR part of it they are just lame. IF....big IF and when VR takes off, the 1080ti will be like a 3GB 1060 and you will need to upgrade...IF VR ever turns out to be anything. Looking at those benchmarks, any REAL noticeable benefit, as in while playing those games, MIGHT be in PUBG. I say might because PUBG on my Zotac GTX 1080 and my ASUS 1080p 144Hz monitor is laggy and never breaks 80 FPS. The game is horribly optimized.
  • 1.  I'll say the same thing that I said to the people who derided 3d (TV and gaming) into the ground: please wear a patch over one eye whilst gaming so you have the authentic 2d experience.  Actually, keep it on everywhere, as the idea of depth perception obviously is not for you.   2.  A mere 760 runs my racing stuff in either 3d x 1 monitor or 2d x triples at 120fps locked (1 BenQ and 2 Acer monitors; they're not EDID matched so 3d Vision Surround is not running yet).  If you've found a game so badly written that it never breaks 80fps on a 1080p monitor with a GTX 1080 card, find a different game...
  • Don't fancy donating that 1070 to me now you aren't using it do you?
  • Anybody tested this with the Razer Core? I have the Razer Blade Stealth (Late 2016) with the Core and a 1060 6GB. Wonder if it would be worth it going from a 1060 to a 1080 or 1080Ti, even with the performance loss of using the Core.
  • I'm sure Linus Tech Tips does a 1080Ti performance review using the Core.
  • It all depends on the resolution you're pushing. For a 1080p monitor I am going to say NO, it is not worth it, as most games can get 60+ FPS with a 1060 6GB at 1080p. If you are doing 1440p and want high-ultra settings I would go with a cheaper GTX 1080. For a 4K monitor with high-ultra, get the fastest card you can buy and it may not be enough for some games.
  • Our own Dan Rubino runs a 1080 Ti in his Razer Core.
  • The Razer Core is limited by the Thunderbolt 3 bandwidth, which is about equal to a PCI Express x4 slot - which just isn't enough to push a high end GPU. As such, anything above a GTX 1060 starts hitting pretty severe bottlenecks; various reviews have shown that even comparing a 1070 to a 1060 offers very little gain. It gets even worse once you start plugging things into the USB hub on the back of the Core, especially things like external drives. Basically, until Thunderbolt 4 comes out and gives you a PCIe x8 slot's worth of bandwidth, it's only suitable for a midrange gaming PC.
  • They should have implemented the two Thunderbolt 3 ports needed for this, especially knowing the bottleneck for anything above a 1060.
    I would happily have had one more cable to get the full capability (and two Thunderbolt 3 ports on notebooks would be much better too).
    Would have sold the Blade for me as a nice desktop/portable solution.
  • "reason to upgrade was in preparation for a high-end VR future" Since that future isn't here yet, it would be wiser to wait for the next version of Nvidia cards due out sometime in Q1 2018;  
  • and wait for Q2/Q3 for them to actually be available and at a decent price.
  • It is already here, and by future he probably means Microsoft MR, which is launching next month. >_>
  • In tech if you wait, you always wait. Things have resale value. My 1070 pays for about half the cost of my 1080 Ti. Which isn't bad considering it came with the PC. New card comes out that I want, sell the 1080 Ti. And so the circle continues.
  • LOL wow. Still rocking an RX 470 and it does great for me. Just cannot justify the cash for a 1080/1080 Ti. I just don't think it will increase my enjoyment of gaming when I already get 60+ FPS. And 4K gaming... I feel like it's a waste. 1080p is fine. 1440p is starting to stretch it. Almost nothing will really take advantage of all those 4K pixels when the textures are constantly being scaled, so they will never be razor sharp enough to take serious advantage of all that 4K resolution. A still photo at 1:1 pixels will be fabulous at 4K, but games with moving action, scaled textures and purposely added motion blurring effects? You're just not gonna benefit from the extra resolution.
  • That's exactly the same BS people used to say when the switch from SD to HD was taking place.
  • No, there is a definite increase from SD to HD TV. But from REASONABLE viewing distances, the diff between 720p and 1080p is marginal. And from 1080p to 4K it's even slimmer. It's the law of diminishing returns. We've passed what is truly recognizable by the human eye in a NORMAL use case scenario. Sure, you can put your face right up to the screen and say you see a diff, but that is just not a normal use case.
  • I game on a 1440p 32-inch monitor currently. No way in hell I'd play on that sized screen at 1080p.
  • Well honestly, 1440p at 32 inches at desktop monitor distance is a good use case scenario.  4K on a 60" TV sitting 10ft away... Well... Let people waste their money. The HDTV talk was in response to a poster above.
  • You don't need to be running 4K to take advantage of this card - and in fact you can get more out of one if you don't. I have an Acer x35 (2560x1080 @ 200Hz), and am overwhelmingly happy with the 1080 Ti I forked out for - I am pulling over 160fps in most games with every feature turned up, but there are some games that even with my setup I can't run full bore (Ghost Recon Wildlands and Deus Ex MD both have to be tuned down to get 60fps). 4K is still early days, and while it's spectacular to look at, the fact that 100Hz+ gaming monitors are just now starting to show up and HDR is still far from mature should let people know that just because you can do something, doesn't mean it's wise to adopt so early on. I justify paying through the nose for my 1080 Ti with the same reasoning I did with the 780 Ti it replaced: it will last me 3 years, and I won't have to start turning off many features in games before the 2nd year is up. It's a big investment, but to me being able to play the games as they were meant to be seen by the devs over that long a period makes it worthwhile.
  • oops double
  • Very nice Richard.