
We PC gamers crave the latest and greatest hardware, and there's a definite cachet to having the coolest, fastest rig on the block. But in today's PC gaming market, there's really no reason to shell out for the most screamingly fast (and expensive) graphics card you can fit into your case.

I can already hear a million mechanical keyboards blasting out replies in the comments, but let me explain myself. Don't get me wrong: if a $700 graphics card is what you really, truly want, and you have the disposable income to afford it, go for it. But after you read the following points, you might just decide that a cheaper card, and the money you'd save, is worth considering.

Video Game Graphics Are Super Efficient Now


Pretend, for a moment, that you're a game developer. You want as many people as possible to play your game, because that means that as many people as possible will buy your game. Typically that means developing it for multiple consoles (easy, mostly-static hardware targets with millions of users) and a large portion of the PC gaming market.

But that last audience is much more nebulous. Among just the five major components of a "gaming PC" (CPU, GPU, motherboard, hard drive or SSD, and monitor), there are millions of possible permutations. It's more or less impossible to tune a game for any single one of them, so developers aim for broad performance targets instead. And those targets generally include lots of medium- and low-end graphics cards, because video game publishers and developers like to sell video games to as many people as possible.
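If you like seeing that spelled out, here's a quick back-of-the-envelope sketch in Python. The per-component counts are pure illustration, not real market figures:

```python
# Illustrative only: even modest per-component choices multiply fast.
# These counts are invented for the example, not real market data.
options = {"CPU": 50, "GPU": 40, "motherboard": 60, "storage": 30, "monitor": 40}

combos = 1
for count in options.values():
    combos *= count

print(f"{combos:,} possible 'gaming PC' builds")  # 144,000,000
```

Even with those conservative numbers, you're well past a hundred million configurations, which is why "broad performance targets" is the only sane strategy.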

Graphics engines, the software bread in your video game's entertainment sandwich, are now tuned and updated to run extremely well on low-end hardware. Popular titles like Overwatch, Rocket League, and Dota 2 are designed to run on modest systems. And sometimes even the big, bombastic, graphics-heavy games can run ridiculously well at the low end: here's the 2016 version of DOOM, an extremely fast and visually complex shooter, running at 30 frames per second at low settings and resolution on a Surface Pro 2. That's on an integrated Intel HD 4400, a tiny, low-power laptop graphics processor!

Not convinced? Fair enough: if you're reading this article, you're probably thinking about a GPU upgrade for your full-sized desktop PC. Here's the same game running on the NVIDIA GT 730, a barely-there graphics card that sells for about $60.

Now let's assume you have a slightly more flexible budget, one that stretches to $150. That's a bit beyond most people's impulse-purchase range, but still well below the cost of a game console or even a mid-range GPU. This video shows the Radeon RX 560 (about $130) in a mid-budget setup. You'll miss some of the top-of-the-line special effects you crave, but you're getting 1080p resolution at over 60 frames per second, which is all most monitors can display, for less than the cost of Destiny and its DLC.

Which brings me to my second point.

Gaming GPUs Are Now Outperforming Most Monitors

Unless you've custom-built your PC and selected your monitor specifically for high-frame-rate gaming, any mid-tier graphics card is going to exceed the capabilities of your monitor. What do I mean by that? It's all about the refresh rate.

The refresh rate of an LCD panel refers to how many times it updates its image every second. 60 hertz is pretty standard: 60 images displayed per second, double the frame rate of standard US television and two and a half times that of film. That standard is why gamers are so focused on hitting 60 frames per second in their games. If your game renders more than 60 FPS on a 60 Hz monitor, the extra frames simply never reach your eyes; that's not because you can't perceive more than 60 FPS, but because the monitor is physically incapable of displaying them. Animation quality only visibly suffers when the frame rate drops below the refresh rate.
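If you want the arithmetic behind that, here's a tiny sketch. The render rates are hypothetical examples; the one real rule is that a panel can't show more frames than its refresh rate:

```python
# A panel can only display as many frames per second as its refresh rate allows.
def visible_fps(gpu_fps: float, refresh_hz: float) -> float:
    return min(gpu_fps, refresh_hz)

for gpu_fps in (45, 60, 90, 144):        # hypothetical render rates
    frame_time_ms = 1000 / gpu_fps       # time the GPU spends on each frame
    shown = visible_fps(gpu_fps, refresh_hz=60)
    print(f"GPU renders {gpu_fps} FPS ({frame_time_ms:.1f} ms/frame) -> "
          f"a 60 Hz panel displays {shown:.0f} FPS")
```

Everything above 60 FPS on a 60 Hz panel is wasted work: the 90 and 144 FPS rows all display as the same 60 FPS.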

Monitors have gotten really inexpensive in the last few years thanks to economies of scale, but even so, most people buy something no bigger than 24 inches diagonally with a resolution of 1080p. So even a visually intense game like DOOM can sit pretty on a sub-$200 GPU, pushing out more frames than your monitor is capable of displaying.

Of course there are exceptions to this. If you've bought a 4K or ultrawide monitor with resolution that's far in excess of 1080p, you will need something beefier. Ditto for expensive gamer-branded monitors that deliver refresh rates up to 144 hertz---60 FPS games will still look great on those models, but they'll look even better if you have the juice to bump it up to the monitor's maximum output. But if you have a more ordinary or older monitor and you aren't planning on updating it any time soon, you really don't need to spend more money on your graphics card.

Expensive GPUs Have Serious Diminishing Returns

The value proposition for graphics cards tends to shift with the available technology and the consumer market. But there's always a point past which spending more money buys you proportionally less performance, because the super-expensive GPUs are built for enthusiasts who are willing to pay a premium for the top of the line.

Hence the concept of "the sweet spot": the price range where you get the most performance per dollar, beyond which each extra dollar buys you less and less additional power. Depending on the year, the sweet spot generally sits somewhere between $200 and $400. Let me give you an example.

Here's a GPU benchmark spread from TechSpot.com, testing Destiny 2 on 30 different cards, from the cheapest on the market to the most expensive, all in otherwise identical PC setups. Note the plateau around the middle of the graph: between the GTX 1060, a one-year-old $250 card, and the 4GB version of the R9 Fury, a card that cost $550 when it launched just a year before that, the difference is only 10 frames per second. And again, both of them are well above that target of 60 frames per second, even at the higher 1440p resolution.

Is a $300 price difference worth 10 extra frames per second? Not unless you're mining cryptocurrency in the background. Performance does keep climbing at the high end, but the next big jump doesn't come until you're in the $600 range, and for that much, you could buy a $300 card plus a second major upgrade for your PC, like a roomy SSD or a bigger, fancier monitor. Also, consider that if your computer is more than three or four years old, components like your processor and RAM might be bottlenecking those expensive cards anyway.
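Here's that value math spelled out in a quick sketch. The prices are the ones quoted above, but the absolute FPS numbers are stand-ins I picked to reflect the roughly 10 FPS gap, not TechSpot's exact results:

```python
# Rough value-for-money math on the two cards discussed above.
# Prices are from the article; the FPS numbers are illustrative stand-ins.
cards = {
    "GTX 1060":    {"price": 250, "fps": 70},
    "R9 Fury 4GB": {"price": 550, "fps": 80},  # ~10 FPS more, per the benchmark
}

for name, card in cards.items():
    print(f"{name}: ${card['price'] / card['fps']:.2f} per frame per second")

extra_cost = cards["R9 Fury 4GB"]["price"] - cards["GTX 1060"]["price"]
extra_fps = cards["R9 Fury 4GB"]["fps"] - cards["GTX 1060"]["fps"]
print(f"Marginal cost: ${extra_cost / extra_fps:.0f} per additional FPS")
```

With those example numbers, the cheaper card works out to about $3.57 per frame per second versus $6.88 for the expensive one, and each of those 10 extra frames costs you $30.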

Again, the point here isn't to discourage you from spending a ton of money on your GPU if you really want to. It's just to point out that there are probably more effective and efficient ways to put that money to use if you're on a budget and want your games to shine.

There's Always a Bigger Fish

With apologies for the Phantom Menace reference, keep in mind that once you spend a car payment's worth of cash on a new GPU, it won't be new for long. (As a matter of fact, that's pretty good advice if you're buying a car, too.) Somewhere between one and six months from when you bought it, NVIDIA or AMD will announce a newer, more powerful, more objectively desirable graphics card that will either cost the same and do more or cost less and do around the same.

Human nature makes us want the best stuff, and technology companies love exploiting that desire. But that only works if they're releasing new and improved models all the time; it's why Apple releases a new iPhone at almost exactly one-year intervals. And that's fine! New, cool tech is great, and as a general tech nerd, I enjoy it more than most people. But it's important to remember that a better, shinier graphics card or phone or car doesn't make the stuff that came before it any less effective. Or shiny.

The point here is: if you're paying a premium for a graphics card because you want that self-satisfied feeling of having the best one on the market, or even the best one in its price range, that feeling will go away sooner than you think. The card that replaces it won't be that much better than yours, and yours isn't that much better than the one you could have bought if you were being more frugal. That feeling might end up costing you a few hundred bucks...before it quickly disappears.

Who Really Does Need a GTX 1080 Ti?

I'm glad you asked, because I didn't want to end this article on a downer note. To reiterate my points above, the people who will see a distinct benefit from a 1080 Ti or similar card are:

  • Gamers with a high-resolution, high-refresh-rate monitor who have the hardware to drive it
  • Gamers who have enough disposable income that a $600 to $1,000 card won't affect their budget
  • Gamers who are passionate about having the objectively best hardware, all the time, and adjust their budget accordingly

That's about it. And if that sounds dismissive or reductive, it isn't meant to be. This is a good thing! The PC gaming hardware market is at a great point where you have choices for fantastic performance at every budget level. Just keep all this in mind when you're choosing between a mid-range card that probably meets all your needs and a high-end graphics card that comes with a month of microwave ramen noodles.