Ever since 3dfx launched the original Voodoo, how a PC performs in games has been tied, in virtually every case, to its graphics card. But choosing a video card isn’t all that easy. The purpose of this guide isn’t to answer whether you should buy any specific product from AMD or Nvidia, but to create a framework you can use to determine how much GPU performance you need, and how much you should spend to get it.
We’re going to assume that if you’re gamer enough to know you need a new graphics card, you’re gamer enough to at least have a small preference for a particular GPU manufacturer.
Determine your goals and budget
The first thing you need to determine before buying a card is what you want that card to do for you. Do you prefer turn-based games like Civilization, which tend to be easy on the GPU, or do you play cutting-edge first-person shooters? Are you happy with an existing 1080p display with no plans to upgrade, or do you want a GPU that can push emerging technologies like 4K and VR?
Here’s a good rule of thumb: Think about whatever system or GPU you currently use for gaming and figure out how long you’ve owned it, what you paid for it (or at least the graphics card), and whether you’re still happy with what you’ve got. AMD and Nvidia tend to offer steady progression across product families over time, at least at the midrange level and above. If you typically buy a new GPU every three years for $250, you’ll probably be happy targeting that price point if nothing else about your use case has changed.
If you’re curious about emerging technologies like VR and 4K, or whatever comes next, here’s another general rule: Any technology still shiny enough to be considered new will typically require top-end horsepower. The PC industry has launched multiple high-end initiatives in the last 10 years, from PhysX and 3D gaming to multi-monitor gaming, 4K, and now, VR. The one thing all of these technologies had in common is that each required more firepower than a typical GPU could deliver at launch.
Once you’ve got an idea what it is you want your PC to do, you can start researching whether or not your budget matches those capabilities. Right now, $200 GPUs can deliver solid 1080p gaming and $300 GPUs are suitable for 1440p, but this depends on the game and your own desired detail levels. If you’re happy playing everything on Medium, then $200 might be enough for 1440p gaming. If you want to maximize eye candy, $300 is probably a better target for that capability.
Specs, capabilities, and VRAM
Once you’ve got a budget in mind, you’re ready to look at some potential cards. A quick peek at NewEgg illustrates why plenty of people find this overwhelming — NewEgg lists 140 graphics cards for sale between $200 and $300 alone. Even if you limit yourself to only the GPUs that NewEgg sells and ships itself, you’ve got 41 options to choose from.
Here’s the good news: While this might seem like an overwhelming number of options, there are ways to simplify the process. First, be aware that AMD and Nvidia use completely different numbering systems with no relation to each other. You can safely assume that within each family of products, higher numbers generally denote higher performance. For AMD, the R9 390X is faster than the R9 380X. Nvidia’s GTX 980 Ti is faster than the GTX 980. Having a budget in mind will also help you narrow your criteria.
Even within the same video card family, however, there’s often a wide variety of product SKUs available. Right now, the cheapest brand-new GeForce GTX 960 on sale at NewEgg is a $169.99 card from Zotac. The most expensive GeForce GTX 960 is a $219 GPU from EVGA. The first step to determining which GPU is the better deal is to compare their base stats:
The Zotac card has 2GB of RAM and a boost clock of 1240MHz, while the EVGA has 4GB of RAM, a boost clock of 1342MHz, and a free copy of Rise of the Tomb Raider. Whether the EVGA card is worth the price premium compared with the Zotac comes down to how much value you assign to the included game bundle and the additional RAM. The EVGA GPU is a modest 8% faster by clock speed with the same memory interface and RAM speeds — probably not enough to give it a big kick all on its own.
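To make that comparison concrete, here’s a minimal sketch (using the prices and specs quoted above) of how you might quantify the premium yourself; the dictionaries are just a convenient way to hold the listed figures:

```python
# Compare two GPU SKUs on boost clock and price (figures from the listings above).
zotac = {"price": 169.99, "boost_mhz": 1240, "vram_gb": 2}
evga = {"price": 219.00, "boost_mhz": 1342, "vram_gb": 4}

clock_uplift = (evga["boost_mhz"] / zotac["boost_mhz"] - 1) * 100
price_premium = (evga["price"] / zotac["price"] - 1) * 100

print(f"Clock uplift: {clock_uplift:.0f}%")    # ~8% faster by boost clock
print(f"Price premium: {price_premium:.0f}%")  # ~29% more expensive
```

The gap between those two numbers is the point: the clock bump alone covers only a fraction of the price premium, so the rest has to be justified by the extra 2GB of VRAM and the game bundle.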
Speaking of VRAM, now is a good time to talk about it. Manufacturers often offer different amounts of VRAM as a way to distinguish between products and to squeeze out a little extra profit. In other words: Buying more VRAM often isn’t worth it. The flip side, however, is that buying more VRAM can improve a GPU’s performance in scenarios where games are hitting a memory bottleneck (as we may be seeing in Quantum Break right now). Below $150, there’s almost certainly no point in paying more for additional VRAM — low-end GPUs simply aren’t powerful enough to make use of the extra memory. Above $150, there may be a case to be made for additional RAM, and a quick perusal of NewEgg shows us that Gigabyte sells a 4GB version of the GTX 960 for just $185 — only $15 more than the 2GB Zotac card and probably worth the extra cost.
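If you want to sanity-check a VRAM upgrade the same way, a quick cost-per-extra-gigabyte calculation works; this sketch uses the NewEgg prices quoted above:

```python
# Rough cost of the extra VRAM (prices from the NewEgg listings above).
zotac_2gb_price = 169.99
gigabyte_4gb_price = 185.00

extra_cost = gigabyte_4gb_price - zotac_2gb_price  # dollars for the upgrade
extra_vram_gb = 4 - 2                              # additional gigabytes
cost_per_gb = extra_cost / extra_vram_gb

print(f"${extra_cost:.2f} for {extra_vram_gb}GB more, "
      f"about ${cost_per_gb:.2f} per extra GB")
```

At roughly $7.50 per additional gigabyte, the 4GB Gigabyte card is a far cheaper way to get the extra memory than stepping up to the $219 EVGA model.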
It’s worth taking a little time to look at which GPUs are available at which price points, because manufacturers’ product stacks sometimes overlap. There’s currently some overlap, for example, between AMD’s R9 380 and the R9 380X — and between the two, the R9 380X is the faster card with significantly more RAM.
We’ll say one more thing about VRAM specifically, since we seem to be seeing a shift on that front. If you’re investing more than $200 in a new GPU, we’d definitely pick up a 4GB card if possible. We expect next-gen cards from AMD and Nvidia will both standardize on that amount of memory.
Resources and research
The last major piece of the puzzle involves the technology cycle itself. As we’ve covered elsewhere, it’s actually a good idea to wait a little bit longer before upgrading your GPU (if you can), because both AMD and Nvidia have major refresh cycles coming up. The transition to 14/16nm FinFET GPUs is expected to offer a significant improvement over current cards. So how do you make certain you’re buying on the right side of a product introduction if you don’t follow these trends on a regular basis?
The simplest method is to check Wikipedia’s page for AMD and Nvidia GPUs and check the dates on when the last major releases occurred. For Nvidia, the GTX 970 and 980 are now over 18 months old, while even the GTX 980 Ti and Titan X are coming up on a year. AMD’s R9 300 family is only slightly younger. There are nuances that this high-level check won’t address (AMD’s R9 family is based on GPUs that are actually several years old), but again, we’re concerned with the big picture and broad overview here. Tech website coverage can be used to fill in the gaps as far as when launches are likely — major GPU events are well-covered across the industry and neither Nvidia nor AMD tends to launch products out of the blue.
If you want to know how one specific GPU may compare against another, we highly recommend Anandtech’s Bench. It doesn’t contain every GPU or every game, but it has a strong selection of both and can be used to compare graphics cards across generations, giving you at least an idea of what kind of performance increase you might get from an upgrade.
Remember: Early adopters always pay a premium for the privilege. If you bought a GTX 780 Ti two years ago and jumped on the 4K gaming trend right as it was getting started, you’re probably already shopping for a faster, higher-end video card. Midrange cards aren’t the sexiest or fastest, but they tend to deliver the best bang for their buck.
Let us know if there are topics you think need to be unpacked or if things are unclear. As a long-term guide, this is an article that’ll likely evolve over time.