Even if you’re not building an entire new PC from scratch, upgrading your computer’s GPU (graphics processing unit) can be a daunting task. There are a ton of options to choose from, and it’s hard to know at a glance which ones are better, or which are most suitable for your particular needs.
In this guide, I’ll briefly explain the differences between Nvidia and AMD GPUs, to help you determine which GPU is right for you.
The great GPU race
Both Nvidia and AMD (Advanced Micro Devices) are GPU manufacturers based out of Santa Clara, California. AMD is a noticeably older company, having been founded way back in 1969 (as opposed to Nvidia’s 1993 founding). Along with its Radeon brand GPUs, AMD also designs and manufactures CPUs (central processing units) under its Ryzen label. Nvidia, meanwhile, focuses strictly on GPUs, most notably under its GeForce series. Several third-party companies help in manufacturing Nvidia’s products, but if it’s a GeForce GPU, it’s Nvidia.
What makes them different?
There are minor differences that separate the design architecture of Nvidia and AMD GPUs. AMD cards, for example, often have more processing cores than Nvidia cards. Nvidia’s GTX 1080 (a notably high-end card) has 2,560 cores, while its AMD competition, the RX Vega 64, has 4,096. However, having more processing cores doesn’t automatically make one GPU better than another, though it does typically mean the card can handle more work in parallel.
Nvidia’s cards may have fewer cores, but they also have noticeably faster clock speeds and better power efficiency than AMD’s lineup. Given those efficiency advantages and its more widespread name recognition, it’s little surprise that Nvidia’s GPUs tend to be more expensive than AMD’s cards.
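The cores-versus-clocks tradeoff above can be sketched with some quick arithmetic. A GPU’s theoretical peak throughput scales with both core count and clock speed, which is why the Vega 64’s extra cores give it the bigger number on paper even though the GTX 1080 clocks higher. The clock values in this sketch are approximate boost clocks, used purely for illustration:

```python
# Back-of-the-envelope comparison of theoretical FP32 throughput.
# Peak TFLOPS ~= cores * clock (GHz) * 2, since each core can do one
# fused multiply-add (two floating-point operations) per cycle.
# The clock speeds below are rough boost-clock estimates, not official specs.

def peak_tflops(cores: int, clock_ghz: float) -> float:
    """Theoretical peak FP32 throughput in TFLOPS."""
    return cores * clock_ghz * 2 / 1000

gtx_1080 = peak_tflops(2560, 1.73)    # roughly 8.9 TFLOPS
rx_vega_64 = peak_tflops(4096, 1.55)  # roughly 12.7 TFLOPS

print(f"GTX 1080:   {gtx_1080:.1f} TFLOPS")
print(f"RX Vega 64: {rx_vega_64:.1f} TFLOPS")
```

Note that this is only a paper spec: real-world gaming performance also depends on memory bandwidth, drivers, and how well games use the architecture, which is why the GTX 1080 often kept pace with the Vega 64 in actual games despite the lower theoretical number.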
So which GPU is best for you? Well, that depends on the sort of computing you want to do. Nvidia’s cards are generally more powerful at different tiers, but it doesn’t always make sense to pay the extra money if you don’t have to.
If you want to play the newest PC games at the highest possible graphics settings, then it makes sense to pay more for cards like Nvidia’s GTX 1080 Ti ($600–$700 on average) or even the newer RTX 2080 ($790–$870 on average).
If, however, your graphics needs are more modest, AMD might be a better option. The same amount of money you’d spend on certain Nvidia GPUs could be better spent on equivalent AMD cards if super high-end graphics aren’t a big priority for you.
For example, Nvidia’s GTX 1050 Ti is a decent mid-range GPU which costs, on average, about $160. However, for that same price you could instead purchase AMD’s RX 570, a card which surpasses the 1050 Ti in most performance metrics. In short, Nvidia may dominate the high-end GPU market, but AMD has the edge when it comes to mid-range and budget gaming.
If you’re still not sure which GPU is best for you, thinking about the sorts of games you want to play (bigger AAA titles vs. smaller indie games) is a great starting point. Websites such as PC Part Picker and Logical Increments can also help you narrow down your choices based on budget and specific computing needs.
The future of the GPU market
The GPU market is continuously moving forward, but some stable trends have dominated the scene for years. Nvidia will continue to reign supreme as the premium option while AMD will keep fighting the good fight as the scrappy mid-range underdog. However, that all might change in just a few years when a third company, Intel, launches its own line of dedicated GPUs in 2020.
Currently, Intel designs and manufactures the integrated graphics chipsets found in many laptops, but its upcoming GPU products will be designed specifically for desktop computers. Intel has long been the third player in graphics, but until now its presence has been mostly confined to the laptop realm. Come 2020, however, AMD and Nvidia might have some fresh new competition to deal with.
It’s too early to tell how much Intel’s move into the GPU market will shake up the status quo (if at all), but it should at least benefit GPU customers by pressuring Nvidia and AMD to keep upping their game.