Why aren’t people buying GPUs? Nvidia has the answers to its own problem
Nvidia’s CEO just took a hefty pay cut
Did you buy an RTX 4090? If you clicked on this article, you’re statistically more likely than most to own the best graphics card in the world right now - but that chance is still pretty darn low.
It’s been a tough year for big tech so far, with Intel reporting record losses and laptop sales down across the board. Things look bad; in fact, a recent report found that more than three-quarters of tech industry employees in the UK are unhappy with their jobs.
Nvidia is no exception, it seems: people just aren’t buying GPUs like they used to, a report from The Register notes. CEO Jensen Huang has taken a hefty $2.5 million pay cut as a result of the company falling short of its targets (though his compensation package still adds up to more than $21 million, so don’t worry about him too much).
The graphics card giant has just released its staggeringly comprehensive 2023 annual review, in which it noted that sales had been impacted by ‘economic headwinds, geopolitical tension, and a product supply chain that swung from severe shortage to excess’.
It’s no surprise that people are spending less on tech - especially high-end GPUs - in the current economic climate, but I think there’s more to unpack here. We’re well into the life cycle of Nvidia’s next-gen RTX 4000 cards, and their performance is impressive; we should be seeing Nvidia thriving, not struggling. So what’s the issue?
An unnecessary escalation
Now, I’m not saying I expected to see the same level of GPU fervor we witnessed with the RTX 3000 series. That was, after all, a GPU generation that arrived during the height of the cryptocurrency craze; a dark time when it was borderline impossible to buy a graphics card for a reasonable price thanks to crypto miners and scalpers sucking up all the available stock.
But even in the immediate wake of the crypto crash, GPU sales remained strong for a while - gamers were excited to snap up a powerful graphics card that wouldn’t cost them an exorbitant amount, and gaming laptop sales were still solid. So what went wrong?
Let’s take a mosey on over to the Steam Hardware Survey to see what pearls of wisdom it might impart. Well, not only am I not seeing a single RTX 4000 GPU at the top of the leaderboard (the flagship RTX 4090 is all the way down at #48), but the most popular picks tell a fascinating story.
In the number one spot is the GTX 1650 - a four-year-old GPU barely capable of running the latest triple-A games at 1080p! Going down the list, it’s much the same story: the first card I’d be willing to play at a higher resolution on is the RTX 3060 Ti, in the #7 spot. The fact is that most PC gamers are still playing at lower resolutions - the need for 4K isn’t what Nvidia seems to think it is.
Yet Nvidia is charging ahead with more and more powerful GPUs, claiming that Moore’s Law is dead and that we can all look forward to spending increasing amounts of cash on new graphics cards. Even the cheapest next-gen card available as I write this, the RTX 4070, still costs $599 - a price out of reach for most gamers.
Is Nvidia its own worst enemy?
In short, I think Nvidia is aggressively forging ahead with more powerful hardware at a time when most gamers would rather see cheaper hardware. That, combined with the factors beyond Nvidia’s control noted in the annual review, is why GPU sales are falling.
It’s also worth pointing out, though it might be blindingly obvious to some, that anyone with both the funds and the intention to buy a high-end GPU will already have one. The RTX 4090 was very hard to find (especially at retail price) back when it first launched; later cards in the same generation didn’t see the same rush of sales.
I don’t think the upcoming RTX 4060 Ti will do enough to change my mind on this, either - but that doesn’t mean Nvidia should be worried. In fact, this whole situation isn’t even a problem Nvidia needs to solve. Despite its recent losses, Team Green is uniquely positioned to make a whole lot of money selling the most powerful GPUs it can possibly produce over the next few years. Why is that? AI, baby!
Nvidia has already said that it’s going all-in on AI technology, and it’s increasingly easy to understand why. Training machine learning models like the phenomenally popular ChatGPT requires an absolutely ludicrous amount of processing power, and Nvidia might’ve stumbled into the perfect solution: tensor cores.
I’m feeling a little tensor
What’s a tensor core? Well, it’s a specialized GPU core that Nvidia created to help power its DLSS upscaling technology - that’s Deep Learning Super Sampling. The clue’s in the name: tensor cores power AI-assisted resolution upscaling for your games, and Nvidia has been making them for three generations of consumer graphics cards now.
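For a rough sense of why the same silicon serves both games and chatbots, here’s a minimal, illustrative sketch - not anything from Nvidia or this article - using PyTorch as an assumed example framework to run the kind of half-precision matrix math that tensor cores are built to accelerate:

```python
# Illustrative only: on an RTX-class GPU, half-precision (FP16) matrix
# multiplies like this are dispatched to tensor cores rather than the
# ordinary CUDA cores - and this is the same basic workload behind both
# DLSS-style upscaling and training large AI models.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32  # FP16 path needs the GPU

a = torch.randn(4096, 4096, device=device, dtype=dtype)
b = torch.randn(4096, 4096, device=device, dtype=dtype)

c = a @ b  # one big matrix multiply - the bread and butter of tensor cores
print(c.shape)
```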
This hardware wasn’t built with training chatbots in mind, but it’s AI-oriented tech - and with the current explosion in AI popularity, Nvidia is taking things very seriously. There’s already a lot of money in AI development, and that’s only going to expand; unlike cash-strapped PC gamers, corporations like OpenAI have comparatively fat wallets that Nvidia is no doubt hoping to dip into.
Of course, that’s bad news for gamers, at least in the short term. It’s possible that Nvidia might decide to get out of the consumer graphics market entirely, focusing on making crazy-powerful enterprise GPUs for training the AI models of the future.
But if that shocking twist does come to pass, it could actually be good news in the long run. For starters, Nvidia has been the big dog of the GPU space for years now - letting someone else take a shot at the PC gaming market could lead to some interesting developments. A few years back I’d be concerned about a lack of competition for AMD’s Radeon graphics line, but Intel has thrown its hat into the ring with Intel Arc GPUs and even Apple is making serious strides with its M-series silicon.
But more importantly, whatever technology Nvidia develops and perfects will bleed through to the consumer space: even if it’s not coming from gamers, the potential profits from selling hardware to AI firms will mean more investment from Nvidia in the GPU industry as a whole, and that’s a very good thing.
Give Nvidia all the money, I say. Team Green has given us the most powerful consumer GPU in the world on more than one occasion, and while those graphics cards aren’t going to get more affordable, the ones that come after will be. New consumer GPUs for gamers are good; improving GPU technology as a whole is great. AI has the potential to revolutionize the gaming graphics industry - we might just need to be a bit patient first.