The post-GeForce era: What if Nvidia abandons PC gaming?

Imagine it’s the year 2030 and Nvidia has just announced its newest RTX 7000-series graphics cards. But the cheapest of the cards is priced over $2,000 and the top model is nearly double that. The series offers minimal uplift in rendering performance, but the cards are incredibly good at accelerated upscaling and frame generation. Plus, memory bandwidth nearly doubles over the last-gen models. Let’s continue the hypothetical: Nvidia’s new xx60-series cards aren’t expected for months while Nvidia stockpiles enough defective GPU dies to bin for them. But don’t worry if you can’t afford these new cards or don’t want to wait. Why? Because GeForce Now offers the full upgrade right now for an “affordable” monthly fee, especially with an annual sub locked in.

I wrote the above as a nightmare scenario, but it’s odd how close it sounds to the launch of the RTX 50-series. It’s a history that seems likely to repeat and accelerate as Nvidia’s gaming division becomes an ever-more-minor side hustle to its AI initiatives. Nvidia could effectively give up on gaming in the near future, and that might be the most financially sensible thing to do if the AI bubble doesn’t burst. But what would happen if Nvidia did?

Just follow the money

The numbers behind my pessimistic prognosis paint a stark picture. Nvidia’s Q3 2025 revenue topped $57 billion. Guess how much of that money came from data centers? A whopping $51.2 billion. That’s just shy of 90% of its total revenue and represents a 25% increase over the previous quarter and a 66% increase year on year.

How much revenue do you think Nvidia pulled in from gaming? A measly $4.3 billion by comparison. That’s down 1% on the previous quarter, despite Nvidia having the most powerful graphics cards available and with stock and prices far more favorable than they were earlier in the year. Gaming revenue is still up 30% on last year, but the difference in potential between data centers and gaming is staggering.
Indeed, gaming makes up less than 8% of Nvidia’s total revenue as of now, and although the overall income from gaming continues to increase, it’s minuscule in comparison to its data center take. Bullfincher highlights how quickly that’s changed, too: just a few years ago, gaming represented over 33% of Nvidia’s total revenue. Where do you think it’s going to be in another five years? Assuming the AI bubble doesn’t pop as catastrophically as it could, gaming is going to become a tiny footnote on Nvidia’s balance sheet. Will Jensen Huang even bother doing gaming hardware keynotes at that point?

Nvidia might be the biggest megacorp in this space, but its contemporaries show similar gaming red flags on their balance sheets. AMD made just over $9 billion this past quarter, but $4.3 billion of that was from data center sales while only $1.3 billion came from gaming. That’s much better than last year—when data centers brought in $3.5 billion and gaming just $462 million—but data centers are still a far bigger portion of AMD’s revenue than gaming.

These numbers make a compelling case for pulling back on gaming hardware development. That doesn’t mean these companies are going to stop making gaming GPUs entirely. (Or does it?) But if you’re Jensen Huang facing off against shareholders who are demanding the revenue numbers go up as much as possible as fast as possible, what are you going to sell them on: a new gaming GPU with historically low margins, or a new generation of data center hardware to feed the accelerating AI bubble with untold potential? You could even argue that Nvidia’s increasing focus over the past few years on DLSS and ray tracing over pure rasterization performance is an early sign of it putting its eggs in the data center basket.

A canary in the RAM mines

The biggest side effect of all these new data center builds hasn’t been GPU scarcity, surprisingly.
(At least, not to the degree we saw during the cryptocurrency craze.) Rather, it’s skyrocketing memory prices. RAM kits have increased in price by over 200 percent in some cases, making large-capacity kits more costly than top-tier GPUs. Some modest RAM options are even more expensive than gaming consoles.

Consumer RAM is shooting up in price because all the major memory manufacturers are inundated with orders for data center memory, like HBM and LPDDR. Some have begun pivoting their fabrication lines to these higher-margin memory types, leaving less capacity for consumer DRAM and NAND chips—and, consequently, causing shortages of consumer memory and SSDs. Those shortages are making RAM and SSDs far more expensive.

And yet, despite the increased margins and demand outstripping supply, Micron just closed its Crucial brand of consumer RAM and SSDs. It was profitable, it was popular, and it had a distinct market niche that served consumers and gamers for decades. But even Micron didn’t see the point of keeping it going when it could instead make heaps more cash from selling Micron NAND chips and server memory. And if Micron is so willing to pull out of the consumer space due to AI-driven demand, what’s stopping Nvidia from reaching the same conclusion?

For further proof of this future, Nvidia is rumored to be cutting its gaming GPU supply in 2026 due to memory shortages. It’s especially notable that Nvidia appears to be cutting the more affordable mid-range graphics cards first, leaving ultra-budget and ultra-high-end lines intact for now. Is this just the first step in Nvidia leaving gamers behind?

Where things could go from here

There are some intriguing comparisons to make between Nvidia and other big businesses that found growth and revenue in avenues that weren’t where they started. IBM went from being the name in computing hardware to one that largely runs in the background.
It sold off its core hardware businesses and became a software and services company that’s still worth tens of billions of dollars. It later spun off again, creating a separate company to handle IT services while the core business refocused on cloud computing and AI. Nvidia could do that: spin or sell off its gaming divisions and license its GPU technology to the resulting subsidiary.

Notice the lack of graphics cards in this Nvidia promo image.

Perhaps Nvidia could even end up like Adobe. In the mid-2010s, the developer of Photoshop launched Creative Cloud and slowly pushed all its once-in-perpetuity software licenses into a subscription model that continues today. Could that apply to Nvidia’s GeForce Now streaming service? It had 25 million subscribers as of 2023 and runs on GPUs designed for data center server racks. Nvidia could leave dedicated desktop and laptop GPUs behind entirely and pivot its gaming divisions into software- and hardware-as-a-service businesses. If gaming goes the way of TV and movie streaming, Nvidia could even pull a Netflix and slowly de-emphasize its DVD-like hardware business in favor of powering everything from the cloud.

Gaming won’t die, but it will change

As heavy as this article is on doom, Nvidia is unlikely to exit gaming entirely. People want to play games and there’s money to be made there, so someone will keep tapping that market. But how that revenue is extracted may change—dramatically so. Microsoft is already talking about making the next Xbox more of a PC/console hybrid. And with the latest Xbox consoles being the third wheel of this generation, it wouldn’t be a surprise to see the future of Xbox focus more on streaming games than buying and owning them. Xbox Game Pass already has over 37 million subscribers—more than the number of Xbox Series X/S consoles sold this generation.

Nvidia could do something similar. Or it could spin off.
Or it could stop making gaming GPUs entirely. The only thing we know for sure is this: when a gaming company starts making astronomical amounts of money from AI-driven demand, it’s hard to imagine it won’t be tempted to go all-in on AI at the expense of gaming.

Further reading: PC vs. consoles? Gaming’s future is blurrier than ever