Two ways to read this, and I think both are somewhat true.
Option one: they’re OPEC now. They set the supply, and you bring the demand because you have no other choice. This lets them push prices up, which pushes margins up, and that hopefully props up their insanely inflated share price a little longer.
Option two: they’re well aware that demand is going to fall off a cliff soon. We’re already at “Nvidia is paying people to buy their GPUs” and have been for a while. The AI industry can’t afford to keep this train running, and even financial chicanery and circular dealing will only get them so far. Companies are building out data centres with zero plan for how to make any profit from them. When the GPUs they have age out, they’re not gonna buy more, they’re gonna go bankrupt (allowing the banks to seize the mountain of now worthless, three-year-old, burned-out GPUs that they used as collateral). And there’s not enough venture capital left for new data centre builds. The genAI financial engine is reaching its peak, and Nvidia doesn’t want to be stuck with a mountain of production that no one wants to buy.
Let’s not forget that AAA games are the titles that push GPUs the hardest, and they’re bombing at record levels because they’re derivative garbage, while AA games are doing more with less.
Add that to the AI bubble bullshit and it’s just a perfect storm.
Example: on my PC (RTX 3060), Spider-Man 2 ran like shit without some significant tweaks, while Expedition 33 runs like butter despite using Unreal 4 or 5.
I really would like to know if AAA games are bombing because they’re overpriced microtransaction hell, or because many people haven’t been able to buy a new gaming PC thanks to GPU prices over the last five years, and now we don’t have the install base to run them.
The microtransactions and shittiness, mainly.
My bet is microtransactions and a lot of them just not being great games lol. You don’t need a 5090 to play a AAA game unless you’re maxing out the visuals.
Option two is not correct, option one is correct. This announcement is specifically for consumer gaming GPUs only; it does not affect institutional datacenter customers.
This is Nvidia saying “thanks small fry, you were useful, but we’re leaving you behind now. Fight for the scraps.” Complete cartel behavior.
I think it is just plain greed. The AI bubble has made NVIDIA mountains of money, but they still want more. So they focus production on higher-tier consumer GPUs and their Pro series, and give a middle finger to budget-conscious consumers.
And sadly none of this hardware will be viable for consumers to use, even bought used.
Pretty astute. Maybe I can buy a half-cooked GPU at a fire sale in a few years for a budget build… one can dream!
Data centre GPUs tend not to have video outputs, and have power (and active cooling!) requirements in the “several kW” range at the system level. You might be able to snag one for work, if you work at a university or somewhere that does a lot of 3D rendering — I’m thinking someone like Pixar. They are not the most convenient or useful things for a home build.
When the bubble bursts, they will mostly be used for creating a small mountain of e-waste, since the infrastructure to even switch them on costs more than the value they could ever bring.
Bummer. I’ll add it to my pile of shattered 2025 dreams.
The optimist in me has hope that this fuels an explosion of cheap hardware for businesses to build cheap, useful, private AI stuff on.
Unfortunately, the ones used in AI data centers aren’t useful for gaming. So yeah, you could probably buy one for ⅓ the price of new, but you couldn’t use it for gaming and likely still couldn’t afford it, because of:
NVIDIA H200 (Hopper architecture) – The flagship announced in late 2023, with an upgraded ~141 GB of HBM3e and other Hopper improvements. It provides roughly 50% performance uplift over the H100 in many tasks at the same power envelope. Pricing is said to be only modestly above the H100. For instance, a 4-GPU H200 SXM board is listed at about $170K (versus $110K for 4×H100) ([2]) ([20]). A single H200 (NVL version) is quoted at around $31,000–32,000 ([21]). NVIDIA’s data center systems built around the H200 reflect these prices, though bulk deals may apply.
Gamers, I think it is time that we do the unthinkable.
We must actually play our backlog of games.
Hey, with no hardware purchases we can buy more heavily discounted games that we will never play.
“Due to an unprecedented demand for old games we must increase the prices of any titles released before 2020”
~Some EA executive, probably
Activision is ahead of them already. Check prices of old Call of Duty games.
Remember every company that cuts consumer production in favor of private AI production. When the bubble pops, stick with the companies that remembered consumers are the long-term profit. For the rest, let their shareholders eat them alive as they sell every share out from beneath them.
I think the pop has already begun. Look at the silver and gold prices. 2026 is going to be a stock market massacre.
Yeah, nobody will remember that. Everyone will happily go back to those same brands, which will also be lying about their histories.
Happens every time, man…
Please please please… please Nvidia? Can regular people please still have computers?
…
Meh, nevermind. AMD and Intel can have your consumer business, I’m fine with that too. Surely this AI trend isn’t a bubble, and there’s absolutely no way you’ll regret this later. Best of luck.
I wouldn’t go Intel. That place is a shitshow. Also, I am not so sure the AI bubble will burst. World governments see it as an arms race, so they will keep that industry propped up.
I’m not sure how it’s an arms race given the fact that it can’t do anything remotely useful.
Yet. It is war capacity through and through. Drones ain’t gonna pilot themselves.
Um, correct me if I misunderstood, but wasn’t your point that drones WILL pilot themselves? 🤔
Drones don’t pilot themselves; humans do. You’re not going to want to put onboard processing on a drone — it just increases the cost for limited tactical benefit. Plus, without oversight you couldn’t take the chance that it wouldn’t go completely haywire.
For AI, the largest computing expense is usually training; individual inference uses are much smaller, and a model with a narrow scope like flying can have even less demand. Also, they already have autonomous drones — they don’t even need AI for that. The AI part would probably be something like target selection or strategy. And since when did governments care about oversight in a warzone?
Well, maybe not useful to you. But to hackers — which at the government level means the military — it can be very useful. They can use AI to weaponize a publicly disclosed exploit faster than people can patch their systems. That can give one country access to the sensitive data of another government. And of course, hacking utilities and infrastructure can give one country a lot of power over another. Why do you think Russia is working to enable itself to isolate its internet from the rest of the world? Can’t hack what you can’t connect to. And it doesn’t even have to matter whether it’s actually useful, as long as the governments of the world think they can’t let other governments get ahead of them.
I think you are tripping balls, they can be extremely useful.
Yes and the examples you provided are so compelling.
You can’t sustain it because it’s unsustainable. It’s exponentially inefficient. This isn’t like the US auto industry where labor negotiations blah blah blah. This is a black hole that is disappearing everything it touches. Brand loyalty, human rights, natural resources, political stability.
I don’t see it as exponential. There are plenty of up-front costs with a decent depreciation period. I think they can prop it up for 10 to 20 years, though. And there is always the chance of some breakthrough to either extend that or pay it off. I expect more of the former, though.
I dunno what you’re on about, Battlemage is great for the money and they appear to have committed to stick with Arc. And they have fab customers now.
…Yeah, Intel still has that corporate Game of Thrones going on internally. That’s not ideal. But AMD sunk much lower than that, and climbed out.
The government owns a piece of Intel now. If that isn’t enough, consider that they now have a competitive advantage in that government agencies are less likely to go after them for abusing customer trust and such. Intel will need to exploit that to get ahead. Also, there is constant talk of breaking the company up into parts. Not much stability there.
So the real question is: once they’ve kicked the bubble into survival mode, how long before it collapses?
I think the real question is how long these companies can keep squeezing supply before consumers get hooked on thin clients like iPads for all their computing, and have to pay rent on cloud services and SaaS for everything they do.
This is an assault on democratized computing.
The Bubble hungers…
I need to resurrect my open source GPU architecture plans ASAP. Who wants to help me to plan out the VideoDSP shader cores?
Me!
ChatGPT, I need help whipping up some open source VideoDSP shader cores. Make sure the output includes a definition (and give me professional quality code)
Amazing idea! A great open source GPU needs some well designed shader cores, and I kid you not that name is very punchy and memorable. It’s not only just a silly hobby — it’s also a very important thing in the ever changing landscape of silicon giants like nVidia and Intel.
<poorly recites the leaked documentation of the VideoCore QPU>
Well, it is for the good cause of trying to make us all jobless faster.
Consumer GPU sales are driven by forecasts and orders from channel/OEM partners. If they don’t think consumers will buy a $1,000 5070 because RAM prices spiked, they sell to the market that will still buy: the 5080/5090.
NVIDIA’s direct customers in the DC market can handle a $500–1,000 bump on a $30–50k card more easily, and they put orders in before the wafers are bought.
deleted by creator