by TehCorwiz
6 subcomments
- If they do, it'll likely be part of an industry-wide push to kill off the home-built PC market. It's no secret that MS and others want the kind of ecosystem Apple has, and governments want more backdoor access to tech. And which manufacturer wouldn't want to eliminate partial upgrades and repairs? Imagine that the only PC you can buy one day has everything tightly integrated, with no user-serviceable or replaceable parts short of a high-end soldering lab. Now, since it's impractical to build your own, they can raise the purchase price beyond the reach of most people, and the industry finally realizes its rental-PC aspirations.
by sombragris
11 subcomments
- I doubt that this would ever happen. But...
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother to optimize for the low end anymore, and thus they end up gatekeeping their games and excluding millions of devices, because a discrete GPU is required even at the lowest settings.
by fxtentacle
5 subcomments
- I don’t think they can.
NVIDIA, like everyone else on a bleeding-edge node, has hardware defects. The chance goes up massively with large chips like modern GPUs. So you try to produce B200 cores, but some compute units come out faulty. You fuse them off and sell the chip as a cut-down part instead of scrapping it.
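Rough back-of-the-envelope (illustrative defect density, not a real foundry number): with a simple Poisson yield model, the fraction of dies that come out defect-free drops exponentially with die area,

    Y \approx e^{-D_0 A}

so at, say, D_0 = 0.2 defects/cm^2, an 8 cm^2 die yields e^{-1.6} \approx 20% clean parts, while a 2 cm^2 die yields e^{-0.4} \approx 67%. Most of the big dies have something wrong with them, which is exactly why being able to fuse off units and sell the rest matters.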
The gaming market allows NVIDIA to still sell partially defective chips. There’s no reason to stop doing that. It would only reduce revenue without reducing costs.
- AMD will be very happy if they do. They are already making great cards; I'm currently running an RX 7800 XT (or something like that), and it's amazing. Linux support is great too.
by ryandrake
2 subcomments
- It would be great if more GPU competitors entered the field instead of fewer. The current duopoly is pretty boring and stagnant, with prices high and each company sorta-kinda doing the same thing and milking their market.
I'm kind of nostalgic for the Golden Age of graphics chip manufacturers 25 years ago, when we still had NVIDIA and ATI, but also 3DFX, S3, Matrox, PowerVR, and even smaller players, all doing their own thing, and there were so many options.
by resfirestar
4 subcomments
- This is just DRAM hysteria spiraling out to other kinds of hardware; it will age like fine milk, just like the rest of the "gaming PC market will never be the same" stuff. Nvidia has Amazon, Google, and others trying to compete with them in the data center. No one is seriously trying to beat their gaming chips. It wouldn't make any sense to give that up.
- It wouldn't be unheard of.
Qualcomm, before they made all the chips they do today, ran a pretty popular and successful email client called Eudora.
Doing one thing well can lead to doing bigger things well.
More realistically, if the top-end chips go towards the most demanding work, there might be more than enough lower-grade silicon to easily keep the gaming world going.
Plus, gamers rarely stop thinking in terms of gaming, and those insights helped develop GPUs into what they are today; they may have more light to shine in the future. Where gaming and AI come together, whether in truly immersive worlds or elsewhere, is pretty interesting.
Update: Adding https://en.wikipedia.org/wiki/Eudora_(email_client)
by whatshisface
0 subcomments
- If a $100B/year company closes a $1B/year division, they are doing something much worse than losing $1B/year: they are handing $1B/year of business to a new competitor, one that can grow to threaten the main business.
by cycomanic
3 subcomments
- What I don't really understand is why the big data centre operators destroy their old cards instead of selling them off. What are the downsides for them? Apart from the obvious upside, i.e. bringing in money, would it not also drive down the cost of brand-new cards? Nvidia can currently overcharge dramatically because there is such a shortage. If the data centre operators dumped large numbers of used cards on the market, would that not increase supply and drive down cost?
- I switched away from Nvidia around 2008 due to poor Linux support. Been on AMD ever since (now running the flagship model from a year ago, the 7900 XTX or whatever it's called).
Won't personally miss Nvidia, but we need competition in the space to keep prices 'reasonable' (although they haven't been reasonable for some years), and to push for further innovation.
- It means people get to enjoy more indie games with good designs, instead of having FOMO for cool graphics without substance.
- It remains to be seen, to be fair.
But if this does happen, it will, in my opinion, be the start of a slow death of the democratization of tech.
At best it means we're going to be relegated to last-gen tech, if even that, because this isn't a case of SAS vs SATA or U.2 vs M.2, but the raw tech itself (the chips).
by asdaqopqkq
0 subcomments
- Will leave a vacuum for Chinese companies to grab the whole market.
- I don't think they will. There is a reason why every GPU they make (gaming or not) supports CUDA. Future gamers are future CUDA developers. Taking that away would be an own goal.
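That pipeline is easy to underestimate: the same CUDA source a student compiles on a GeForce runs unchanged on a datacenter card. A minimal sketch (assumes the CUDA toolkit is installed; the kernel and names are just illustrative):

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Each thread adds one pair of elements.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host buffers.
        float* ha = (float*)malloc(bytes);
        float* hb = (float*)malloc(bytes);
        float* hc = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device buffers: identical calls on a gaming or datacenter GPU.
        float *da, *db, *dc;
        cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch: one thread per element, 256 threads per block.
        vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

        printf("c[0] = %f\n", hc[0]); // expect 3.0

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

The hardware scales but the programming model doesn't change, and that on-ramp is what NVIDIA would be giving up.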
- I've heard good things about Moore Threads. Who knows, maybe the consumer GPU market is not a duopoly after all; Nvidia exiting the market could be a good thing longer term by introducing more competition.
My general impression is that either US technology companies treat competition from China seriously and actively engage, or Chinese tech companies will slowly but surely eat their lunch.
There are numerous examples: the recent bankruptcy of iRobot, the 3D printer market dominated by Bambu Lab, the mini PC market where Chinese brands dominate.
- If NVIDIA exits the market, there is still AMD, Intel and PowerVR (Imagination Technologies is back at making discrete PC GPUs, although currently only in China).
- Is there any path for Microsoft and NVIDIA to work together and resurrect some sort of transparent SLI layer for consumer workloads? It’d take the pressure off the high end of the market a little and also help old cards hold value for longer, which would be a boon if, for example, your entire economy happened to be balanced on top of a series of risky loans against that hardware.
- I don't understand why most people in this thread think this would be such a big deal. It would not change the market in significantly negative or positive ways. AMD has been at their heels for a couple of decades and is more competitive than ever; they would simply fill NVIDIA's shoes. Most game consoles have been AMD-centric for a long time anyway, AMD has always been fairly dominant in the mid range, and they have a longstanding reputation for the best price/performance value for gamers.
Overall, I think AMD is more focused and energetic than their competitors right now. They are very close to overtaking Intel in their long CPU race, both in the datacenter and consumer segments, and Nvidia might be next in the coming 5 years, depending on how the AI bubble develops.
by butterknife
1 subcomment
- Is it better to go short on them or buy AMD?
by venturecruelty
1 subcomment
- Some of the pro-monopoly takes in this thread are mindblowing. We get precisely what we deserve.
- Look.
Most of the consumer market computes through their smartphones. The PC is a niche market now, and PC enthusiasts/gamers are a niche of a niche.
Any manufacturing capacity which NVIDIA or Micron devote to niche markets is capacity they can't use serving their most profitable market: enterprises and especially AI companies.
PCs are becoming terminals to cloud services, much like smartphones already are. Gaming PCs might still be a thing, but they'll be soldered-together, unexpandable black boxes. You want to run the latest games that go beyond your PC's meager capacity? Cloud stream them.
I know, I know. "Nothing is inevitable." But let's be real: one thing I've learned is that angry nerds can't change shit. Not when there's billions or trillions of dollars riding on the other side.
by webdevver
1 subcomment
- the pc gaming market is a hobbyist niche compared to the ongoing infrastructure projects.
i predict that the "pc" is going to be slowly but surely eaten bottom-up by increasingly powerful SoCs.
by wewewedxfgdf
0 subcomments
- AMD would do the same thing as Nvidia but $50 cheaper.
- Then Intel and AMD take what NVIDIA won’t.
by AtlasBarfed
0 subcomments
- Game graphics are still a high margin silicon business. Someone will do it.
Frankly, today's graphics chops are plenty strong for a decade of excellent games. The big push in the next couple of decades will probably be AI-generated content to make games bigger, more detailed, and more immersive.
by darubedarob
0 subcomments
- If only the AI features weren't degrading visuals.
by Wowfunhappy
2 subcomments
- I'm also curious what this could mean for Nintendo.
- Then Intel and AMD carry on. Tbh, having sewn up handhelds and consoles and made gaming on integrated graphics mainstream, many won't notice.
An AI bubble burst leaving loads of GPU-laden datacenters is much more likely to hasten cloud gaming.
by throwaway613745
1 subcomment
- I’ve always been an AMD customer because I’ve despised Nvidia’s business practices for 10+ years.
It would still suck if they left the market, because who does AMD have to compete with? Intel? LOL
Increased prices for everyone. Lovely. I can’t despise AI enough.
- We can really hope they do it, and fast!
That way they will not only burn the most goodwill but also get themselves entangled even more in the AI bubble - hopefully enough to go down with it.
- Looks like a hit piece to trigger some people into dumping their $NVDA stock. They worked phrases like "abandon" and "AI Bubble" into the title/subtitle. The author's other articles look like clickbait crap: https://www.pcworld.com/author/jon-martindale
- More children born?
- "If the AI bubble doesn't burst" is carrying an awful lot of water there…
by partomniscient
0 subcomments
- They probably won't. They'll just change things so their hardware becomes a subscription-style model rather than proper outright ownership by the purchaser, which is to a limited degree the case when it comes to their hardware drivers anyway.
Fuck this future.
- Shrug and buy the next best thing?
by SkyMarshal
2 subcomments
- I'm not sure it would matter. It doesn't seem that graphics are the limiting factor in games anymore. Plenty of popular games use variations on cartoon-style graphics, for example Fortnite, Overwatch, and Valorant. Gameplay, creativity, and player community seem to be the more determining factors.
That said, things like improved environmental physics and NPC/enemy AI might enable new and novel game mechanics and creative game design. But that can come from AMD and others too.