I have been rocking an AMD GPU ever since the drivers were upstreamed into the Linux kernel. No regrets.
I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn't worth my time or energy. But consumers gotta consoooooom, and then cry and get outraged when they are exploited instead of just walking away and doing something else.
Same with Magic: The Gathering; the game went to shit and so many people got outraged and in a big huff, but they still spend thousands on the hobby. I just stopped playing MTG.
I'm not saying they all got together and decided this, but their wonks are probably all saying the same thing. The market is shrinking, and whether that's by design or incompetence, it creates a new opportunity to acquire it wholesale for pennies on the dollar, build a wall around it, and charge for entry. It's the natural result of games requiring Nvidia developers for driver tuning, of Bitcoin/AI, and of buying out capacity to prevent competitors.
The wildcard I can't fit into this puzzle is Valve. They have a huge opportunity here but they also might be convinced that they have already saturated the market and will read the writing on the wall.
> This in turn sparked rumors about NVIDIA purposefully keeping stock low to make it look like the cards are in high demand to drive prices. And sure enough, on secondary markets, the cards go way above MSRP
Nvidia doesn't earn more money when cards are sold above MSRP, but they get almost all the hate for it. Why would they set themselves up for that?
Scalpers are a retail-wide problem. Acting like Nvidia has the insight or ability to prevent them is just silly. People may not believe this, but retailers hate it as well and spend millions of dollars trying to combat it. They would have sold the product either way, but scalping results in the retailer's customers being mad and becoming some other company's customers, which are both major negatives.
Five or maybe ten years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks all over can barely tell High from Ultra settings, DLSS from FSR, or DLSS FG from Lossless Scaling. There's just no point competing at the $500 price point any more, so Nvidia has largely given up there, ceding it to the AMD-built consoles and integrated graphics like AMD's APUs, which offer good value at the low end, mid range, and high end.
Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.
Microsoft fails consistently ... even when handed a lead on a plate... it fails, but these failures are eventually corrected for by the momentum of its massive business units.
Apple is just very very late... but this failure can be eventually corrected for by its unbeatable astroturfing units.
Perhaps AMD is too small to keep up everywhere it should. But compared to the rest, AMD is a fast follower. Why Intel is where it is is a mystery to me, but I'm quite happy about its demise and failures :D
Being angry about NVIDIA is not giving enough credit to NVIDIA for being on-time and even leading the charge in the first place.
Everyone should remember that NVIDIA also leads into the markets that it dominates.
It's hard to get too offended by them shirking the consumer market right now when they're printing money with their enterprise business.
For as long as they have competition, I will support those companies instead. If they all fail, I guess I will start one. My spite for them knows no limits
So gamers have to pay much more and wait much longer than before, which they resent.
Some YouTubers make content that profits from the resentment, so they play fast and loose with the underlying reasons in order to make gamers even more resentful. Nvidia has "crazy prices," they say.
But they're clearly not crazy. $2,000 GPUs show up in quantities of 50+ from time to time at stores here, but they sell out in minutes. Lowering the prices would be crazy.
Liars or not, the performance has not been there for me in any of my use cases, from personal to professional.
A system from 2017/2018 with an 8700K and an 8GB 2080 performs so close to today's top-end, expensive systems that it makes almost no sense to upgrade at MSRP+markup unless your system is older than that.
Unless you need specific features only on more recent cards, there are very few use cases I can think of needing more than a 30 series card right now.
I hope they get hit with a class action lawsuit and are forced to recall and properly fix these products before anyone dies as a result of their shoddy engineering.
I found it super alarming, because why would they fake something on stage to the extent of just lying? I know Steve Jobs had backup phones, but claiming a robot is autonomous when it isn't, I feel, is scammy.
It reminded me of when Tesla had remote-controlled Optimus bots. I mean, I think that's awesome, like super cool, but clearly the users thought the robots were autonomous during that dinner party.
I have no idea why I seem to be the only person bothered by "stage lies" to this degree. Tbh, even the Tesla bots weren't claimed to be autonomous, so I probably should never have mentioned them, but it explains the "not real" vibe.
Not meaning to disparage, just explaining my perception as a European. Maybe it's just me, though!
EDIT > I'm kinda surprised by the weak arguments in the replies. I love both companies; I am just offering POSITIVE feedback, that it's important (in my eyes) to be careful not to pretend in certain specific ways, or it makes the viewer question the foundation (which we all know is SOLID and good).
EDIT 2 > There actually is a good rebuttal in the replies. Although apparently I have "reading comprehension skill deficiencies," it's just my POV that they were insinuating the robot was aware of its surroundings, which is fair enough.
The failure rate is just barely acceptable in a consumer use-case with a single card, but with multiple cards the probability of failure (which takes down the whole machine, as there's no way to hot-swap the card) makes it unusable.
I can't otherwise see why they'd persevere on that stupid connector when better alternatives exist.
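To put rough numbers on that intuition, here is a minimal sketch in Python; the per-card failure probability is an assumption chosen purely for illustration, and failures are treated as independent.

```python
# Illustration only: the per-card annual failure probability below is an
# assumption, not a measured number. The point is that whole-machine
# failure probability grows quickly with card count when any single
# card failure takes the machine down.

def machine_failure_prob(per_card_p: float, num_cards: int) -> float:
    """P(at least one card fails) assuming independent failures."""
    return 1.0 - (1.0 - per_card_p) ** num_cards

for n in (1, 4, 8):
    print(n, "cards:", round(machine_failure_prob(0.03, n), 3))
# 1 cards: 0.03
# 4 cards: 0.115
# 8 cards: 0.216
```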
What's so special about NVENC that Vulkan video or VAAPI can't provide?
> AMD also has accelerated video transcoding tech but for some reason nobody seems to be willing to implement it into their products
OBS works with VAAPI fine. Looking forward to them adding Vulkan video as an option.
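For what it's worth, the AMD encode path needs nothing NVENC-specific; here's a minimal sketch that just shells out to ffmpeg's VAAPI encoder (assumptions: an ffmpeg build with VAAPI enabled, an AMD render node at /dev/dri/renderD128, and a local input.mp4).

```python
# Minimal sketch: hardware-accelerated H.264 transcode via VAAPI.
# Paths and filenames are assumptions for illustration.
import subprocess

subprocess.run([
    "ffmpeg",
    "-vaapi_device", "/dev/dri/renderD128",  # render node for the GPU
    "-i", "input.mp4",
    "-vf", "format=nv12,hwupload",           # upload frames to a GPU surface
    "-c:v", "h264_vaapi",                    # VAAPI H.264 encoder
    "output.mp4",
], check=True)
```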
Either way, as a Linux gamer I haven't touched Nvidia in years. AMD is a way better experience.
This is wrong. The 50 series uses 12V-2x6, not 12VHPWR. The 30 series was the first to use 12VHPWR, the 40 series was the second to use 12VHPWR and the first to use 12V-2x6, and the 50 series was the second to use 12V-2x6. The female connectors are what changed in 12V-2x6; the male connectors are identical between 12V-2x6 and 12VHPWR.
Nvidia's been at this way longer than 7 years. They were cheating at benchmarks to control a narrative back in 2003. https://tech.slashdot.org/story/03/05/23/1516220/futuremark-...
It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc. at a price point of $3,000 just for the GPU: who cares, when a 2010 cel-shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either; it seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).
Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.
Deceptive marketing aside, it's true that it's sad we can't get 4K 60 Hz with ray tracing on current hardware without some kind of AI denoising and upscaling, but ray tracing is really just _profoundly_ hard, so I can't really blame anyone for not having figured out how to put it in a consumer PC yet. There's a reason Pixar movies need huge render farms that take a lot of time per frame. We would probably sooner get Gaussian splatting and real-time diffusion models in games than nice full-resolution ray tracing, tbh.
If you want to hate on Nvidia, there'll be something for you in there.
An entire section on 12vhpwr connectors, with no mention of 12V-2x6.
A lot of "OMG monopoly" and "why won't people buy AMD" without considering that maybe... AMD cards are not considered by the general public to be as good _where it counts_ (like performance per watt, a.k.a. heat). Maybe it's all perception, but then AMD should work on that perception. If you want the cooler CPU/GPU, the perception is that that's Intel/Nvidia. That's reason enough for me, and many others.
Availability isn't great, I'll admit that, if you don't want to settle for a 5060.
Open is good, but the open standard itself is not enough. You need some kind of testing/certification, which is built into the G-Sync process. AMD does have a FreeSync certification program now, which is good.
If you rely on just the standard, some manufacturers get really lazy. One of my screens technically supports FreeSync but I turned it off day one because it has a narrow range and flickers very badly.
* The prices for Nvidia GPUs are insane. For that money you can have an extremely good PC with a good non-Nvidia GPU.
* The physical GPU sizes are massive; even letting the card rest on a horizontal motherboard looks scary.
* Nvidia still has issues with melting cables? I heard about those some years ago and thought it was a solved problem.
* Proprietary frameworks like CUDA and others are going to fall at some point; it's just a matter of time.
It looks as if Nvidia at the moment is only looking at the AI market (which, as a personal belief, has to burst at some point) and simply does not care about the non-AI GPU market at all.
I remember, many many years ago when I was a teenager and 3dfx was the dominant graphics card manufacturer, John Carmack prophetically predicting in a gaming computer magazine (the article was about Quake I) that the future wasn't going to be 3dfx and Glide. Some years passed and, sure enough, 3dfx was gone.
Perhaps this is just the beginning of the same story that played out with 3dfx. I think AMD and Intel have a huge opportunity to balance the market and bring Nvidia down, both in the AI and gaming spaces.
I have only heard excellent things about Intel's Arc GPUs in other HN threads, and if I need to build a new desktop PC from scratch there's no way I'm paying the prices Nvidia is pushing on the market; I'll definitely look at Intel or AMD.
Each year those performance margins seem to narrow. I paid $1,000+ for my RTX 4080 Super. That's ridiculous. No video card should cost over $1,000. So the next time I "upgrade," it won't be NVIDIA. I'll probably go back to AMD or Intel.
I would love to see Intel continue to develop video cards that are high-performance and affordable. There is a huge market for those unicorns. AMD's model seems to be slightly less performance for slightly less money. Intel, on the other hand, is offering performance on par with AMD and sometimes NVIDIA for far less money, a winning formula.
NVIDIA got too greedy. They overplayed their hand. Time for Intel to focus on development and fill the gaping void in price-for-performance.
Here's another Nvidia/Mellanox BS problem: many MLX NICs are finalized or post-assembled by, say, HP. So if you have an HP "Mellanox" NIC, Nvidia washes their hands of anything detailed. It's not ours; HP could have done anything to it, what do we know? So you phone HP ... and they have no clue either, because it's really not their IP or their drivers.
It's a total cluster-bleep, and more and more an example of why corporate America sucks.
Every line of the article convinces me I'm reading bad rage bait, every comment in the thread confirms it's working.
The article provides a nice list of grievances from the "optimized YouTube channel tech expert" sphere ("doink" face and arrow in the thumbnail or GTFO), and none of them really stick. Except for the part where Nvidia is clearly leaving money on the table... From the 5080 up no one can compete, with or without "fake frames," at any price. I'd love to take the dividends on the sale of the top 3 cards, but that money is going to scalpers.
If nvidia is winning, it's because competitors and regulators are letting them.
I feel like this is a misunderstanding, though I admit I'm splitting hairs here. DLSS is a form of TAA, and so is FSR and most other modern upscalers. You generally don't need an extra antialiasing pipeline if you're getting an artificially supersampled image.
We've seen this technique variably developed across the lifespan of realtime raster graphics: first with checkerboard rendering, then TAA, and now DLSS/frame generation. It has upsides and downsides, and some TAA implementations were actually really good for the time.
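To make the shared core of these methods concrete, here's a heavily simplified sketch in Python (an assumption-laden toy, not any vendor's implementation): it shows only the temporal accumulation step, leaving out the motion-vector reprojection, jitter sequencing, and history-rejection heuristics where the real engineering, and DLSS's learned model, come in.

```python
import numpy as np

def temporal_accumulate(history: np.ndarray, current: np.ndarray,
                        alpha: float = 0.1) -> np.ndarray:
    """Exponential moving average of frames: the heart of TAA-style methods.

    `history` is the (reprojected) accumulated image from previous frames,
    `current` is this frame's jittered sample. Real implementations also
    reproject with motion vectors and clamp/reject stale history; DLSS
    replaces those hand-tuned heuristics with a learned model.
    """
    return (1.0 - alpha) * history + alpha * current

# Toy usage: accumulate noisy renders of a constant 0.5-grey image.
rng = np.random.default_rng(0)
accum = rng.random((4, 4))
for _ in range(100):
    noisy_frame = 0.5 + 0.1 * rng.standard_normal((4, 4))
    accum = temporal_accumulate(accum, noisy_frame)
print(accum.round(2))  # converges toward ~0.5 everywhere
```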
He actually ended up buying older but somewhat similar used hardware with his personal money, to be able to do his work.
Not even sure if he was eventually able to expense it, but I wouldn't be surprised if not, knowing how big-company bureaucracy works...
Otherwise the money is in the datacenter (AI/HPC) cards.
My most recent upgrade was to a 4090, but that only gives me 24GB of VRAM, and it's too expensive to justify buying two of them. I also have an antique Kepler datacenter GPU, but Nvidia cut driver support a long while ago, making the software quite a pain to get sorted. There's a nonzero chance I will wind up importing a Moore Threads GPU for my next purchase; Nvidia's just way too expensive, and I don't need blazing-fast speeds given that most of my workloads run well inside the time I'm sleeping. But I can't be running at the speed of a CPU; I need everything to fit into VRAM. I'd alternately be stoked for Intel to cater to me: $1500, 48GB+ VRAM, good PyTorch support; make it happen, somebody.
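As an aside on why "fits in VRAM" is the binding constraint, here's a rough back-of-the-envelope sketch; the parameter counts, byte widths, and the 1.2x overhead factor are assumptions, and it only counts weights (no KV cache or activations).

```python
def weights_vram_gb(num_params_billion: float, bytes_per_param: float,
                    overhead: float = 1.2) -> float:
    """Very rough VRAM estimate for holding model weights at inference time."""
    return num_params_billion * 1e9 * bytes_per_param * overhead / 1024**3

# fp16 weights (2 bytes/param) vs 4-bit quantized (0.5 bytes/param)
for params_b in (7, 13, 30, 70):
    print(f"{params_b}B params: fp16 ~{weights_vram_gb(params_b, 2):.0f} GB, "
          f"4-bit ~{weights_vram_gb(params_b, 0.5):.0f} GB")
```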
https://stockanalysis.com/stocks/nvda/metrics/revenue-by-seg...
I have a 4070 Ti right now. I use it for inference and VR gaming on a Pimax Crystal (2880x2880x2). In War Thunder I get ~60 FPS. I’d love to be able to upgrade to a card with at least 16GB of VRAM and better graphics performance… but as far as I can tell, such a card does not exist at any price.
Idiots doing hardware installation, with zero experience, using 3rd party cables incorrectly, posting to social media, and youtubers jumping on the trend for likes.
These are 99% user error issues drummed up by non-professionals (and, in some cases, people paid by 3rd party vendors to protect those vendors' reputation).
And the complaints about transient performance issues with drivers, drummed up into apocalyptic scenarios, again, by YouTubers who are putting this stuff under a microscope for views, are universal across every single hardware and software product. Everything.
Claiming "DLSS is snake oil" and similar things is just an expression of the complete lack of understanding of the people involved in these pot-stirring contests. Like... the technique obviously couldn't magically multiply the hardware's ability to generate frames using the primary method. It is exactly as advertised: it uses machine learning to approximate it. And it's some fantastic technology that is now ubiquitous across the industry. Support and quality will increase over time, just like with every _quality_ hardware product early in its lifespan.
It's all so stupid and rooted in greed by those seeking ad-money, and those lacking in basic sense or experience in what they're talking about and doing. Embarrassing for the author to so publicly admit to eating up social media whinging.
This isn't true. People were buying brackets with 10 series cards.
It's like they're purposely not selling to consumers because they've allocated 80% of their production to enterprise only.
I just hope the new fabs come online as early as possible, because these prices are insane.
Of course the fact that we overwhelmingly chose the better option means that… we are worse off or something?
It became obvious when old e-waste Xeons were turned into viable, usable machines, years ago.
Something is obviously wrong with this entire industry, and I cannot wait for it to pop. THIS will be the excitement everyone is looking for.
Not for me. I prefer Intel offerings. Open and Linux friendly.
I even hope they release the next-gen RISC-V boards with Intel graphics.
And you can build mythologies around falsehoods to further reinforce it: "I have a legal obligation to maximize shareholder value." No buddy, you have some very specific restrictions on your ability to sell the company to your cousin (ha!) for a handful of glass beads. You have a legal obligation to bin your wafers the way it says on your own box, but that doesn't seem to bother you.
These days I get a machine like the excellent ASUS ProArt P16 (grab one of those before they're all gone if you can) with a little 4060 or 4070 in it that can boot up PyTorch and make sure the model will run forwards and backwards at a contrived size, and then go rent a GB200 or whatever from Latitude or someone (seriously, check out Latitude, they're great), or maybe one of those wildly competitive L40-series Fly machines (Fly whips the llama's ass like nothing since Winamp, check them out too). The GMKtec EVO-X1 is a pretty capable little ROCm inference machine for under $1,000; its big brother is nipping at the heels of a DGX Spark at under 2k. There is good stuff out there, but it's all from non-incumbent angles.
I don't game anymore but if I did I would be paying a lot of attention to ARC, I've heard great things.
Fuck the cloud and their ancient Xeon SKUs that cost more than Latitude charges for 5 GHz EPYC. Fuck the NVIDIA gaming retail rat race; it's an electrical as well as a moral hazard in 2025.
It's a shame we all have to be tricky to get what used to be a halfway fair deal 5-10 years ago (and 20 years ago they passed a HUGE part of the scaling bonanza down to the consumer), but it's possible to compute well in 2025.
The spoiled gamer mentality is getting old for those of us that actually work daily in GPGPU across industries, develop with RTX kit, do AI research, etc.
Yes, they've had some marketing and technical flubs, as any giant publicly traded company will have, but their balance of research-driven development alongside corporate profit necessities is unmatched.
Whoa, the stuff covered in the rest of the post is just as egregious. Wow! Maybe it's time to figure out which AMD model compares performance-wise and sell this thing, jeez.
Customers don’t matter, the company matters.
Competition would sort out that attitude quick smart, but AMD never misses a chance to copy Nvidia's strategy in every way, and Intel is well behind.
So for now, you’ll eat what Jensen feeds you.
If all game developers begin to rely on NVIDIA technology, the industry as a whole puts customers in a position where they are forced to give in.
The public's perception of RTX's softwarization (DLSS), and NVIDIA coining the technical terms, says it all.
They have a long term plan, and that plan is:
- make all the money possible
- destroy all competition
- vendor lock the whole world
When I see that, I can't help but think something is fishy.
In other news, I hope Intel pulls their thumb out of their ass, 'cause AMD is crushing it and that's gonna end the same way.
you are safe.
I guess the author is too young and didn't go through the iPhone 2G to iPhone 6 era. It's also worth remembering that it wasn't too long ago that Nvidia was sitting on nearly ONE full year of unsold GPU stock. That completely changed how Nvidia does supply chain management and forecasting, which unfortunately had a negative impact all the way into the 50 series. I believe they have since adjusted, and next gen should be better prepared. But you can only do so much when AI demand is seemingly unlimited.
>The PC, as gaming platform, has long been held in high regards for its backwards compatibility. With the RTX 50 series, NVIDIA broke that going forward. PhysX.....
Glide? What about all the audio driver APIs before that? As much as I wish everything were backward compatible, that is just not how the world works. Just like with any old game, you need some fiddling to get it to work. And they even made the code available, so people can actually do something with it rather than resort to emulation or reverse engineering.
>That, to me, was a warning sign that maybe, just maybe, ray tracing was introduced prematurely and half-baked.
Unfortunately that is not how it works. Do we want to go back through the period from pre-3dfx to today and count how many things we thought were great ideas for 3D accelerators, only for them to be replaced by better ideas or implementations? These ideas were good on paper but didn't work well. We then learn from them and iterate.
>Now they’re doing an even more computationally expensive version of ray tracing: path tracing. So all the generational improvements we could’ve had are nullified again......
How about: path tracing is simply a better technology? Game developers also don't have to use any of this tech; the article acts as if Nvidia forces every game to use it. Gamers want better graphics quality, and artists and graphics assets are already by far the most expensive item in game development, with costs still rising. Hardware improvements are what allow those results to be achieved at lower cost (to game developers).
>Never mind that frame generation introduces input lag that NVIDIA needs to counter-balance with their “Reflex” technology,
No, that is not why "Reflex" tech was invented. Nvidia spends R&D on 1000 fps monitors as well, and potentially on sub-1ms frame times. They have always been latency sensitive.
------------------------------
I have no idea how modern gamers became what they are today, and this isn't the first time I have read this sort of thing, even on HN. You don't have to buy Nvidia. You have AMD and now Intel (again). Basically I can summarise it in one sentence: gamers want Nvidia's best GPU for the lowest price possible, or a price they think is acceptable, without understanding market dynamics or anything about supply chains and manufacturing. They also want bigger "generational" performance gains, like 2x every two years. And if they don't get it, it is Nvidia's fault. Not TSMC, not Cadence, not Tokyo Electron, not Isaac Newton or the laws of physics. Nvidia.
Nvidia's PR tactics aren't exactly new in the industry. Every single brand does something similar. Do I like it? No. But unfortunately that is how the game is played. And Apple is by far the worst offender.
I do sympathise with the cable issue, though, and it's not the first time Nvidia has had thermal issues. But then again, they are also the ones constantly pushing the boundary forward, and AFAIK the issue isn't as bad as on the 40 series, though some YouTubers seem to be making a bigger deal of it than most. Supply will get better, but TSMC 3nm is fully booked. The only possible solutions would be to make consumer GPUs less capable at AI workloads, or to keep AI GPUs on the leading-edge node and consumer GPUs always a node behind, to split the capacity problem. I would imagine that is part of the reason why TSMC is accelerating its 3nm capacity build-out on US soil. Nvidia is now also large enough, and has enough cash, to take on more risk.
The lack of open source anything for GPU programming makes me want to throw my hands up and just do Apple. It feels much more open than pretending that there's anything open about CUDA on Linux.
I honestly don't know why nvidia didn't just suspend their consumer line entirely. It's clearly no longer a significant revenue source and they have thoroughly destroyed consumer goodwill over the past 5 years.
- 12VHPWR is not at fault / the issue. As the article itself points out, the missing power-balancing circuit is to blame. The 3090 Ti had both 12VHPWR and the balancing circuit and ran flawlessly (the arithmetic behind this is sketched after this list).
- Nvidia G-Sync: total non-issue. Native G-Sync is dead. Since 2023, ~1,000 FreeSync monitors have been released, versus 3 (!!) native G-Sync monitors.
- The RTX 4000 series is not still expensive, it is again expensive. It was much cheaper in the year before the RTX 5000 release.
- Anti-Sag Brackets were a thing way before RTX 4000
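On the first point, the arithmetic behind the power-balancing concern is easy to sketch; the board power, pin count, and per-pin rating below are commonly cited ballpark figures, treated here as assumptions rather than spec quotes.

```python
# Rough per-pin current math for a 12VHPWR / 12V-2x6 style connector.
# Numbers are assumptions for illustration: ~575 W board power, 12 V rail,
# 6 current-carrying 12 V pins, ~9.5 A per-pin rating.
board_power_w = 575.0
rail_voltage_v = 12.0
num_12v_pins = 6
per_pin_rating_a = 9.5

total_current_a = board_power_w / rail_voltage_v
ideal_per_pin_a = total_current_a / num_12v_pins
print(f"total: {total_current_a:.1f} A, ideal per pin: {ideal_per_pin_a:.1f} A")
# ~47.9 A total, ~8.0 A per pin -- already close to the rating even when
# perfectly balanced. Without per-pin sensing/balancing on the card, one bad
# contact can shift several amps onto a single pin and push it past its
# rating, which is why the missing balancing circuit (not the plug itself)
# gets the blame.
```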
> So 7 years into ray traced real-time computer graphics and we’re still nowhere near 4K gaming at 60 FPS, even at $1,999.
The guy is complaining that a product can't live up to his standard, while dismissing a barely noticeable trade-off that could make it possible, because it's «fake».
How could Nvidia realistically stop scalper bots?
Apparently AWS has them available in the P6 instance type, but the only configuration they offer has 2TB of memory and costs... $113/hr [2]? Like, what is going on at Nvidia?
Where the heck is Project Digits? Like, I'm developing this shadow opinion that Nvidia actually hasn't built anything new in three years, but they fill the void by talking about hypothetical newtech that no one can actually buy + things their customers have built with the actually good stuff they built three years ago. Like, consumers can never buy Blackwell because "oh Enterprises have bought them all up" then when Microsoft tries to buy any they say "Amazon bought them all up" and vice-versa. Something really fishy is going on over there. Time to short.