- Guys, please read the article. Yes, NVIDIA already sells servers. What they mean is that NVIDIA is also going to take over other parts of the system that the partners currently build.
> Starting with the VR200 platform, Nvidia is reportedly preparing to take over production of fully built L10 compute trays with a pre-installed Vera CPU, Rubin GPUs, and a cooling system instead of allowing hyperscalers and ODM partners to build their own motherboards and cooling solutions. This would not be the first time the company has supplied its partners with a partially integrated server sub-assembly: it did so with its GB200 platform when it supplied the whole Bianca board with key components pre-installed. However, at the time, this could be considered as L7 – L8 integration, whereas now the company is reportedly considering going all the way to L10, selling the whole tray assembly — including accelerators, CPU, memory, NICs, power-delivery hardware, midplane interfaces, and liquid-cooling cold plates — as a pre-built, tested module.
- I talked to someone at Nvidia around 2019 or 2020, and their plan at the time was to vertically integrate completely and sell compute as a service via their own managed data centers, with their own software, drivers, firmware, and hardware. So this seems like just another incremental step in that direction.
- Nvidia already sells servers?
What I don't really get is that Nvidia is worth something like $4.5T on $130B of revenue. If they want to sell servers, why don't they just buy Dell or HP? If they want CPUs, why not buy AMD, Qualcomm, Broadcom, or TI? (I know their ARM acquisition was blocked before the AI boom.) Their revenue is too low to support their valuation; shouldn't they use this massive value to buy up companies and grow their revenue?
- It's my opinion that Nvidia does good engineering at the nanometer scale, but it gets worse the larger the scale gets. They do a worse job of integrating the same ASPEED BMC that (almost) everyone uses than Supermicro does, and the version of Aptio they tend to ship exposes almost nothing in setup. For the price of a DGX, I expect far better. (Insert obligatory bezel grumble here.)
- Don't they already sell servers? https://www.nvidia.com/en-us/data-center/dgx-platform/
- This video shows the systems being built and shipped with cooling, cabling, etc.
It’s pretty mind-blowing what this process shows, from the manipulation of atoms and electrons all the way up to these clusters. Particularly mind-blowing for me, as someone who has cable-management issues with a ten-port router.
https://youtu.be/1la6fMl7xNA?si=eWTVHeGThNgFKMVG
- I always wondered why a bunch of different companies make identical graphics cards then complain that it's a horrible business and Nvidia is screwing them. I wondered even more strongly when I saw a dozen flavors of the NVL72 rack. If the rack is so complex and difficult to manufacture, why have N companies do redundant work?
- They're not stopping at servers. They want to sell datacenters.
- Competing with your customers can be a risky strategy for a platform provider. If the platform abandons the neutral stance its customers will be a lot more open to alternatives.
- Servers? I thought they left even racks behind, they're now selling these "AI factories".
- Aren’t they already supply constrained? Seems like this would be counterproductive in further limiting supply vs a strategy of commoditizing your complements.
This seems closer to PR designed to boost share price rather than a cogent strategy.
- This would basically start to turn cloud providers into colo facilities that just host these servers.
Makes sense longer term for Nvidia to build this, but it adds to the long-term bear case for AWS et al. on AI infrastructure.
- In a sense, they already do, since they're heavily invested in CoreWeave. For those unfamiliar, CoreWeave was a crypto company that pivoted to building out data centers.
- Can anyone comment on wafer-scale systems, i.e., multiple equivalent chips fabricated on a single wafer?
That seems like where things are heading.
- What software will those Nvidia servers run?
Are they creating their own software stack or working with one or more partners?
- How does their failed attempt to acquire ARM impact this?
- Soon OpenAI will make its own chips and Nvidia its own foundational models
- Soon Nvidia will sell AI itself instead of servers.
- Why would they chase a lower margin business area? Are they out of ideas?
- “Nobody gets fired for choosing NVIDIA.”
- Didn't they watch Silicon Valley to learn that lesson? Don't sell the box.
- We’re not far from Nvidia exclusively bundling ChatGPT. It’s a classic playbook from Microsoft.