- Studio Ghibli-style graphics
- the infamous em-dashes and bullet points
- customer service (just try to use Klarna's "support" these days...)
- Oracle's share price ;) - imagine being one of the world's most solid and unassailable tech companies and losing it to your CEO's crazy commitment to LLMs...
- The internet content - we now triple-check every internet source we don't know to the core...
- And now also the chips?
Where does it stop? When we decide to drop all of this technology altogether?
An industry mandate should have been 16GB of RAM for PCs and 8GB for mobiles years ago, but instead it is as if the computing/IT industry is regressing.
New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6 chips and UFS 2.2 storage). Meanwhile, features that used to be offered in budget phones (e.g., wireless charging, NFC, UFS 3.1) have silently been moved to the premium segment.
Meanwhile, the OSes and software are becoming ever more complex, bloated, unstable (bugs) and insecure (security loopholes ready to be exploited).
It is as if the industry has decided to focus on AI and nothing else.
And this will be a huge setback for humanity, especially for students and the scientific community.
Allocating a very large share of advanced memory production (especially HBM and high-end DRAM, which are critical for almost all modern technology and even many non-tech products like household appliances) to a small number of U.S.-centric AI players risks distorting the global market and limiting availability for other industries.
Even within Samsung itself, the Mobile eXperience (MX) Business (smartphones) is not guaranteed preferential access to memory from Samsung’s Device Solutions (DS) Division, which includes the Memory Business. If internal customers are forced to source DRAM elsewhere due to pricing or capacity constraints, this could eventually become economically problematic for a country that relies very heavily on semiconductor and technology exports.
A mid-range gaming PC can display impressively realistic graphics at high resolutions and frame rates while also being useful for a variety of other computationally intensive tasks like video encoding, compiling large code bases, etc. Or it can be used to host deeply mediocre local LLMs.
The actual frontier models from companies like Anthropic or OpenAI require vastly more expensive computing resources, resources that could otherwise be used for potentially more useful computation that isn't so inefficient. Think of all the computing power going into frontier models applied instead to weather forecasting or cancer research or whatever.
Of course it's not either/or, but as this article and similar ones point out, chips and other computing resources aren't infinite, and AI, for now at least, has a seemingly insatiable appetite and enough dollars to starve other uses.
Looks like the frame.work desktop with the 128GB Ryzen is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.
Are Snapdragon chips the same way?
There was a time when Apple was hesitant to add more RAM to its iPhones and app developers had to work hard to make their apps efficient. The last few years have seen Apple jump from 6GB to 12GB so easily for its 'AI', while I consistently see the quality of apps on the App Store deteriorating. iOS 26 and macOS 26 are so aggressive about memory swapping that opening Settings can take time on devices with 6GB of RAM (absurd). I wonder what else they have added that makes apps get purged so frequently. A 6GB iPhone and an 8GB M1 felt incredibly fast for a couple of years. Now, apparently, they are slow, as if they were really old.
Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th-gen PC for years. Windows 11 is very slow and Chrome is a bad experience. Firefox doesn't make it better.
I also find that GNOME and the COSMIC DE are not exactly great with memory. A bare minimum desktop still takes up 1.5-1.6GB of RAM on a 1080p display, and with some tabs open, a terminal and VS Code (again Electron) I easily hit 8GB. Sway is better in this regard. I find Alacritty, Sway and Firefox together make for a good experience.
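If you want to check that idle figure yourself, here is a minimal sketch (Linux only, standard library only; it just parses /proc/meminfo, nothing vendor-specific) that reports how much RAM the desktop plus background services are using before you open anything:

```python
#!/usr/bin/env python3
"""Minimal sketch: report how much RAM an idle desktop is actually using,
by parsing /proc/meminfo (Linux only, standard library only)."""

def meminfo():
    values = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            values[key] = int(rest.split()[0])  # values are reported in kiB
    return values

m = meminfo()
total = m["MemTotal"] / 2**20        # kiB -> GiB
available = m["MemAvailable"] / 2**20
print(f"Total:     {total:.1f} GiB")
print(f"Available: {available:.1f} GiB")
print(f"In use:    {total - available:.1f} GiB  (what the DE + background services cost at idle)")
```

MemAvailable is the kernel's own estimate of reclaimable memory, so "in use" here is a fairer baseline than simply subtracting MemFree.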
I wonder where we are heading with personal computer software. The processors have gotten really fast, and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year, we are probably investing in the wrong people.
- Right now A LOT of PCs are becoming out of date because Windows 11 wants new hardware that's missing in older PCs.
- At the same time, smartphones were starting to use and need more memory.
- Modern cars (mostly electric) have much more electronics inside than older ones... they're now distributed systems with several CPUs working together along a bus.
- The cloud needs to upgrade, and newer servers have much more memory than older ones in almost the same footprint (which means you need to lease less datacenter space).
- And GPUs for AI are in demand and need RAM.
But only AI gets the blame, although we're living in a perfect storm of RAM demand.
I know this is not always true, but in this case the Crucial folks say the margins on the end-user side are too low and the demand they have is for AI.
I suppose they do not intend to bring up a new AI-focused unit because it is not worth it, or because they believe the hype might be gone before they are done. But what intrigues me is why they would allow competitors to step up in a segment they dominate. They could raise prices for consumers if they are not worried about competition...
There is a whole "not exactly AI" industry labeled as AI that has received a Ton of money, with a capital T. Is that what they are going for?
Can this not be an opportunity for new entrants to start serving the other market segments?
How hard is it to start up and manufacture memory for embedded systems in cars, or for PCs?
Prices are already through the roof...
https://www.tomsguide.com/news/live/ram-price-crisis-updates
The next stage is paving everything with solar panels.
"AI companies are spending billions of dollars constructing data centers at warp speed around the world. It's the reason why Gogia says the demand for these chips isn't just a cyclical blip."
That makes no sense, except in the very short term. The datacentre building going on is clearly cyclical: the start of a cycle, but still a cycle. There are finite requirements, finite money and finite tolerance for losses.
Lead times to ramp up RAM production are long, but also finite.
This will correct, again. Hopefully in the meantime we learn to do more with less, which has always been an engine of innovation.
Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.
It's insane.
Maybe the market pricing people out is accidentally doing what regulation couldn't: concentrating AI where there's at least some oversight and accountability. Not sure if that's good or bad, to be honest.
So writing optimized software was niche, overshadowed by the huge gains in hardware. People and corporations didn't care; they preferred fast feature delivery. Even with optimization techniques like multi-tenant servers, we ended up with heavy containers wasting RAM and resources. And most apps switched to web-based frameworks like Electron, where each one ships its own huge web browser. Most people didn't care.
I hope this shortage has a silver lining: untangling the mess and refactoring the sea of bloat. But I know it's more likely that some trick will be found to just patch things up a bit and only reduce the pain to a new equilibrium level, for example something idiotic like Electron sharing resources across multiple apps while taking on huge security risks. Corporations love to play up false dichotomies to save pennies.
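To put a rough number on the "every app ships its own browser" point above, here is a small sketch (Linux only, assumes /proc is readable) that tallies resident memory by process name. RSS counts shared pages once per process, so treat the totals as an upper bound rather than an exact figure:

```python
#!/usr/bin/env python3
"""Rough per-process-name RSS tally (Linux only, standard library only).
Shared pages are counted once per process, so totals are an upper bound."""
import os
from collections import defaultdict

PAGE = os.sysconf("SC_PAGE_SIZE")          # bytes per memory page
totals = defaultdict(int)

for pid in filter(str.isdigit, os.listdir("/proc")):
    try:
        with open(f"/proc/{pid}/comm") as f:
            name = f.read().strip()
        with open(f"/proc/{pid}/statm") as f:
            resident_pages = int(f.read().split()[1])   # second field = resident pages
        totals[name] += resident_pages * PAGE
    except OSError:
        continue  # process exited or is inaccessible; skip it

# Print the 15 heaviest process names; Electron apps tend to dominate the list.
for name, rss in sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:15]:
    print(f"{rss / 2**20:9.1f} MiB  {name}")
```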
AI embodies everything wrong with our modern gilded age era of capitalism.
A 16 GiB M4 Mac Mini is $400 right now. That covers any essential use case, which means this is mostly hitting hobbyists or niche users.
Maybe this is the free market working as intended -- did you know that RAM is actually a luxury item, like a Rolls-Royce, and most plebes should just make do with 4GB machines, because that is the optimum solution!
Whether you like it or not, AI right now is mostly:
- high electricity prices
- crazy computer part prices
- the phasing out of a lot of formerly high-paying jobs

and the benefits are mostly:
- slop and ChatGPT
Unless OpenAI and co. produce the machine god, which genuinely is possible. If most people's interactions with AI are its negative externalities, they'll quickly be wondering whether ChatGPT is worth the cost.
The goal seems to be to squash the proliferation of open source LLMs and prevent individuals from running private, uncensored models at home. It is an effective way to kill any possible underdog or startup competition by making the "barrier to entry" (the compute) a rented privilege rather than a private resource. The partnership with Palantir seems to point directly to this, especially considering the ideologies of Thiel and Karp.
They are building a world where intelligence is centralized and monitored, and local sovereignty over AI is treated as a liability to be phased out.
Presumably the boom times are the main reason investment goes into it, so that years later consumers can buy it cheap.