> Some customers already have datasets on the order of a petabyte, or 2^50 bytes. Thus the 64-bit capacity limit of 2^64 bytes is only 14 doublings away. Moore's Law for storage predicts that capacity will continue to double every 9-12 months, which means we'll start to hit the 64-bit limit in about a decade. Storage systems tend to live for several decades, so it would be foolish to create a new one without anticipating the needs that will surely arise within its projected lifetime.
* https://web.archive.org/web/20061112032835/http://blogs.sun....
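A quick sanity check of the "14 doublings" and "about a decade" claims, in Python. The only inputs are figures taken from the quote itself (petabyte-scale datasets today, a doubling every 9-12 months):

```python
from math import log2

# Figures from the quote: ~1 PiB datasets today, 64-bit byte limit.
current_bytes = 2**50
limit_bytes = 2**64

doublings = log2(limit_bytes / current_bytes)
print(doublings)  # 14.0

# A doubling every 9-12 months puts the limit roughly 10-14 years out.
for months in (9, 12):
    print(f"{months} months/doubling -> ~{doublings * months / 12:.1f} years")
```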
And some math on what that means 'physically':
> Although we'd all like Moore's Law to continue forever, quantum mechanics imposes some fundamental limits on the computation rate and information capacity of any physical device. In particular, it has been shown that 1 kilogram of matter confined to 1 liter of space can perform at most 10^51 operations per second on at most 10^31 bits of information [see Seth Lloyd, "Ultimate physical limits to computation." Nature 406, 1047-1054 (2000)]. A fully-populated 128-bit storage pool would contain 2^128 blocks = 2^137 bytes = 2^140 bits; therefore the minimum mass required to hold the bits would be (2^140 bits) / (10^31 bits/kg) = 136 billion kg.
> To operate at the 10^31 bits/kg limit, however, the entire mass of the computer must be in the form of pure energy. By E=mc^2, the rest energy of 136 billion kg is 1.2x10^28 J. The mass of the oceans is about 1.4x10^21 kg. It takes about 4,000 J to raise the temperature of 1 kg of water by 1 degree Celsius, and thus about 400,000 J to heat 1 kg of water from freezing to boiling. The latent heat of vaporization adds another 2 million J/kg. Thus the energy required to boil the oceans is about 2.4x10^6 J/kg × 1.4x10^21 kg = 3.4x10^27 J. Thus, fully populating a 128-bit storage pool would, literally, require more energy than boiling the oceans.
* Ibid.
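Re-running those back-of-the-envelope numbers (every constant below comes from the quote itself; the ~140 billion kg it prints differs from the quoted 136 billion kg only by rounding):

```python
# Mass needed to hold a fully populated 128-bit pool at Lloyd's density bound.
bits = 2**140            # 2^128 blocks = 2^137 bytes = 2^140 bits
bits_per_kg = 1e31       # Lloyd's ultimate limit, from the quote

mass_kg = bits / bits_per_kg
print(f"mass: {mass_kg:.3g} kg")              # ~1.4e11 kg

# Rest energy of that mass via E = mc^2.
c = 3e8                                       # m/s
print(f"rest energy: {mass_kg * c**2:.3g} J") # ~1.2e28 J

# Energy to boil the oceans: heating 0 -> 100 °C plus latent heat of vaporization.
joules_per_kg = 100 * 4000 + 2e6              # = 2.4e6 J/kg
ocean_mass_kg = 1.4e21
print(f"boil the oceans: {joules_per_kg * ocean_mass_kg:.3g} J")  # ~3.4e27 J
```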
Secondly, I recently tried to work out which year's Top500 list[1] I could reasonably match for around US$5000. It's surprisingly difficult to work out, mostly because the list uses 64-bit FLOPS and few other systems quote that number.
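A rough sketch of that comparison in Python. The Rmax figures are approximate #1 entries from past Top500 lists; the budget and the FP64-per-dollar number are placeholders to fill in for whatever hardware you're actually pricing:

```python
# Which year's Top500 #1 machine would my budget match in FP64 FLOPS?
TOP500_NO1_RMAX = {       # year -> approximate Rmax in 64-bit FLOPS
    1993: 59.7e9,         # CM-5
    1997: 1.068e12,       # ASCI Red
    2002: 35.86e12,       # Earth Simulator
    2008: 1.026e15,       # Roadrunner
    2011: 10.51e15,       # K computer
}

budget_usd = 5000
fp64_flops_per_usd = 2e8  # placeholder guess; real hardware rarely quotes FP64

my_flops = budget_usd * fp64_flops_per_usd
matched = [y for y, rmax in sorted(TOP500_NO1_RMAX.items()) if my_flops >= rmax]
print(f"~{my_flops:.3g} FP64 FLOPS, roughly the #1 machine of "
      f"{matched[-1] if matched else 'no year yet'}")
```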
> One million Claudes. To be able to search every book in history, solve math problems, write novels, read every comment, watch every reel, iterate over and over on a piece of code until it’s perfect – spend a human year in 10 minutes. 50,000 people working for you, all aligned with you, all answering as one.
We are already near the limits of what we can do by throwing compute at Claude without improving the underlying models, and it is not clear how we get big improvements to those models at this point. Surely geohot knows this, so I am surprised he thinks that "one million Claudes" would be able to e.g. write a better novel than one hundred Claudes, or even one Claude.
I'd say there's a bit of a flaw in the "read 50,000 books" part though. An LLM reading that much doesn't really get you 50k books of value as a person. You're the bottleneck, not the FLOPS.
I bring this up to present an alternate view of the future that a lot of thought has gone into: the Matrioshka Brain. This is basically a Dyson Swarm but the entire thing operates as one giant computer. Some of the heat from inner layers is captured by outer layers for greater efficiency. That's the Matrioshka part.
How much computing power would this be?
It's hard to say, but estimates range from 10^40 to 10^50 FLOPS (e.g. [1]). At 10^45 FLOPS, that would give each of the roughly 8 billion people on Earth access to roughly 100 trillion zettaFLOPS.
[1]: https://www.reddit.com/r/IsaacArthur/comments/1nzbhxj/matrio...
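Checking the per-person figure, taking the 10^45 FLOPS estimate above and assuming roughly 8 billion people:

```python
total_flops = 1e45        # mid-range Matrioshka Brain estimate from above
people = 8e9              # assumed world population
zetta = 1e21

per_person = total_flops / people
print(f"{per_person:.3g} FLOPS each = {per_person / zetta:.3g} zettaFLOPS each")
# ~1.25e35 FLOPS each, i.e. on the order of 100 trillion zettaFLOPS
```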
This is all just such a crazy coincidence.
Everything is coming together so quickly.