This is the lump of labor fallacy: the belief that there is a fixed amount of economically valuable work, which automation and capital accumulation can only eliminate rather than transform.
Middle-class status anxiety manifesting as rhetoric about neofeudalism.
It offers no constructive alternative, and the author (yes, I know who he is) seems to have no issue with Google hosting his email.
It's hard to take this too seriously (even if there is some legitimate worry here).
The window to fix this is closing.
We can see this logic reflected at times in business history. Ford paid workers roughly double the prevailing daily wage so they could afford the cars they built, and Costco pays employees about 50% more than Walmart. They're not doing these things out of the goodness of their hearts but out of greed, to increase long-term profits.
It's difficult to see a path from here that doesn't lead to population collapse and some form of dystopia.
https://www.noahpinion.blog/p/plentiful-high-paying-jobs-in-...
Also, my p(doom) is 1.0-epsilon under the status quo without AGI/ASI, due to old age and disease. Under some assumptions, self-interest says that I may as well roll the dice.
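A back-of-the-envelope version of that bet, with purely made-up numbers (every probability below is an illustrative assumption, not an estimate anyone should defend):

```python
# Toy expected-value comparison behind the "roll the dice" argument.
# Every number here is an illustrative assumption, not a real estimate.

p_death_status_quo = 1.0 - 1e-6    # ~certain death from aging/disease without AGI/ASI

p_doom_agi = 0.3                   # assumed chance AGI/ASI goes badly for me personally
p_cure_given_benign = 0.5          # assumed chance a benign AGI solves aging/disease in time
p_death_with_agi = p_doom_agi + (1 - p_doom_agi) * (1 - p_cure_given_benign)

print(f"P(death) without AGI: {p_death_status_quo:.3f}")   # ~1.000
print(f"P(death) with AGI:    {p_death_with_agi:.3f}")     # 0.650 under these assumptions
```

Under those (made-up) assumptions the gamble strictly lowers personal P(death), which is all the self-interest argument needs.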
I find it quite interesting, and somewhat disturbing, that we've so quickly come to the point of seeing the AI power drivers as openly adversarial to people and deeply entangled with equally adversarial government forces.
But are we actually (and realistically) talking about technofeudalism in the next couple of decades?
There's a lot of fear around what will happen with AI, not so much of extinction but rather of two things: fear of losing income and, arguably more importantly, fear of losing identity.
People are often invested in what they do to the point that it's who they are. Having that replaced or eliminated might be a bigger psychological threat than the lack of income, at least to those of us fortunate enough to be well off right now.
However, these threats are outweighed by the benefits that AI can eventually bring. Medical advances, power generation, manufacturing capability. Our systems for running society have a lot of problems, economically, politically, epistemologically. These can also be improved with AI assistance.
The real problem is the transition: it's such a huge shift, and it will happen all at once to everyone, uprooting our idea of the world and our place in it.
What we need is to embrace AI and find a way to make sure that the transition and benefits of AI are distributed instead of concentrated.
For me this looks like the following: companies must commit to retaining some minimum number of employees in every currently existing function, determined in proportion to their profits (a rough sketch of the arithmetic follows below). This sets a floor on the job losses that can come later when AI really comes on stream.
The justification for this is threefold. Firstly, it's a safety mechanism: it ensures that regardless of the capabilities of an AI system, there are multiple humans working with it to verify its results. If they aren't verifying diligently, then they're not doing their jobs.
Secondly, jobs aren’t just a way of making income, they’re wrapped up in identity and meaning for at least some people, and this helps to maintain that existing identity structure across a meaningful cross section of society.
Third, it keeps the economy running and money circulating. You can't have a market economy without consumers. UBI is one component of this too, but this is more direct, more useful, and more meaningful.
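To make the "proportional to profit" floor concrete, here is a minimal sketch; the per-job profit divisor is entirely a placeholder of mine, not part of the proposal:

```python
# Hypothetical headcount floor proportional to profit; the divisor is an
# arbitrary placeholder, not a proposed policy number.

def min_headcount_floor(annual_profit_usd: float,
                        usd_of_profit_per_retained_job: float = 2_000_000) -> int:
    """Minimum number of employees a firm must retain, scaled to its profit."""
    return max(1, round(annual_profit_usd / usd_of_profit_per_retained_job))

# Example: a firm booking $10B in annual profit would have to retain at least
# 5,000 employees spread across its currently existing functions.
print(min_headcount_floor(10_000_000_000))  # -> 5000
```

The real design work is in choosing that divisor and deciding how the floor is split across functions; the sketch only shows the shape of the rule.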
A: participate and have a chance to not be part of a perpetual underclass
B: for moral reasons, don't participate, be part of the underclass
I kinda would have hoped for
C: <something> to stop this from happening
Otherwise it's the worst sales-pitch ever
Such situations usually correct themselves violently.
I left the UK for this reason and live very comfortably on around £15k. I rent a city centre flat with 600 megabit fibre and really good amenities. I have time and space to build what I want.
"Give me the place to stand, and I shall move the earth." - Archimedes.
Unfortunately in the UK it's really hard to survive, let alone actually have time to do anything meaningful. I don't know if it's engineered by big tech/property/finance or some other demon. Maybe the monster in qntm's "There Is No Antimemetics Division" is allegorical.
This is echoing a term coined by Varoufakis about an ever larger share of wealth being held by a smaller and smaller group, not a return to a literal peasant existence. It's not feudalism, it's 'neo-feudalism'.
The argument that labour can move is true, except where it can't. Look at entire towns of miners made redundant with no replacement for their jobs. Sure, you might say they can move halfway across the country to clean toilets, but they have skills, a family, and a house where they already are.
Where the feudal analogy really rings true is the growing push back toward rent extraction for everything. Subscriptions for everything, including homes, are becoming more and more normal. Are we really okay with a world in this form?
It's not always the case that both extremes of an opinion are wrong. But in this specific case, they certainly are. Neither the "LLMs will usher in the post-scarcity economy" view nor the "LLMs will doom everyone to unemployment" view (and the two are remarkably closer to each other than they appear at first glance) is correct. LLMs are a useful tool with inherent, fundamental limitations that mean they will never be able to do everything. AGI is currently a pipe dream, and LLMs are not going to be the technology that achieves it.
The nub of it is something I've thought about before. My contingency plan for AI turning the industry I work in upside down is to make hay while the sun shines before that point: have enough saved or invested for a (lean) retirement (depending on how far away that point is).
But what if AI turns every industry upside down? Will there be enough overall economic activity to actually invest in at all? Then we're all poor regardless of how much we've individually saved or what kind of social safety net exists, simply because there isn't enough economic activity to fund it.
That is, at the moment and I hope forever, a very remote possibility, for a whole host of technological and economic reasons. But if it did happen in the next 20-30 years...
- "In the future, when labor is fully marginalized..." Hasn't happened in the history of the world, not going to happen in the future either. Some forms of labor were replaced by machines, which then gave rise to new types of jobs, such as building and maintaining the machines. The human cost cannot be neglected, because many people do find it difficult to retrain to other jobs. But on the whole, there are more jobs and higher-paying jobs now than there were a hundred years ago. Higher-paying not just in absolute financial terms, but also in terms of what can be purchased with that money. The richest man of the 19th century couldn't buy an air-conditioned house, not with all his millions.
- "GPT$$$ is surely smart enough to separate you from whatever you have..." Assumes an unbounded growth curve in the "smarts" of AI, and worse than that, assumes that that AI will take the form of an LLM. This is laughable. LLMs will not ever achieve AGI; they are simply not capable of it. If AGI is achievable at all (which I doubt), it will come from one of the currently-neglected avenues of research whose funding is currently being neglected because LLMs are sucking all the metaphorical oxygen out of the room.
- "the neofeudal world": assumes that all companies are like that. Yes, there are many companies that suck to work for because they treat their workers as mere cogs in a machine, instead of as human beings. But not all companies operate that way. If you are being treated as a cog in a machine, start looking for opportunities to jump ship to a better working environment. I've worked in both types of places, and I would be willing to take a big pay cut to work for company that didn't treat me as a cog. They're out there, but it might take some looking. Tip: ask employees what it's like working for the comapny, don't just take the interviewers' word at face value.
If one feels morally compelled to pay with their own income to stop these parasites, hats off to them. It is nasty that people are being put in this position. They need support from society. Society needs information and debate about real issues, which requires being free from the barrage of falsehoods and yellow journalism. And possibly it would help to have a Roosevelt, but culture is the biggest hindrance to change.
- Yuval Noah Harari, Ideas for the Future
This has been suggested a bunch of times in the comments of HN as well as on other social media, but what exactly would that look like?
Especially as this seems like the ultimate prisoner's dilemma, where the winning move is always to be the first to make a deal, even if we accept the premise of AI turning us all into an underclass (a prediction often made about revolutionary technology, I might add).
How can
> capital is the only force,
be squared with
> A pile of money will buy you nothing in the neofeudal world.
and
> didn’t operate on capitalist principles
? Capital being the driving force is the very definition of capitalism. It's even in its name!
There are many things that can happen before we're all enslaved by AGI. It might well not happen. We might enter a war, or a cycle of civil wars, that changes society in ways we can't predict. Or, most probably, some jobs will disappear, others will become available, and AI will be a commodity, just as machines became after the Industrial Revolution. It's extremely hard to predict the future. Telling people to "stop participating" (how? By quitting their jobs? By fighting the class war?) is a bit irresponsible.
Yes, the transition can be painful and some people will lose out and face hard career changes.
But overall, the multiplicative power of investment only increases, helping to make everything cheaper, and everyone richer.
People focus too much on their own small slice of experience, like Claude Code replacing CRUD developers, without appreciating that the LLM revolution (and broader AI like AlphaFold) also means PhD students who no longer lose time building tooling, interns who might have spent their time on tooling but can now use LLMs to learn faster and actually contribute to their fields, and disadvantaged students who can learn more directly and in a personal way, without it depending on their physical location.
All of this means you get more experimentation, more ideas, and more successes.
1) Instead of LLMs, imagine large models trained end-to-end on ALL online content and the impact that has on public opinion and discourse. What about when everything is an algorithmic feed controlled by such a model under the control of the elite? You might be resistant (but probably aren't), but in aggregate this will be effective mind control over society.
2) Money directs human effort. Every quantum of bargaining power the working and middle classes lose by being less needed is a reduction in our ability to have a say in whom society should serve and how.
3) Don't forget regulatory capture is a thing. Not just a thing but happening as we speak. Are you still optimistic?
4) Tech is already addicting and ads are already everywhere even without technology that has a theory of mind.
5) Do not forget that humans are social creatures, power over others is not just an accidental byproduct of wealth. Once you're unnecessary for labor, what's left? Fulfilling sexual/emotional/social whims of the wealthy elite? Hunger games? Being a pet in a billionaire's human zoo city so he can brag about his contributions to humanity?
The system is flawed for different reasons. Tolerance for high vertical integration and oligopolies has seriously damaged the efficiency of the market and limited people's ability to disrupt. Capital concentration has created a new form of aristocracy. They have successfully lobbied to significantly weaken the mechanisms supposed to spread this money around, notably the inheritance tax. The Supreme Court has significantly altered how democracy functions by lifting limits on funding and giving far too much power to the richest.
The last forty years have basically torn down all the foundations Tocqueville saw as fundamental to the success of the young USA. People should fight to get things back on track.
AI is mostly incidental to that. It doesn't matter if AI temporarily concentrates some wealth if the mechanisms for it to then be spread again are in place.
Capital leads to class difference, often immense class difference, which is not a claim against our society being primarily capitalistic but in favor of it. If you took away all the food grown in America, the clothes woven in Bangladesh, and the laptops manufactured in China, there would be no Amazon, no Google, no Microsoft, no "technofeudalism." The economic base is still defined by the exchange of commodities; it's just that the US does not produce many industrial goods anymore, so the US economy is mostly a service economy. Chinese citizens do not experience their lifeworld in terms of service industries; they are surrounded by mass markets, complex factories, and very material evidence of mechanization, which we in the West often do not see directly, only the end product. So to many Americans it feels like they live in a magical society where they click some keys on their laptop and food and clothes and whatever they need show up on their doorstep. But there are real workers out there tooling all the machines and developing all the architecture to make those things appear, reducing the basic struggles of life to make time for greater and more advanced forms of social organization beyond the need to survive.
This is not what peasants had; for them, despite having a relatively complex existence, a bad season could and often would kill their entire family. Or a raiding band would take all their food, or they'd die of the plague... life was far more tenuous, and the basic mode of production was not commodity production; it was growing food and animal husbandry. International trade, artisanal crafts, and capital improvements to production were nowhere near the level they reached even in the early modern period. Nothing about our contemporary society resembles this way of living.
Addendum: the claim that somehow everyone in tech could just "stop", as in consciously decide to stop creating things, is absurd. Amazon is very good at what it does, but it does not have exclusive control over the trade of all goods in the whole world. Rakuten is a major competitor in Japan, and many other companies have strongholds in their local markets. You take a Bolt in Germany, not an Uber. Chinese users can query DeepSeek, which is surely more proficient in Mandarin than ChatGPT. Even if a state uses its sovereign power to artificially control industry, it only slows the development of capital, since other states may allow their own companies and technologies to flourish, as China is doing now with its electric vehicles. If Amazon does not meet its projections, it fails, its employees all lose their jobs, and Jeff Bezos might even go bankrupt. There is a constant pressure of competition.
As a worker, your goal should not be to arbitrarily stop working; you may not enrich others, but you certainly won't be enriching yourself either. The goal should be to capture far more of the wealth that is the result of your labor. This is only possible through labor organizing, which does not permanently halt the means of production; it only takes control of them. Business continues, people still produce things and perform services, and everyone enjoys the wealth of those things and services. One should basically desire to live in a wealthy, prosperous society. This article does nothing but ask workers to go into voluntary poverty; it is reactionary and backwards.
No, gpt18Pro won't cost $1B. The buck (or the bubble) will stop somewhere.
I'd be more worried for the people making trades run 1 ms faster; they create no value for the world beyond what their own peers believe exists.
There are billions of people who neither know nor care what ChatGPT is, and while it might hit them hard, humans are more flexible and less impressionable than most people think (I mean, some people think it's the other way around, and they might be right in some situations).
Capitalism is artificial intelligence. We don't control capitalism; capitalism controls us, and through us it builds its next vessel.
He's not wrong, but he's in a weird position to say it. Also, this post isn't constructive in any way.