Employment in the 2020-2022 range was highly unusual due to COVID stimulus and the resulting unprecedented hiring. Tech companies were hiring anyone they could, and after a while juniors were the only way to feed the insatiable demand for more headcount.
Comparisons to that period that don't take this into account are going to be misleading.
This period was also a strange time for remote work. I've been remote since before then, but COVID-era WFH felt like a turning point when bad behavior during remote work became normalized. That's when we started seeing remote hires trying to work two jobs (giving us half an effort and not getting their work done), and "quiet quitting" rose as a news-media meme because everyone assumed that even if they got fired for not working, they could just walk into a new job. We also weren't doing juniors any favors by hiring them in high numbers without a sufficient ratio of seniors to mentor and lead them.
That also coincided with the rise of GitHub Copilot and ChatGPT. These tools were not great at the time, but if you were a junior who was over-hired into a company that didn’t have capacity to mentor you and you were working remote in the age when Reddit was promoting quiet quitting and overemployment on your feed every day, banging out PRs with GitHub Copilot for a couple hours a day and then going about your life for a $135K salary right out of college felt like you just hit the jackpot of historical confluences for work-life balance.
I saw this exact story play out at multiple companies that got burned out on the idea of hiring juniors because of the risk. Combine that with the rapid improvement of the LLM tools, and the thinking quickly became: just hire seniors and treat the LLMs as the juniors, rather than paying another salary for someone to pilot Claude Code around. The seniors had to review the Claude Code output anyway, so why not cut out the middleman?
Then add the economic downturn and the chaos of whatever this administration is doing this month and now there are so many qualified seniors on the market that hiring juniors is hard to justify. This is the part that would have happened with or without AI.
All things considered, being down only 20% from the 2022 peak seems not that bad.
There is no reason people have to tolerate a technology that is destructive to society, any more than they have to tolerate companies selling fentanyl at 7-Eleven.
Times change, the ladders you and I climbed to success may not be around in the same forms for our children. That's not new. But will there be any ladders to climb if the bottom rungs are all gone?
Nathan Witkin has also posted a response to my article here https://open.substack.com/pub/arachnemag/p/the-jevons-parado...
> So what’s the mechanism at play? AI replaces codified knowledge
Many job postings peaked in 2022 due to the pandemic. The original paper tries to account for this but falls short in my opinion.
The original paper said[1]:
> One possibility is that our results are explained by a general slowdown in technology hiring from 2022 to 2023 as firms recovered from the COVID-19 Pandemic...
> Figure A12 shows employment changes by age and exposure quintile after excluding computer occupations...
> Figure A13 shows results when excluding firms in information technology or computer systems design...
> ... These results indicate that our findings are not specific to technology roles.
Excluding computer and IT jobs is not enough, in my opinion. Look at all these other occupations that had peak hiring in 2022 (a quick way to check the peaks yourself is sketched after the links):
Nursing jobs in the US: https://fred.stlouisfed.org/series/IHLIDXUSTPNURS
Sales jobs in the US: https://fred.stlouisfed.org/series/IHLIDXUSTPSALE
Scientific research & development jobs in the US: https://fred.stlouisfed.org/series/IHLIDXUSTPSCREDE
Baking & finance jobs in the US: https://fred.stlouisfed.org/series/IHLIDXUSTPBAFI
[1] https://digitaleconomy.stanford.edu/app/uploads/2025/12/Cana...
I agree with this sentiment, but history shows that humans are absolutely terrible at planning for revolutionary systemic changes like this. Our current inability to address climate change in any systematic way is just the latest example. It seems to me that if and when human labor becomes superfluous it will most likely result in a lot of chaos before a new system emerges.
It doesn't follow that all software engineers are excellent at other work; please don't take that from my quip. But I could see the pattern, over time, being large enough to identify.
Since software engineering jobs have historically been very well paid, it is plausible that former engineers working for less money would have this displacing effect.
It's all icky no matter what, I think. Maybe someone else can tell me why I'm wrong and cheer me up.
The reality is that higher interest rates hit software particularly hard because less venture capital is being thrown at traditional software development. When money is tight, new hiring is cut to push off layoffs, and when layoffs happen, experienced potential hires become cheap, displacing inexperienced entry-level hires. No one is telling their boss "reduce my budget, the AI is so good I don't need these people anymore"; they are being told by their boss "find a way to make do with 3 fewer people." We should expect overtaxed workers to try to find ways to use AI to take up some of this slack, and higher-ups may spin a tale of increased efficiency, but the fact is AI adoption is a symptom, not a cause. The hiring decrease and layoffs happened at plenty of places that have failed to adopt AI as well.
Given that the current situation is specific to these circumstances, it does not follow that software portends the fate of all white-collar work. That said, we can certainly expect AI to improve, and attempts will be made to replicate and improve upon any genuine efficiency gains from the present experiment. But while AI may make certain tasks easier, that will lead to a reorganization of the labor force more than its disappearance. When the mechanization of agriculture reduced the labor required to produce enough food to sustain people, people stopped being farmers. It was a major societal shift, and there were certainly problems, but we don't have 90% of our population made up of unemployed farmers who can't afford to buy food, nor even a large percentage of the population who want to farm but are forced into much less desirable jobs.
Comparative advantage will guarantee people are still doing something. There will always be tasks that would benefit from human input, and there will always be more such tasks. We may not currently place much value on these tasks, but by virtue of AI doing the other tasks, the relative value of fully automated tasks will decrease and the tasks that require human labor will become more highly valued. In a world where the best-paid people are ditch diggers, and ditch diggers can afford yachts because yacht production is fully automated, who cares what the wage of the ditch digger actually is?
Wealth concentration is a concern, but not because it will make it impossible for the vast majority to live a decent life. Instead the economic lives (and likely socio-political lives as well) of these two groups will simply diverge. This is extremely concerning from a standpoint of justice, but it's really orthogonal to AI. We've had such aristocracies many times before - they arise because of a failure of social institutions, not technology. We've been on the path towards them long before AI came along, and there is no compelling evidence that AI has accelerated the process. As far as economics is concerned though, your quality of life will continue to improve, even if some billionaire's improves faster.
https://www.robpanico.com/articles/display/the-answer-isnt-m...
This confounds AI-exposed white-collar occupations with occupations that were overrepresented in extended remote work.
I am on multiple boards, and that was a major factor disincentivizing new-grad hiring in the US, because a new-grad salary in a white-collar profession in the US is a mid-career salary in the rest of the world.
AI is used as an excuse, but when polled, even most executives expect the number of employees hired, at least in software-adjacent roles, to increase.
I cannot justify hiring a mediocre new grad in Seattle for $120k who will end up using Claude Code anyhow when I can hire an early-career employee doing something similar in Romania or India for around $20k.
The reality is a large portion of new grads and mid-career types who started their careers after 2020 are too mediocre for the TC paid.
---
Edit: pulling a comment of mine from downthread
> Why are then so many US developers still employed
Because, unlike the HN hivemind, a large portion of experienced developers in the US have found ways to realistically adopt new technologies where they are relevant.
Reflexively being an AI fanatic or a Luddite is stupid, but a SWE who is able to recognize and explain the value of these tools and their limits is extremely valuable.
I can justify paying $300-400k TCs and 100% remote work if you are not a code monkey. This means being able to architect, manage upwards, do basic design and program management, hop onto customer calls, and keep upskilling on top of writing, testing, and reviewing code.
We are not hiring SWEs just to push code. We hire SWEs to translate business requirements into software.
A developer who has a mindset like that is worth their weight in gold, and there are still plenty of these kinds of experienced developers in the US.
Writing code by hand is not going to be the default mode going forward. You either do the majority of your work controlling autonomous agents and reviewing their work or you get surpassed by all of your colleagues.
Are you going to be the farmer who refuses to buy a plow?
I also do not have sympathy for those who refuse to adapt. These people hold back organizations by appealing to tradition and resisting any form of change.