The number of programmers has changed so much, from ~zero in the 1940s to tens of thousands in the sixties, to, what, maybe 30 million today? And while most programmers worked at least occasionally in ASM from its invention until the 1980s, it's a very specialized role today.
I do not believe that 'roughly similar numbers of people were employed writing' ASM, C, and Python except for the instant that C outpaced ASM in the seventies and when Python outpaced ASM somewhere around the millennium.
Probably at no time were ASM, C, and Python programmers even close to similarly numerous.
Arguably, with the increase in literacy, Jevons' paradox would say we need to hire more writers. Indeed, a lot more people DO write for their job.
But it's not like we went from a handful of professional, full-time scribes to 10x as many professional, full-time scribes. Instead, the value of being just a scribe has gone down (unless you're phenomenal at it). It stops being a specialized skill unto itself. It's just a part of a lot of people's jobs, alongside other skills like knowing how to do basic arithmetic.
Coding, like writing, becomes a part of everyone's job, not a specialization unto itself. We will have more coders, but since everyone is a coder, very few are a capital-C "Coder".
Except that there are still a lot of assembly programmers.
And even more C/C++ programmers.
It's also likely that C/C++ programmers didn't become Python programmers but that people who wouldn't have otherwise been programmers became Python programmers.
> At the well-specified end, you have tasks where the inputs are clean, the desired output is clear, and success criteria are obvious: processing a standard form, writing code to a precise spec, translating a document, summarising a report. LLMs are excellent at this
Yeah
> At the ambiguous end, you have tasks where the context is messy, the right approach isn’t obvious, and success depends on knowledge that isn’t written down anywhere
Sounds like most programming
Almost all of the programming I've ever done.
> I’m arguing that the most likely outcome is something like “computers” or “the internet” rather than “the end of employment as we know it.”
Yeah
The thing that most people ignore when thinking about AI and health is that two-thirds of Americans suffer from chronic illness and there is a shortage of doctors. Could AI really do much worse than the status quo? Doctors won't be replaced, but what if we could move them up the stack of health care to actually doing the work of saving lives, rather than just looking at rising cholesterol numbers and writing scripts?
It has become difficult to grade students using anything other than in-person pen-and-paper assessments, because AI cheating is rampant and hard to detect. It is particularly bad for introductory-level courses; the simpler the material, the harder it is to make it AI-proof.
This isn't a new take. The problem is, "boring" doesn't warrant the massive bet the market has made on it, so this argument is essentially "AI is worthless" to someone significantly invested.
It's not so much that people aren't making this argument, it's that it gets tossed immediately into the "stochastic parrot" bucket.
That's just simply not true.
> If agentic systems start successfully navigating the illegible organisational context that currently requires human judgement, things like understanding unstated preferences, political sensitivities, and implicit standards, that would be significant.
How much of this is actually required for the actual work though and how much is legacy office politics "monkeys love to monkey around" nonsense?
This is nicely expressed, and could serve as a TL;DR for the article, though it's buried in the middle.
We have the most automation we've ever had, AND historically low unemployment. We have umpteen labor saving devices, AND people labor long, long hours.
> Labour Market Reallocation Actually Works
It really does, given a little time.
Yes, this is still programming.
(Though I think syntax was most definitely a binding constraint.)
Precisely because of this, some people who couldn't code for whatever reason crossed that border and can now somewhat produce something substantially more complex than what they can comprehend. This causes problems. You probably should not shit out code you don't understand, or rely on code you don't understand.
How much of the world currently runs on horribly broken, outdated, and inaccurate spreadsheets? Disposable AI slop apps are just a continuation of this paradigm. I do think that AI slop apps will become competitive with broken spreadsheets and 10,000 scattered Python notebook files, causing a massive drop in demand for various SWE, analyst, and data scientist jobs. I've seen reporting systems with individual reports numbering in the millions that were run only once (and many not at all); a huge percentage of digital work is one-off disposable items.
SWE is first and foremost a power structure for the people in charge. The first thing you do when you are in power is minimize the number of people with power you depend on. I think AI is already reducing the number of people needed to maintain and develop what are basically state-of-the-company applications for the C-suite. Sure, tech-first companies will be hit less hard, but the Fortune 500, for example, will be making continuous cuts for the next decade.