I'm not saying you should disregard today's AI advancements; some level of preparedness is a necessity. But going all in on the idea that deep learning will power us to true AGI is a gamble. We've poured billions of dollars and countless hours of research into finding a cancer cure for decades, and we still don't have one.
> Economics gives us two contradictory answers simultaneously.
> Substitution. The substitution effect says we'll need fewer programmers—machines are replacing human labor.
> Jevons’. Jevons’ paradox predicts that when something becomes cheaper, demand increases as the cheaper good is economically viable in a wider variety of cases.
The answer is a little more nuanced. Assuming the above, the economy will demand fewer programmers for the previous set of demanded programs.
However, the set of demanded programs will likely evolve. So, to over-simplify absurdly: if before we needed 10 programmers to write different Fibonacci generators, now we'll need 1 to write those and 9 to write more complicated stuff.
Additionally, the total number of people doing "programming" may go up or down.
My intuition is that the total number will increase but that the programs we write will be substantially different.
I feel this is a bit like the "don't be poor" advice (I'm being a little mean here, maybe, but not too much). Sure, focus on improving understanding and judgement. I don't think anybody really disagrees that good judgement is a valuable skill, but how do you improve it? That's a lot trickier to answer, and it's the part where most people struggle. Knowing that good judgement is valuable doesn't make it any easier to exercise.
Additional code is additional complexity; "cheap" code is cheap complexity. The decreasing cost of code is comparable to the decreasing cost of chainsaws, table saws, or high-powered lasers. If you're a power user of these things, having them cheaply available is great. If you don't know what you're doing, easier access may expose you to more risk than reward. You could accidentally create an important piece of infrastructure for your business that gives the wrong answers, or that requires expensive software engineers to come in and fix. You can end up spending more time dealing with the complexity you created than the automation ever returned in benefit.
I'm somewhat neurodivergent and got into tech precisely because it was a career where hyperfocus, compulsive systems building, a passion for finding difficult solutions, etc. were valued. Now, however, it feels like those skills are no longer valued, or are even liabilities. As the article points out, companies now want me to "embrace commodity" and focus on plumbing code. But those are precisely the areas I'm not good at.
> We're not just <A> ... We're <B>
Is this proof of LLM writing or are people subconsciously picking up LLM patterns?
It's really hard to get an LLM to assist you when you don't know the right questions to ask. If your vision and convictions about the world are not strong enough, one plausible hallucination can take you into Narnia for an entire week.
However, there are other aspects to programming that can't be quantified, subjective components that are stripped away when delegating coding to machines.
The first and most immediate effect, I think, is the loss of a sense of ownership over the code. The second, which takes a while to sink in and is at first buried by the excitement of making something work beyond your technical capability, is the loss of enjoyment.
Take both of these out and you create what I can only describe as soul-less code.
The impact of soul-less code is not obvious or measurable, but I'd argue it is quite real. We will need time to see these effects in practice. Will companies that go all-in on machine-generated code have the upper hand, or those that place more value on traditional approaches?
>We're not just experiencing technological change. We're watching the basic economics of software development transform in real time.
and I knew.
I also believe the difference between expert-level and amateur programmers will be a lot easier to spot, especially around software or system architecture.
Because software is inherently garbage-in, garbage-out, it will be much easier to tell when code wasn't written by someone with deep knowledge: an LLM's output is only as good as your prompts and the creativity behind them. A prompt from Ken Thompson won't look anything like one from your average Joe programmer. I've seen this a lot in new open-source projects lately.
That's just the usual behavior of any market. When prices go down, demand goes up.
Jevons' paradox specifically says that when prices go down by a factor of x, demand goes up by a factor greater than x, so total consumption (measured in money spent) goes up.
In the context of this post we can paraphrase (replacing money with programmers) like so:
Once AI makes it so that developing any given piece of software takes half as many programmers, the amount of software developed will more than double, leading to a net increase in the number of programmers required.
Likewise, as lighting takes less electricity and the devices last longer, people on average use more electric lighting over time.
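To make the paraphrase above concrete, here is a toy sketch of the arithmetic. All the numbers are made up purely for illustration:

```python
# Toy illustration of Jevons' paradox applied to programmers.
# Every number here is invented; only the relationships matter.

programmers_per_project_before = 10
projects_before = 100
total_before = programmers_per_project_before * projects_before  # 1000

# AI halves the labor needed per project...
programmers_per_project_after = programmers_per_project_before / 2
# ...but because software got cheaper, demand more than doubles (say 2.5x).
projects_after = projects_before * 2.5

total_after = programmers_per_project_after * projects_after  # 1250.0

# Jevons' paradox: total demand for programmers rises despite
# each project needing fewer of them.
assert total_after > total_before
```

The paradox only holds if demand grows by more than the efficiency factor; if demand merely doubled here, total programmer demand would stay flat.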
How would one even market oneself in a world where this is what is most valued?
"Every small business becomes a software company. Every individual becomes a developer. The cost of "what if we tried..." approaches zero.
Publishing was expensive in 1995, exclusive. Then it became free. Did we get less publishing? Quite the opposite. We got an explosion of content, most of it terrible, some of it revolutionary."
If only it were the same, and that simple.
One limit on wages is dollars of value produced per hour. If AI makes existing programmers more efficient, you would expect total wages to go up.
If AI makes it easier for folks to become programmers, then the value produced could be split over more people. Alternatively, if you need fewer programmers then more value could be captured by a few superstar winners.
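A toy calculation of the two scenarios above, with invented numbers, shows how the same efficiency gain can push the wage ceiling in opposite directions:

```python
# Hypothetical comparison of how AI productivity gains could affect the
# value-per-hour ceiling on wages. All figures are made up.

value_per_hour = 100      # $ of value one programmer produces today
efficiency_gain = 2.0     # suppose AI doubles output per programmer

# Scenario 1: same headcount, more value per head.
# The $/hour ceiling on each programmer's wage rises.
ceiling_same_headcount = value_per_hour * efficiency_gain  # 200.0

# Scenario 2: total demand for programming is fixed, but easier entry
# doubles the labor pool, so the same pie is split across more people.
total_value = value_per_hour * 1000   # 1000 programmer-hours of demand
workers_hours = 2000                  # twice as many hours competing for it
ceiling_more_workers = total_value / workers_hours  # 50.0
```

Which scenario dominates depends on whether demand for software grows faster than the supply of people able to produce it.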
I suspect the reality around programming will be the same - a chasm between perception and reality around the cost.
Something similar might be happening in software. LLMs allow us to produce more software, faster and cheaper, than companies can realistically absorb. In the short term this looks amazing: there’s always some backlog of features and technical debt to address, so everyone’s happy.
But a year or two from now, we may reach saturation. Businesses won’t be able to use or even need all the software we’re capable of producing. At that point, wages may fall, unemployment among engineers may grow, and some companies could collapse.
In other words, the bottleneck in software production is shifting from labor capacity to market absorption. And that could trigger something very much like an overproduction crisis. Only this time, not for physical goods, but for code.
Not if you believe most other articles related to AI posted here including the one from today (from Singularity is Nearer).
I see it as a correction rather than deflation. We proved the shitty salesman of the 70s and 80s right. The computer really can be a bicycle for the mind; whether or not that bicycle actually fucking goes anywhere is still left to human wills.
Programming will still exist, it will be just different. Programming has changed a lot of times before as well. I don't think this time is different.
If programming suddenly became that easy to iterate on, people would be building new competitors to SAP, Salesforce, Shopify and other solutions overnight, yet you rarely see a good competitor come around.
The work involved in understanding your customers' needs and iterating on them between product and tech is not to be underestimated. AI doesn't help with that at all; at best it's a marginal improvement to iteration speed.
Knowing what to build has been for a long time the real challenge.
Only for economists who ignore the bulk of people and focus only on those who finance their luxury with loans.
Which also kind of ignores the industrial revolution and all the progress since then.
I don't know if this is an indictment of Beck, or of economics, or both...
Maybe a few of them will pursue it further, but most won't. People don't like hard labor or higher-level planning.
Long term, software engineering will have to be more tightly regulated like the rest of engineering.
I saw this happening back in 2015, way before LLMs came on the scene. At the time, I was an ordinary journeyman enterprise developer who had spent the previous 7 years both surviving the shit show of the 2008 recession and getting to the other side of being an "expert beginner", after staying at my second job out of college for 9 years, until 2008.
I saw that, as an enterprise dev in a second-tier tech city, no matter what I learned well (mobile, web, "full stack development", or even "cloud"), they were all commodities anyone could learn "well enough". I wouldn't command a premium, I was going to plateau at around $150K-$160K, and it was going to be hard to stand out.
I did start focusing on just what the author said, and the next year I took a chance on leaving a full-time salaried job for a contract-to-perm opportunity that would give me the chance to lead a major initiative under a then-new director at a company [1].
I didn’t learn until 5 years later, at BigTech, that promotions were about “scope”, “impact” and “dealing with ambiguity”.
https://www.levels.fyi/blog/swe-level-framework.html
I had never had a job before with real documented leveling guidelines.
Long story short, I left there, led the architecture of a B2B startup, and then a full-time job in the cloud consulting department of AWS fell into my lap.
After leaving AWS in 2023, I found out how prescient I had been: the regular old enterprise dev jobs I was being offered, even as a “senior” or “architect”, were still topping out at around $160K-$175K and hadn’t kept up with inflation. I have friends with 20 years of experience in Atlanta making around that much.
Luckily, I was able to quickly get a job as a staff consultant at a third party consulting company. But I did have to spend 5 years honing my soft skills to get here. I still do some coding. But that’s not my value proposition.
[1] Thanks to my wife working part time in the school system with good benefits, I could go from full time to contract to permanent in 2016.