- AI will make developers irrelevant
- Why?
- Because LLMs can write code
- Do you know what I do for a living?
- Yes, write code?
- Yes, about 2-5% of the time. Less now.
- But you said you are a developer?
- I did
- So what do you do 95-98% of the time?
- I understand things and then apply my ability to formulate solutions
- But I can do that!
- So why aren't you?
The developers who still think their job is about writing code will perhaps not have a job in the future. Brutal as it may sound: I'm fine with that. I'm getting old and I value my remaining time on the planet. Business owners who think LLMs let them do without developers entirely are fine by me too. Natural selection will take care of them in due course.
Part of the practical degradation of traditional programmers over time has always been in concentration and deep calculation, just like in chess. The old chess player knows chess much better than a 19-year-old phenom, but they cannot calculate for as many hours at the same speed as before, so their experience eventually loses to the raw calculation. Maybe at 35, or at 45, but you are just not as good. Claude Code and Codex save you the computation, while every single instinct and two-second "intuition", which is what you build with experience, is still online.
It's not just that the competition is fairer now: it's unfair in the opposite direction. The senior who before could lead a team of 6 is now leading a team of agents, and reviewing their code just as before. Hell, it's easier to get the agent to change direction than most juniors around me, who are not easy to correct with just plain, low-judgement feedback.
Based on my experience, I think this will prove more true than not in the long run, unfortunately.
Professionally, I see people largely falling into two camps: those that augment their reasoning with AI, and those that replace their reasoning with AI. I'm not too worried about the former; it's the latter I'm worried about.
My mom is a (US public school) high school teacher, and she vents to me about the number of students who just take “Google AI overview” as an absolute source of truth. Maybe it’s just the new “you can’t cite Wikipedia”, but she feels that since the pandemic, there’s a notable decline in the critical thinking skills of children coming through her classes.
We have a whole generation (or two) of kids that have grown up being told what to like, hate, believe, etc. by influencers and anonymous people on the internet. They’d already outsourced their reasoning before LLMs were a thing. Most of them don’t appear to be ready to constructively engage with a system that is designed to make them believe they are getting what they want with dubious quality.
It also feels like the hiring "signal", which was always weak before, is just completely gone now, when every job you do advertise receives over 500 LLM written applications and cover letters that all look and feel the same.
The pro-athlete comparison in this article is a bit silly IMO: there are obvious physical issues that come with aging if you rely on your muscles etc. to make money. If you compare to other fields of knowledge work, such as, say, law or medicine, there are loads of examples of very experienced, very sharp operators in their 40s and 50s.
I certainly don't have the money or time to go back to college and start a new career at the bottom.
My parents were both construction workers. There is an understanding that you cannot lift heavy objects forever. You stop lifting objects and move to being a foreman, a supervisor... and if you are uncomfortable learning to get others to do the work you previously did yourself, you burn out your body entirely and the consequences are horrible.
This is factual reality, but it is also a parable that has been important for me to internalize about delegation in my own career. It is not irrelevant to AI use, but I don't think it maps onto it quite as neatly.
If we truly need to sacrifice our skill to be productive by using LLMs that atrophy us, then the only devs that have a limited lifespan are us. The next ones won't have a skillset to atrophy since they won't have built it through manual work.
Also, I hereby propose to publicly ban the "LLMs generating code are like compilers generating machine code" analogy, it's getting old to reargue the same idea time after time.
If you mean creating software, well we are creating more software than ever before and the definition of what software is has never been so diverse. I can see many different careers branching off from here.
> After being laid off, a programmer becomes a welder. One day while working, he suddenly muttered to himself, "It's been so long, I've even forgotten how to solve three sum". A coworker next to him quietly replied, "Two pointers".
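For anyone who, like the welder, has also forgotten: a minimal sketch of the two-pointer approach, here in TypeScript, assuming the classic "all unique triplets summing to zero" formulation.

```typescript
// Classic two-pointer 3Sum: sort, fix one anchor, then walk two pointers
// inward from both ends of the remaining range.
function threeSum(nums: number[]): number[][] {
  const sorted = [...nums].sort((a, b) => a - b); // O(n log n) sort first
  const result: number[][] = [];
  for (let i = 0; i < sorted.length - 2; i++) {
    if (i > 0 && sorted[i] === sorted[i - 1]) continue; // skip duplicate anchors
    let lo = i + 1;
    let hi = sorted.length - 1;
    while (lo < hi) {
      const sum = sorted[i] + sorted[lo] + sorted[hi];
      if (sum < 0) lo++;        // too small: move left pointer right
      else if (sum > 0) hi--;   // too big: move right pointer left
      else {
        result.push([sorted[i], sorted[lo], sorted[hi]]);
        while (lo < hi && sorted[lo] === sorted[lo + 1]) lo++; // skip duplicates
        while (lo < hi && sorted[hi] === sorted[hi - 1]) hi--;
        lo++;
        hi--;
      }
    }
  }
  return result;
}

// threeSum([-1, 0, 1, 2, -1, -4]) -> [[-1, -1, 2], [-1, 0, 1]]
```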
This sounds ageist. I'm around 40 and feel I am at my mental peak, compared to even my mid-20s. This isn't a good analogy at all; the brain doesn't "wear out" the way a professional athlete's body does, it just changes its structure. The brain is a remarkable organ.
> If AI does turn out to make you dumber, why can’t we just keep writing code by hand? You can! You just might not be able to earn a salary doing so, for the same reason that there aren’t many jobs out there for carpenters who refuse to use power tools.
The argument the piece makes is that being a software engineer who insists on writing code by hand may no longer be a lifetime career.
I think the definition of "software engineer" is changing, and it's not even changing that much. We construct software to help solve human problems. We can keep on doing that, just now we get to do it more.
Argument B: AI means you don't learn as much, and the single most useful work product of a software engineer is knowing how the code functions, so it's depriving your company of the main benefit of your work. Also, layoffs are terrible business strategy because every lost employee is years of knowledge walking out the door, every new hire is a risk, and red PRs are derisking the business.
Institutional and personal knowledge seem similar, but the implications of each are radically different.
While an exaggeration, it's kind of like the gap displayed in Idiocracy. You just get used to how things are, and society kind of atrophies as a whole. Then you get the results of AI feeding itself, basically reinforcing the same mistakes, with the general effect of in-breeding. Eventually the more experienced will age out, and what gets left behind?
The code is relatively easy... it's properly understanding the problem and how a solution is meant to work that is the hard part.
But if it's a diversity of things (that use or leverage software development) then you probably have a lifetime career ahead of you.
I've been writing software for over 40 years but I've never seen myself as having a software engineering career. I've been a research assistant in geophysics, a marine technician on research ships, a game developer, an advisor to the UN, and a lot more. Yes all through that I used software, but I did a lot of other things in the process of using it.
Thinking that software engineers can be replaced by AI is like thinking that mathematicians can be replaced by calculators.
It might not look much like software engineering, but it's still going to be nerd stuff that most people don't want to bother with.
Yes, LLMs might dramatically reduce the amount of code we write by hand. But I'm a lot less convinced they'll solve all of the amorphous, human-interacting aspects of the job.
If you do that then... likely very replaceable.
And read Programming as Theory Building already, it's not that long
At least personally, using codegen LLMs allows me to step into areas I'm completely unfamiliar with, produce value, and learn new things along the way. I just made changes to a FOSS Android app I'm using, and I'm relatively inexperienced in mobile. However, now I know some Kotlin keywords, I know a bit about the UI libs, and I know better how to build and test Android code.
So I think I don't learn less, maybe I learn the things that interest me.
More than anything, I believe that AI is pushing out those who enjoyed the *act* of programming more than the product being delivered itself. Mostly because those individuals might have the hardest time adopting this new way of getting things done.
And honestly, I feel for them. Coding has always felt like an art form to me. Nothing feels better than someone commenting on the elegance/beauty of something you've written.
Over the past two decades, there have been a lot of solved problems, like building boring scalable web apps, UX design, etc., and AI is fairly good at these, enough so that good prompting can get you very far. This shouldn't be a surprise; there's a lot of publicly available data for this (GitHub repos etc.).
On the other hand, there are rarer computer science problems, like designing efficient datacenters, GPUs, and DL models. Think about the problems that someone of the ability of Jeff Dean, James Hamilton (AWS SVP), or a skilled computer architecture researcher like David Patterson would solve. These are incredibly hard and rare problems, and AI hasn't been able to make much progress in these areas. That's true for other sciences as well.
If you’re a regular Joe like me who builds boring CRUD apps, AI is coming for you.
What I mean is: if you are working on incredibly hard, rare problems that require rare skills, and those problems don't have publicly available data that LLMs can be trained on, you're safe from being "automated" away. If not, you must plan accordingly. Also, if you're a skilled manager (in any field), AI cannot replace you; highly skilled managers who can get the best out of their teams have rare skills that aren't easily replicable even among humans, much less AI. Although if going forward we need fewer developers, we will need fewer managers too.
Not AI, offshoring combined with downsizing of US based engineering orgs.
Corporate America has finally figured it out, after two decades of entitled developers turning 2-day tasks into 2-week tasks in the name of "best practices", "architecture", "Doing It Right!", etc., all while commanding high salaries.
It turns out that Good Enough is in fact good enough, and the people who write the checks are onto it. Even if it's not quite good enough, cheap offshore resources can just be sent back to make it work. A US-based staff of 5 people who can be held responsible for guiding a much larger offshore group seems to be the common pattern.
All of this was imparted to me by a CIO in a recent interview with a financially strong, mid-sized company in the eastern US. The developers I interviewed with were EXCEPTIONALLY COMFORTABLE and displayed zero signs of any kind of stress from maintaining their literally 20-years-out-of-date infra. It was insinuated that the team I interviewed with "probably won't look the same in 6 months" too.
That's a part of it, but only a small part. They don't get good at the thing mainly by doing the thing. They get good at it by training to do the thing.
An NFL football player does a ton of things other than playing in games. They have practice scrimmages. They do drills like throwing, catching, running patterns, tackling, reading quarterbacks, stripping balls, picking up fumbles, etc. They work with coaches on their technique. They watch film. They spend many hours in the gym and on the track building their strength, speed, cardio, and stamina.
Yes, it's true that your software skills will atrophy if you don't use them. But that doesn't mean your skills have to get worse and worse causing you to eventually quit the job. It means you need to set aside time to maintain your skills. It may no longer happen automatically as a side effect of your work, but it can happen intentionally instead.
"We may be in the first generation of software engineers in the same position. If so, it’s probably a good idea to plan accordingly."
He compares software engineers to pro athletes. What does it mean to plan accordingly? Start working with the mob to fix poker games? I don't know what "plan accordingly" means at all, but it is a thought-provoking statement.
> If you work in construction, you need to lift and carry a series of heavy objects in order to be effective. But lifting heavy objects puts long-term wear on your back and joints, making you less effective over time. Construction workers don't say that being a good construction worker means not lifting heavy objects. They say "too bad, that's the job".
On another note, for sure software developers are saying things like "this is the part of the job that I like" or "if you aren't doing the work, you won't be good at the work." But other people are saying this, too. I just saw an episode of "Hacks" ("QuickScribbl") where the writers say pretty much this exact thing when confronted with AI tooling designed to "make their job easier". Is writing comedy also like lifting heavy objects at a construction site?
I've long regarded myself as more a master craftsman than an engineer, and I've had the pleasure of working on one-of-a-kind or first-of-a-kind things. Perhaps fortunately I'm near retirement. But I genuinely enjoy the coding: it's how I engage with the problem and learn to understand it. It's also how I ensure that I'll be able to read the code and find things in the code base when I come back to it years later. Last thing I want to do is spend my days overseeing someone (or something) else's code. If I wanted to be a manager of programmers I could have done that years ago.
The truth is that software engineering, as a profession, is not even a full hundred years old. Even if someone spent their whole career in it, it has probably changed so much over time that it became a completely different job.
So far, we have barely scratched the surface.
So yes, software engineering may no longer be a lifetime career for a lot of people, much like elite sport is not a viable career for most—but still, some will, and must, make it their career.
Hand-coding -> LLMs/agents
Sometimes the only thing that can fit into a tricky spot is a screwdriver. The power drill didn't make screwdrivers obsolete, it just made them less necessary day-to-day.
Same thing here. LLMs are power tools, but sometimes, the only thing that can fit into a "tricky spot" with code/systems is knowing how to do it by hand.
> (2) AI-users thus become less effective engineers over time, as their technical skills atrophy
Wouldn't (2) imply that, if everyone just used AI, there would eventually come a time when there are no engineers left to outcompete you (because everyone's skills are so atrophied)?
Maybe you want a React app and using Redux for state would be the best choice for the specific case, but the AI doesn't recommend it and you don't know; then you are missing out and can end up with something suboptimal. This was just an example.
Maybe if you somehow stick with the same company for your entire career, it could feel somewhat similar... but I doubt it, as 'best practices' and many other things cause it to change.
The days of 'lifetime career' had already gone for most people, way before AI arrived.
People need to learn the difference between fluid intelligence and crystallized intelligence: fluid intelligence (raw, on-the-spot reasoning) declines with age, while crystallized intelligence (accumulated knowledge and pattern recognition) keeps growing well into later life.
People need to hear that startup success is maximal when the founders are older, not younger. VCs chasing youth are statistics deniers.
> professional athletes & construction workers work in physical fields, which means there are physical limits to what they experience, both in terms of what they do & what their bodies can do.
> software engineering is an art & an engineering discipline, which means as long as you're of sound mind you can do it till you die of old age, or even if, say, you go blind, because your ability to refine / your taste is not dependent on your physical capabilities.
> LLMs one-shotting things is not engineering, because engineering is about compromising within constraints & using rules of thumb. So if you have no constraints, you are not engineering.
That statement is evidence enough.
That's the way. Anything else shows that they don't know how the modern economy works. And let's admit it: as a bunch of IT/software people here, we are terrible at this.
Virtually the entire blog is about AI, with a ridiculous publishing rate (https://www.seangoedecke.com/page/5); funny how I can look at this site's HTML and know right away it was done with AI.
Can we stop upvoting vibe published articles? The arguments are flawed and don't even make sense to anyone who does software
Almost like we learned nothing from the Bitcoin period, when all the people who stood to make money if people invested in Bitcoin constantly posted stories to HN about how amazing Bitcoin is.
I love thinking about what kind of assembly the compiler may generate (though honestly, I rarely get the chance), and I love thinking about how languages should be more dynamic. Who's got actually-first-class functions? Like, ones that you can build, compose, combine, and manipulate to the same degree you can a string or a JSON object (no LISP, you're cheating).
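To be fair, most mainstream languages get you partway there. A minimal TypeScript sketch of treating functions as plain values you can build and combine, though this is still nowhere near the malleability of a string or a JSON object:

```typescript
// A pipe that composes unary functions left-to-right: functions as values
// you can assemble at runtime, the way you'd concatenate strings.
const pipe = <T>(...fns: Array<(x: T) => T>) =>
  (x: T): T => fns.reduce((acc, fn) => fn(acc), x);

const double = (n: number) => n * 2;
const increment = (n: number) => n + 1;

// A brand-new function built from two existing ones.
const doubleThenIncrement = pipe(double, increment);
console.log(doubleThenIncrement(10)); // 21
```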
And yet.. I don't care that much. Not because I'm late in my career (I'm 40, there are still some years left in me), but because I want to make computers do things, and what I enjoy is thinking up ways the things can happen, and sometimes the particulars that matter when making a lot of different things happen in a coherent system. And yeah, LLMs are trained on people's output, and from what I'm seeing everywhere, people are overall fairly terrible at that, and most of the plumbing-type glue being written is not worth anyone's time..
And I'm not saying I don't care because LLMs can't do my job. Heck, even after hours of back-and-forth spec building and refining every little nook and cranny, even after it's beautifully explained (proven, even, by reasoning and example alone), the stupid coding agent still cheats or gets it wrong; as soon as the plan is put into motion, it'll mess something up on a scale so fundamental that I should just have done it myself. And I hope that changes. I hope that I don't have to go into such detail.. I hope to become a steward of taste rather than a code-reviewer.. I hope that I will eventually not be needed even for that.. I want it to replace me, so I can move to telling it what I want, and having it made that way..
I hope I won't need to steward good taste, and that nobody will.. I hope the applications I use in 5 years will be a collection of one-offs and gradually improving tools written _just_ for me, for my way of working and my way of thinking.. I want to prompt the damn program to change itself as I discover new ways to do things, until it can eventually figure out how to automate the last bit of my task away.. And then I'll go do something else exciting.
I don't know, maybe in your part of the world, but where I'm from we have a series of robust worker-protection laws that try to limit the damage the work does to you. We generally consider it a bad thing for workers to damage their bodies, and if we could build houses without that, we'd prefer it.
In this specific case we do have techniques to build software without causing damage, so why change that?
This post is arguing that maybe software engineering should start being harmful, even though we know it doesn't have to be. It's a post by a guy begging to be fed into the capitalist meat grinder. Meaningless self-sacrifice.
60 years total so far
construction has what, 6k years
I'll get into that next.
This requires having an understanding of a business domain, economics, human psychology and technology.
The competitive aspect of it means that you need to understand these things better than most people and machines. If you don't, then your skills have no value on the market. Will generalist AI trained on public data ever understand these things better than software engineers across every possible niche?
I don't think so, because that knowledge is usually gate-kept. Nowadays, new engineers almost have to beg to be given access to knowledge of company systems. It takes at least 6 months for a skilled engineer to ramp up on large systems... and it's mostly because of institutional resistance.
The thing is, it doesn't even require people to be withholding information... Some engineers will happily share everything they know about internal systems... but in a big company, you first have to identify this person. That can take a while... Then you need to identify other people who will give you other information relevant to your specific tasks/integrations. Then there are all sorts of other constraints and restrictions to deal with.
You can't just deploy an AI to a big company and it will magically guess all the endpoints which exist... You have to tell it what is available and enterprise systems are not designed for transparency.
Big companies resorted to a kind of security-through-obscurity. This used to be considered bad practice 10 years ago but at some point they just gave up, let complexity run amok and started calling it "multiple layers of defence" but now this obscurity is a problem for evaluating system security (too much unknown context is required, nobody fully understands the entire system) and it slows down development and maintenance as well.
Whoever knows the most context about a system has the advantage... And this isn't necessarily a company insider. Most likely, the people with the most context are platform providers.
I predict that most major hacks will originate from platform providers. We already started seeing this with Axios hack (originating from GitHub/npm) and Vercel (originating from Google Workspace).
The centralization risk is massive because each platform is servicing so many large companies. It only works when there is perfect incentive alignment but that's not usually the reality during difficult economic times. Third-party platforms cannot be trusted anymore.
I'm greatly anticipating the next Great Leap Forward™, with a publicly available Mythos or some other new paradigm I can't currently imagine,
but at the moment, agentic coding has made me busier than ever before, while it's Product Managers, UX, QA, Data Scientists, and DevOps who have disappeared from the teams I'm on (across multiple organizations), and I have to do all their work, plus make dashboards I didn't have to make before.
All the projects that would have been cancelled by Q3 are being attempted in Q1, which means more work.
I'm looking at proper engineering in building local LLM networks, with proper firewalls, capability access, and guards around the LLM systems to allow and enable advanced use while not just letting "lol delete everything" happen (a rough sketch of what I mean follows below).
When there's a land grab, move to selling tools and how-to knowledge: work in maintaining the tools and their proper operation and maintenance.
I also look at upsells like local LLMs as a reason to do this in-house, so that companies aren't liable for rug pulls, violation (consumption) of trade secrets, or breaches of confidential discussions.
And LLMs aren't good at recommending tech stacks for running them. Stuff is moving faster than most training data sets have.
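As a rough illustration of the kind of capability guard I mean, a sketch only: the tool names, policies, and shapes here are hypothetical, not any real product's API.

```typescript
// Hypothetical capability guard: the local LLM proposes tool calls,
// but only allowlisted tools and unblocked paths ever execute.
type ToolCall = { tool: string; args: Record<string, unknown> };

const ALLOWED_TOOLS = new Set(["read_file", "run_tests", "search_code"]);
const BLOCKED_PATHS = [/^\/etc\//, /\.env$/, /secret/i];

function guardToolCall(call: ToolCall): ToolCall {
  if (!ALLOWED_TOOLS.has(call.tool)) {
    throw new Error(`tool "${call.tool}" is not on the allowlist`);
  }
  const path = call.args["path"];
  if (typeof path === "string" && BLOCKED_PATHS.some((re) => re.test(path))) {
    throw new Error(`path "${path}" is blocked by policy`);
  }
  return call; // only now forward to the runtime that actually executes tools
}

// guardToolCall({ tool: "read_file", args: { path: "src/main.ts" } }) passes;
// guardToolCall({ tool: "delete_everything", args: {} }) throws.
```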
If your crowning achievement is: "I can 100% all leetcode hards" I have bad news for you.
While most developers were busy grinding, the corporations did their utmost to ensure the only sensible pathway to wealth and development is closed: running your own business, that is. In many countries, due to regulatory capture enacted by corrupt governments, making a profit is next to impossible, and that's only if you manage to jump bureaucratic hurdles that don't exist for larger corporations.
AI is just a tool. Asking whether AI will replace software engineers is like asking whether the hammer will replace the carpenter.
More AI Soothsaying. Not so hard on the Inevitabilism this time.
Less "pure" programming, but lots more programming in general.
It's a tool for knowledge work.
No carpenter is a specialist in drills.
It seems to me that the best way to navigate a long term career is to have another specialty and use software engineering as a tool within that specialty.
If talking to an AI makes me dumber and limits my career, then all the customer support people that ever existed were in the same or a worse position, talking to dumb humans on chat all day, answering tickets about the same topics and linking the same docs over and over. This makes no sense.
If you believe this about your software career, how do you think you're going to switch into another career as a junior and keep up?
Other professions do too, whether it's healthcare, etc.
Software, being a new field, didn't really become a standardized profession in the way engineering did.
The goalposts are moving because the standards are moving, because the capabilities are moving.
Remaining a self-directed learner will remain critical.