I suspect the progression will be "No AI until intuition (whatever that is for that skill)" -> "Gradual use of AI to understand where it falls short" -> "AI-native expert".
How to actually implement this at scale is still TBD ;-) Ironically, AI will be invaluable for this, e.g. as a hyper-personalized tutor, but it will also present an irresistible temptation to offload the hands-on practice. We already have studies indicating the former is helpful but the latter stifles mastery. At this point I can only see self-discipline as a mechanism for willingly avoiding AI.
Unfortunately, our testing-oriented education system only serves to incentivize over-reliance on AI (Goodhart's Law, etc.). None of our current institutions and processes are suited for what is already happening, and it will only accelerate from here on. Things will need to change radically.
For this reason, I once predicted apprenticeships would be a thing again, and already there are signs, with Microsoft's preceptorship proposal: https://dl.acm.org/doi/10.1145/3779312
This is highly encouraging because a tech giant is not only acknowledging the problem but proposing a solution. Far from a complete solution, but at least a start.
The same ethos makes sense with AI; it's just that every company is trying to avoid paying that training tax. Why turn a junior into a senior yourself if you can get the competition to pay for it instead?
https://htmx.org/essays/yes-and/
Everyone else: we must let the juniors write the code.
Seniors come from juniors. If you want seniors, you must let the juniors write the code.
Pretty much all software projects seem to peak, and then decline in quality. There are only a handful of senior devs in the world who are actually good programmers.
"it's not x, but y", with bonus em-dash:
> your value as a developer is not in your ability to ship code. It’s in your ability to look at code
"But here’s the thing."
"And honestly?"
I hired a junior "dev" who literally hadn't even completed an HTML course. Before AI I could not have hired them because they literally did not know how to dev. After AI, anyone with a little grit can push themselves into the field pretty easily.
As with everything in life: you can choose the hard route or you can choose the easy route, and your results will follow accordingly.
You don't get technical creativity reflexes by using AI. This is technical stagnation in the making. By cannibalizing its own sources, AI is ensuring that future generations are locked into subscription models to do the most basic technical tasks. This is all obvious, yet we speed up every chance we get.
I am not a Python developer, or a developer for that matter, but Perplexity AI did help me understand the basics of Python for API requests and deliver projects with 94% code coverage, vulnerability-free.
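A minimal sketch of the kind of request involved, using Python's requests library; the endpoint, token handling, and function name here are hypothetical, not from any specific project:

    import requests

    API_URL = "https://api.example.com/v1/projects"  # hypothetical endpoint

    def fetch_projects(token: str) -> list[dict]:
        """Fetch the project list, surfacing HTTP errors instead of silently failing."""
        response = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {token}"},
            timeout=10,  # never hang forever on a dead endpoint
        )
        response.raise_for_status()  # raises on 4xx/5xx instead of returning bad data
        return response.json()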
AI also reduced the time spent generating Ansible playbooks, but I do know Ansible and I do know Linux; homelab is my hobby, so I am not just copying and pasting. I review whatever it generates and correct it when required.
In the companies where I have worked and currently work, I see developers themselves confessing "I used AI, it works but idk how"!
AI itself does not make you useless, as long as you use it as a smart search engine.
If you are doing copy/paste, you are going to get so screwed, professionally speaking.
Folks are no longer learning, and what they are doing, AI can do on its own.
That is making some developers useless.
Answer: juniors need to work with seniors, and the seniors need to teach the juniors, and the juniors need to learn to use LLMs to learn, not just to do the work.
I stopped doing scientific research years ago, but before moving on to other things I had, like many others I imagine, certain problems I wanted to study that, given the lack of time and other concerns, I would never have picked up again. I launched Codex, and it managed to untangle old files and analyses, track down datasets whose whereabouts I no longer knew, run analyses under my guidance, and build visualizations that would have taken me days, if not weeks, to complete.
Of course, I have experience, I know what needs to be done, and I had to correct some errors made by Codex (I am paying for Codex and Gemini now, but I could go back to paying for Claude too), but I was amazed by the quality of the analyses.
To give an example, I had a dataset of weather observations downloaded from a website years ago: hundreds of time series across weather stations. Codex managed to recover the missing time series, even though the website is no longer active, first by cross-checking against the data I already had, and then by also finding a digital elevation model.
Now I will guide Codex in developing a spatio-temporal model of extreme events that, without Codex, I would never have had the time or inclination to build.
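For context, a hedged sketch of the simplest building block of such an analysis: fitting a Generalized Extreme Value distribution to annual maxima with SciPy. The station values below are made up, and a real spatio-temporal model goes well beyond this single-station fit:

    import numpy as np
    from scipy.stats import genextreme

    # Hypothetical annual maxima (e.g., yearly peak temperature) for one station.
    annual_maxima = np.array([31.2, 33.5, 29.8, 34.1, 32.0, 35.6, 30.9, 33.3])

    # Fit a Generalized Extreme Value distribution to the block maxima.
    shape, loc, scale = genextreme.fit(annual_maxima)

    # 100-year return level: the value exceeded with probability 1/100 in a given year.
    return_level = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)
    print(f"Estimated 100-year return level: {return_level:.1f}")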
The worst part is, they think they know exactly what they need to learn, and also think they can make good decisions.
- Can you add an index page that shows the list of every post ever written? This page requires too many hoops to navigate: https://beabetterdev.com/2026/02/
It takes time to become a junior too. The emerging tech landscape could change the skills and knowledge expected from entry-level job applicants.
Steve Jobs famously, and accurately, called this out years ago [1].
Xerox, Boeing, PC manufacturers (who basically created the Taiwanese makers through a series of short-term outsourcing steps), etc. But there are two examples I want to talk about specifically.
First, one lasting impact of the 2008 GFC was that entry-level jobs disappeared. This devastated a generation of millennial college graduates who suddenly had a mountain of student loan debt (thanks to education costs outpacing inflation by a lot) but no jobs. It became a bit of a joke to poke fun at such people who had a ton of debt and worked as baristas, but this was a shallow "analysis". It was really a systemic collapse. Those entry-level workers are your future senior workers and leaders. Those jobs have never come back.
The rise of DVR/TiVo and ultimately streaming brought on a golden age of TV in the 2000s. It was kind of the last hurrah for network shows that produced 22 episodes a year before streamers instead produced 8 episodes every 4 years.
But what made this system work was an ecosystem. Living in LA, Atlanta and a few other places was relatively cheap, so aspiring actors, writers and entertainment professionals could get by with second jobs and relatively low incomes. These became the future headline actors and senior professionals. Background work and odd jobs were sufficient. Background work also taught people how to be on a set.
Studios still had large writing staffs. Some writers would be on set. Those writers were your future producers and showrunners.
Part of what supported all of this was syndication. That is, networks produced shows and basic cable channels would pay to rerun them. Syndication was incredibly profitable in some cases (e.g. Seinfeld).
So the streamers came along and stripped things down. They got rid of junior positions. They adopted so-called "mini writing rooms". Those writers tended never to be on set. The runs were shorter, and an 8-episode series couldn't support a writer the way a 22-episode series could. The streamers were then largely showing just their own content, so residuals and syndication fees simply went away.
All of this is short-term thinking. Hollywood has been both a massive industry and a source of American soft power internationally by spreading culture, basically.
I think the software engineering space is going through a transformation similar to what happened to the entertainment industry. A handful of people will do very well. AIs will destroy entry-level jobs, and with them each company's and the industry's future.
I predict in 10-20 years we'll see China totally dominating this space, while a bunch of LinkedIn "thought leaders" and politicians stand around scratching their heads asking "what happened?"
It may be that AI has raised the bar, but also that junior devs out of uni are just much worse than they used to be.
We were just pretending otherwise; now it is explicit.
I bet the number of successful junior devs is going to keep going up while the number of people coasting slowly tends to zero.
Security gets outsourced to audited layers, and AI does the stupid, boring job of gluing them together. Some developers become more specialised and niche, some pivot to product, some pivot to other areas.
There are plenty of people who joined software for the payout and hate it. Plenty of people who have grown to hate it over time.
I've been enjoying using it to figure out toy projects, but paying for an API and depending on a service to code is very sour. I really hope hardware specialises and local models become good enough. Gatekeeping development behind centralised services would be a loss for everyone and ripe for dystopian outcomes.
I am a junior dev (graduated in 2022) and I am gainfully employed at the McGowan Institute, earning an okay salary, with colleagues who actively use LLMs, but there is zero talk of firing me or laying me off due to LLMs. I personally avoid LLMs for most things other than:
1) Google search that actually works
2) translating MATLAB, which I have never learned (and probably won't ever)
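To illustrate that MATLAB-translation use case, a small sketch of the kind of conversion an LLM handles well; the MATLAB line is a hypothetical example, not from my codebase:

    import numpy as np

    # MATLAB:  y = mean(A(A > 0));   % mean of the positive entries of A
    A = np.array([[1.0, -2.0], [3.0, -4.0]])
    y = A[A > 0].mean()  # boolean masking replaces MATLAB's logical indexing
    print(y)  # 2.0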
There is a whole team of junior devs, and on Friday I actually got an email asking if I could refer another junior/entry-level candidate. Granted, it was for a 1-year contract, but still.
I really do not get this hype.
First of all, developers who only learn to code in a short bootcamp are often not well prepared — but that was already true before AI. In the past, many junior developers were students who were learning programming while studying, not just people who took a quick Python course on Udemy.
Instead of declaring junior developers useless, we should raise the standard: learn how to code properly, how to maintain code, understand networks, and build strong foundations in math and computer science. A well-trained junior developer is still extremely valuable and will always be needed.
There are lots of ambiguous situations that a search and human "inference" can solve but AI still can't.
I can tell the AI to do something and it uses the worst approach; I tell it a better way exists, and it claims to have validated that none does; I link to a GitHub issue showing it can be done via workarounds, and it still fails. It's worse for longer tasks, where on failure it always shortcuts to a "safe" approach (including not doing the task at all).
Funny enough we need the junior to guide the AI.
Companies will continue to demand it (I know people working at companies that are literally looking at AI usage as an individual performance metric, puke emoji), and probably 95% of humans using pretty understandable human logic aren’t going to work harder than they need to on purpose.
I wish I had a solution. I think the jury is still out on whether programming will be a dead profession in a short number of years, replaced by technical product operators.
I’ve experienced similar things and so understand the feeling, but this is poor leadership. If someone on your team makes it all the way to a code review and still thinks ‘the AI suggested it’, you failed to train them, failed to set expectations and they have justifiably lost more confidence in you than vice versa.
If we analyze the rest of the article through the lens of weak leadership, it sounds less like an AI problem and more like a corporate leadership problem.
E.g. when using AI Deep Research for hard-to-debug issues, asking for the why makes for a much better response.
Invest in the training of your junior employees.
The cost of generating code is now laughable, so that's not the economic value brought to the table by a junior engineer, or really, any engineer. The value is now generated by knowing what code is good code. You're going to have to have talks, book clubs, hackathons, and the like to get your juniors to know what good code is. Do they know what design patterns are? How about good architecture? If they can't name a few design patterns, you're not investing enough.
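As one concrete example of the vocabulary worth checking for, here is a minimal sketch of the Strategy pattern; the pricing-rule names are purely illustrative:

    from typing import Callable

    # Strategy pattern: swap the pricing rule without touching the checkout code.
    PricingStrategy = Callable[[float], float]

    def regular_price(amount: float) -> float:
        return amount

    def member_discount(amount: float) -> float:
        return amount * 0.9  # 10% off for members

    def checkout(amount: float, strategy: PricingStrategy) -> float:
        # The caller picks the policy; checkout itself never needs to change.
        return strategy(amount)

    print(checkout(100.0, regular_price))    # 100.0
    print(checkout(100.0, member_discount))  # 90.0

A junior who can name the pattern and explain when a plain if/else would be better is exactly the kind of reviewer the new economics reward.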
I couldn't care less whether Claude, Codex, or, before that, a developer used a for loop or a while loop. I did and do care about architecture.
I’m no more going to review every line of AI-written code than I did when delegating to more junior developers. I’m going to ask Claude Code how it implemented something where I know there is an efficient way vs. a naive way, find and test corner cases via manual and automated tests, and do the same for functional and non-functional requirements.
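A minimal sketch of that corner-case testing, using pytest; the function under test is hypothetical, standing in for whatever Claude Code produced:

    import pytest

    def chunk(items: list, size: int) -> list[list]:
        """Hypothetical function under review: split items into fixed-size chunks."""
        if size <= 0:
            raise ValueError("size must be positive")
        return [items[i:i + size] for i in range(0, len(items), size)]

    # Corner cases a reviewer probes regardless of who (or what) wrote the code.
    @pytest.mark.parametrize("items,size,expected", [
        ([], 3, []),                          # empty input
        ([1, 2], 5, [[1, 2]]),                # chunk larger than input
        ([1, 2, 3, 4], 2, [[1, 2], [3, 4]]),  # exact fit
        ([1, 2, 3], 2, [[1, 2], [3]]),        # ragged tail
    ])
    def test_chunk(items, size, expected):
        assert chunk(items, size) == expected

    def test_chunk_rejects_nonpositive_size():
        with pytest.raises(ValueError):
            chunk([1], 0)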
In a world where "Code is no longer a skill," the only way to survive is to stop being a "Prompt Operator" and start being a "System Auditor." If you can’t explain the trade-offs of the architectural pattern the AI just gave you, you aren't an engineer, you're just the person holding the screwdriver while the machine builds the house.