Was having a discussion the other day with someone, and we came to the same conclusion. You used to be able to make yourself useful by doing the easy / annoying tasks that had to be done, but more senior people didn't want to waste time dealing with. In exchange you got on-the-job experience, until you were able to handle more complex tasks and grow your skill set. AI means that those 'easy' tasks can be automated away, so there's less immediate value in hiring a new grad.
I feel the effects of this are going to take a while to be felt (5 years?); mid-level -> senior-level transitions will leave a hole behind that can't be filled internally. It's almost like the aftermath of a war killing off 18-30 year olds leaving a demographic hole, or the effect of covid on education for certain age ranges.
Going to throw out another anecdote here. At a company that a number of my friends work for (a fortune 50), they are currently making record profits that they loudly brag about during employee townhalls. They also are in the process of gutting multiple departments as fast as possible with little regard for the long term consequences. This is not the only company that I know of acting in this way (acting like they're about to go bankrupt when in fact they are seeing record profits).
To me the societal risk is that an entire generation of employees becomes extremely jaded and unmotivated, and fairly so. We used to work under the assumption that if our company is successful, then the employees would be successful. Record profits == raises for all, bonuses for all. And while we know that connection was never that strong, it was strong enough to let us at least pretend that it was a law of the universe.
That fundamental social contract is now at its breaking point for so many workers. Who can really blame people for putting in minimal effort when they have so much evidence that it will not be rewarded?
When I was starting out, you were assessed for potential as a trainee. In my case, options trading. They checked that you could do some mental arithmetic and that you had a superficial idea of what trading was about. Along with a degree from a fancy university, that was all that was needed. I didn't know much about coding, and I didn't know much about stochastic differential equations.
A couple of weeks ago, a young guy contacted me about his interview with an options trading firm. This guy had spent half a year learning every stat/prob trick question ever. All those game theory questions about monks with stickers on their foreheads, all the questions about which card you need to turn over, the lot. The guy could code, and had learned a bunch of ML to go with it. He prepared for their trading game by asking me some really great questions about bet sizing.
I was convinced he was simply overly nervous about his prospects, because I'd never met someone so well prepared.
Didn't get the job.
Now I can assure you, he could have done the job. But apparently, firms want to hire people who are nearly fully developed on their own dime.
When they get their analyst class, I guess there is going to be nobody who can't write async python. Everyone will know how to train an ML on a massive dataset, everyone will already know how to cut latency in the system.
All things that I managed to learn while being paid.
You gotta ask yourself whether we really want a society where people have to already know the job before they get their first job. Where everyone is like a doctor: already decided at age 16 that this was the path they wanted to follow, choosing classes toward that goal, and sticking with it until well into adulthood. And they have to essentially pay to get this job, because it comes at the cost of exploring other things (as well as the actual money to live).
ChatGPT was pretty useless when it first released. It was neat that you could talk to it but I don't think it actually became a tool you could depend on (and even then, in a very limited way) until sometime in 2024.
Basically:
- the junior hiring slowdown started in 2022.
- but LLMs have only really been useful in a work context starting around 2024.
As for this point:
> According to very recent research from Stanford’s Digital Economy Lab, published in August of this year, companies that adopt AI at higher rates are hiring juniors 13% less
The same point stands. The junior hiring slowdown existed before the AI spend.
Let's say you hire your great new engineer. Ok, great! Now their value is going to escalate RAPIDLY over the next 2-3 years. And by rapidly, it could be 50-100%. Because someone else will pay that to NOT train a person fresh out of college!
What company hands out raises aggressively enough to stay ahead of that truth? Almost none of them; maybe a MANGA or some other outlier. But most don't.
So, managers figure out fresh out of college == training employees for other people, so why bother? The company may not even break even!
That is the REAL catch-22. Not AI. It is how the value of people changes early in their career.
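To put rough numbers on it (the figures below are purely illustrative assumptions, not real compensation data), here's a toy calculation of why the math doesn't work for the employer who does the training:

    # Toy model of the junior-training catch-22.
    # All figures are illustrative assumptions, not real compensation data.
    starting_salary = 120_000          # year-1 pay for the new grad
    market_value_after_2y = 200_000    # what a competitor might pay for 2 YoE
    annual_raise = 0.10                # a "generous" internal raise

    internal_salary = starting_salary * (1 + annual_raise) ** 2
    gap = market_value_after_2y - internal_salary

    print(f"internal pay after 2 years: ${internal_salary:,.0f}")   # ~$145k
    print(f"gap to a poaching offer:    ${gap:,.0f}")               # ~$55k

Even with back-to-back 10% raises, the trained junior ends up tens of thousands below what someone else will pay to skip the training, which is exactly the incentive to jump ship.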
Default "people have value because human attention solves problems", has become default "existing org structure has value because existing revenue streams are stable."
The idea of a company used to contain an implied optimism. "If we get capable people together, we can accomplish great things!" Now that optimism has been offloaded to the individual, to prove their worth before they can take part.
Wages for your typical engineer stopped going up 5+ years ago. The joke of senior FAANG engineers making $400k has been a meme for over 5 years. Yet inflation has run over 20% in those 5 years. Look at new offers for people joining the majority of positions available at public tech companies. You're not seeing $500k offers regularly. Maybe at Jane Street or Anthropic or some other companies that are barely hiring - all of which barely employ anyone compared to FAANG. You're mostly seeing the same $350-400k/yr meme.
The reason we're not employing new grads is the same reason the standards have gotten much more aggressive: oversupply, and the fact that senior talent has always been valued more.
Amidst this influx of applicants, junior and intermediate staff began getting Senior titles to justify pay raises. Soon those exact same people were moving from job to job as a "Senior", but without meeting the criteria that would've qualified them for that title a decade before. You can still see people get promotions without having accomplished anything, much less learned anything, but they did keep the lights on. Today there's a sea of "Senior" engineers who can basically write code (and not especially well), but lack all the other "non-coding" skills that Seniors should have.
Even if you hired 100K new Juniors tomorrow, there's nobody to train them, because most of the people working today are practically Juniors themselves. Each "generation" is getting worse than the one before, because they're learning less from the generation before, and not being required to improve. There's still good engineers around, but finding them is like playing Where's Waldo? - and you have to know what Waldo looks like, which you won't if you're not experienced!
The fix isn't going to be learning to network ("relational intelligence") and mentoring more. The fix is for us to stop letting the industry devolve. Treat it like the real engineering professions, with real school requirements, real qualifications, real apprenticeships, real achievements (and titles that aren't meaningless). Otherwise it'll continue to get worse.
The boom-bust recession cycle is roughly every 10 years. You can't say that AI is impacting hiring when your data just looks like the typical 10 year cycle. Your data needs to go back further.
That being said, what's more likely going on:
1: There are always periods where it's hard for recent college grads to get jobs. I graduated into one. Ignoring AI, how different is it now from 10, 20, and 30 years ago?
2: There are a lot of recent college grads who, to be quite frank, don't work out and end up leaving the field. (Many comments in this thread point out how many junior developers just shouldn't be hired.) Perhaps we're just seeing many companies realize it's easier to be stricter about who they hire?
1. The industry cannot define the terms junior or senior.
2. Most seniors today are the prior generation’s juniors with almost no increase in capabilities, just more years on a resume.
The article asks about what happens when today’s seniors retire in the future. I would argue we are at that critical juncture now.
If I were to graduate today, I'd be royally screwed.
The economics of providing every new grad a $150k TC offer just doesn't work in a world with the dual pressures of AI and async-induced offshoring.
Heck, once you factor in YoE, salaries and TCs outside the new grad range have largely risen because having experienced developers really does matter and provides positive business outcomes.
State and local governments need to play the same white-collar subsidy game that the rest of the world is playing in order to help fix the economics of junior hiring for white-collar roles. This is why Hollywood shifted to the UK, VFX shifted to Vancouver, pharma shifted to Switzerland, and software to India.
And although it hasn't discouraged me, I have to admit that I've been burned by juniors when caught in the middle between them and senior leadership on output expectations or strategy, because frankly it's much more challenging to mentor someone on navigating company politics than on professional coding acumen. I want to be humble here. I don't think that's the junior's fault.
It feels like these problems go a lot deeper than AI. Most shops want software teams that are either silently embedded black boxes that you insert rough instructions into and get working software as output or an outsourced team. We've all experienced this. It seems silly to deny that it's directly related to why it's so hard to mentor or hire juniors.
It is also something which is likely to be quite harmful, since it selects for people who are great at networking over people who have good technical skills. Obviously interpersonal communication is important, but how well a 20-year-old in university performs at it should not make or break their career.
And even people with bad social skills deserve to exist and should be allowed into their chosen career. Being someone who does good work and is respectful, but not overly social, should be good enough.
It is insane how screwed over we are. I am about to turn 30 with 5 YoE and a PhD in ML, which is supposedly the cutting-edge stuff. Yet I have no prospects to even buy a tiny flat and start a “normal life”. AI eats its own tail, and I have no idea what I should do or what to learn to have any sensible prospects in life.
I am an older gen-z and launching my career has felt nigh on impossible. At my first job, the allergy toward mentorship this article mentions was incredibly palpable. None of my several managers had management experience, and one of them openly told me they didn't want to be managing me. The one annual review I got was from someone who worked alongside me for a week.
Follow that experience up with a layoff and a literally futile job search, and it's hard to be optimistic about building much of a career.
I started in tech in the late 70s. I can say this break happened during the Reagan Years with a bit of help from the Nixon Years.
I have a friend of a friend in his mid 20s who finished a master's degree in data science focused on AI. There isn't a job for him and I think he's given up.
In Letters to a Young Poet Rilke responded to a young aspiring poet who asked how a person knows whether the artistic path is truly their calling:
> “There is only one thing you should do. Go into yourself. Find out the reason that commands you to write; see whether it has spread its roots into the very depths of your heart; confess to yourself whether you would have to die if you were forbidden to write. This most of all: ask yourself in the most silent hour of your night: must I write? Dig into yourself for a deep answer. And if this answer rings out in assent, if you meet this solemn question with a strong, simple "I must," then build your life in accordance with this necessity; your whole life, even into its humblest and most indifferent hour, must become a sign and witness to this impulse.”
How do I respond to this friend of a friend? Is data science or coding in general the path for you only if you would rather die than stop merging pull requests into main every day even when nobody is paying you?
Is coding the new poetry?
What do I tell this guy?
lots of "seniors" via title inflation dont have fundamentals anyways - hence a lot of broken software in the wild & also perverse incentives like Resume driven development. A.I is built on badly written open source code.
because once you have the fundamentals, built a few things - you would've battle scars which makes someone a senior
not the 'senior' we see in big corps or places cosplaying where promos are based on playing politics.
You're totally right. 10 minutes on /r/cscareerquestions (without even sorting by `top`, though it's more brutal if you do) is enough to confirm it.
I normally wouldn't cite Reddit as a source, but this same subreddit was overflowing with posts on fending off recruiters and negotiating already-sky-high comp packages just two years ago. Seeing how quickly the tables turned is sobering.
There has been a cultural shift too. I don't know when it got started, but at least in tech companies, employees started to get more and more obsessed with promotions. So-called career development is nothing but a coded phrase for getting promoted. Managers use promotion as a tool to retain talent and to expand their territories. Companies adapted to this culture too. As a result, people development increasingly became lip service.
Has anyone ever seen a manager mentoring ICs? I haven't. This is a senior/staff/principal responsibility.
I have been unable to get a tech job for months, so I've looked into retraining in a new field, and every single one has some large up-front cost, whether it's paying for schooling or situations like mechanics needing to bring their own tools.
The standard US company has completely shed all training costs and put the expectations on laborers to train themselves. And you’re shit out of luck if their requirements change during your training as so many college graduates who picked comp sci are currently learning
Because those senior people will NOT be around forever. And they have killed their talent development and knowledge transfer pipelines.
Either direction you take it, this feels like a lose-lose situation for everyone.
This is because "management" includes a bunch of BS that few engineers want to actually deal with. Performance discussions, 1:1s, being hauled into mandatory upper-level meetings, not actually building things anymore, etc. If it was simply pairing with juniors from time to time to hack on things and show them cool stuff, it would be wonderful.
If that were to actually happen, we'd wind up excluding many of our greatest technical performers while drowning in a sea of would-be middle managers. People skills matter, but so do many other strengths that don't always overlap with being naturally good at navigating interpersonal dynamics.
This seems like a deeply flawed take on the dual-track IC/management ladder. Senior ICs don't keep plugging away by themselves because they're not managers; they just don't get people-management tasks. I think the leadership and mentorship they provide is harder than what I do as a manager, because they don't have the hammer of a "manager" job title and need to earn all their credibility. I have not had a senior (or above) IC in more than 10 years who didn't have a significant amount of junior and intermediate development explicitly defined in their role, and the easiest way to get promoted is with leverage: try to be 20% better than your peers with your contribution (hard), or make 10 people 3% better (much easier).
Interesting observation. I have personally tried to avoid getting into people manager positions (as I believed I'd be Peter Principled) but always took it as my duty to share knowledge and mentor the curious and the hungry (and even the ones that are not so). It's actually a very rewarding feeling when I hear good things about people who learned with me.
I'm quite sure I could take my cousin who has never heard the word HTML and get her to be a better programmer than the average CS graduate within 4 years of tutelage. Why is this the case? Four years is a very long time, and the universities are wasting that time. I'm certain any driven individual would end up a mid-level candidate if they skipped college and instead trained themselves for that time, especially if they hire a senior tutor for far less than the cost of the college tuition.
https://metr.org/blog/2025-03-19-measuring-ai-ability-to-com...
It's a double-edged sword too. I see it in my biz -- it's easier to spend 40 hours training a model to do things the way we like than to hire someone junior and spend a month+ on onboarding. We're hitting a bit of a wall with clients still wanting to talk to a real person, but I can see that changing in the next ~5 years. Zero idea what happens to those junior folks that used to get trained (me being one that sat through a 3mo onboarding program!).
> The most common answer from students when asked what they needed was a mentor who had just been in their shoes a few years ago, a surprising and heartening answer.
Mentoring is difficult, especially in today's world, where we are taught to despise older folks and encouraged to treat everyone that we work with as competitors.
For myself, I'm happily retired from the Rodent Rally, and find that LLMs have been a huge help, when learning new stuff.
Single-Payer health care would help our industry immensely if it came to pass.
Imagine having no fear any more.
Looking back, this has absolutely been the case for me personally. My first job out of school was at a startup spun off from a lab where a friend from my CS classes had been working while at school. I just referred somebody who was eventually hired that I've worked with at two other employers in the past.
Maybe chatting with a LLM with access to the codebase is equally effective as pair programming with a human. I don't have enough experience doing that yet to know. I still see it as another tool.
I've found it helps to have various levels of experience on a team. I think one reason for this is people with less experience (hopefully) ask a lot of questions to fill knowledge gaps. These conversations can lead to revisiting designs, practices, etc. and a better outcome overall.
Firstly, we've been here before, specifically in 2008. This was the real impact of the GFC. The junior hiring pipeline got decimated in many industries and never returned. This has created problems for an entire generation (i.e. the millennials) who went to college and accumulated massive amounts of debt for careers that never eventuated. Many of those careers existed before 2008.
The long-term consequences of this are still playing out. It's delaying life milestones like finding a partner, buying a house, having a family and generally just having security of any kind.
Secondly, there is a whole host of other industries this has affected that the author could have pointed to. The most obvious is the entertainment industry.
You may have asked "why do we need to wait 3 years between seasons of 8 episodes now when we used to put out 22 episodes a year?" It's a good question, and the answer is this exact same kind of cost-cutting. Writers' rooms got smaller, and typically now the entire season is written and then produced when the writers are no longer there, with the exception of the showrunner, who is the head writer.
So writers are rarely on set now. This was the training ground for future showrunners. Also, writers were employed for 9 months or more for the 22-episode run and now they're employed for maybe 3 months, so they need multiple jobs a year. Getting jobs in this industry is hard and time-consuming, and the timing just may not work out.
Plus the real cost of streaming is how it destroyed residuals, because Netflix (etc) are paying far fewer residuals (because they're showing their own original content), and those residuals sustained workers in the entertainment industry so they could have long-term careers and that experience wouldn't be lost. The LA entertainment industry is in a dire state for these reasons, and also because a lot of it is being offshored to further reduce costs.
Bear in mind that the old system produced cultural touchstones and absolute cash cows eg Seinfeld, Friends, ER.
Circling back, the entire goal of AI is to displace workers and cut costs. That's it. It's no more complicated than that. And yes, junior workers and less-skilled workers will suffer first and the most. But those junior engineers would otherwise be future senior engineers.
What I would like people to understand is that all of this is about short-term decisions to cut costs. It's no more complicated than that.
That opportunity is now lost. In a few years we will lack senior engineers because right now we lack junior engineers.
All is not lost however. Some companies are hiring junior engineers and giving them AI, and telling them to learn how to use AI to do their job. These will be our seniors of the future.
But my bigger concern is that every year the AI models become more capable, so as the "lost ladder" moves up, the AI models will keep filling in the gaps, until they can do the work of a Senior supervised by a Staff, then the work of a Staff supervised by a Principal, and so on.
The good news is that this is a good antidote to the other problem in our industry -- a lot of people got into software engineering for the money in the last few decades, not for the joy of programming. These are the folks that will be replaced first, leaving only those who truly love solving the hardest problems.
What they show is that hiring managers think they can use gen AI to get away with skipping juniors. The resulting collapse in software quality will either bite them in the ass, causing a market correction, or massively enrich the big five and leave the rest of us to live with the consequences. Which outcome comes to pass is yet to be seen (and partially under our control, as seniors).
I would much rather have that junior take some hacks at building some features with AI under my guidance than context switch over to AI myself just to walk it through a task, which means explaining the business and our business rules over and over again.
To me, cutting out a junior developer creates more work for senior developers rather than making their work lighter.
They do not. Mentoring is rewarding work, but it is work.
I also find it objectionable that if you're simply not interested in mentoring, you're a jerk. Some people just aren't good at it, some people are genuinely swamped with existing responsibilities, and some people might just want to focus on their goals... and that's fine. There is no but.
Some folks <gasp> just don't like other people that much, and prefer working alone. Also fine, and kudos for being self-aware enough to not inflict yourself on people who probably wouldn't enjoy your oversight either. This should be celebrated as a communications success.
All of which brings me to the truth: if a company wants to mentor junior developers - and there are many, many excellent reasons to develop talent long-term - then they should make sure that they have suitably experienced people who have opted-in to mentorship, and make sure that their success metrics and remuneration reflect the fact that a significant portion of their time is acknowledged to be dedicated to mentorship. Otherwise, you're describing a recipe for legitimate resentment.
Likewise, if you're a junior developer desperate for mentorship... I understand that your instinct is to take any offer that will have you. But if you're able to have an honest conversation with the recruiter about what kind of mentorship culture exists in a company, you might be saving everyone a lot of pain and frustration.
Another reason might be the lower potential of the new generation of graduates (see falling math scores and possibly IQ). They might have interviewed some, been disappointed, and become unwilling to hire from this pool.
Last but not least, there's the current trend of substituting LLMs for humans.
It's the bloated junior salaries that have killed their market. I never like hiring juniors, I never like working with juniors, and I'd rather pay the extra 20-30% and get someone more experienced. I'm sorry, but if you don't get into FANG, you should basically be working for nothing until you have some experience. It's cruel, it's not fair, but it's just not worth it for the employer. Especially in today's world where there is no company loyalty.
All this BS about AI taking away the stuff that juniors did, in my field, software development, that was never the case. I never worked in a place where the juniors had different work than the seniors. We all did the same things, except the juniors sucked at it, and required handholding, and it would have been faster and better if they weren't there.
The real trick is finding companies that do very simple work, simple enough that juniors can thrive on day one. It won't be the best experience, but it is experience, and the rest is what you make of it.
Instead of only funding universities, lower the risk curve for hiring juniors where the jobs are.
The big issue is the game theory of first mover disadvantage at play.
Whoever trains the junior loses all the investment when the junior jumps ship. This creates a natural situation of every company holding off until the ‘foolish ones’ (in their eyes) waste resources on training.
Second mover advantage is real. This is what the government can fix.
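The point can be made concrete with a toy two-firm game; the payoff numbers below are invented purely to illustrate the first-mover disadvantage, not measured from anywhere:

    # Toy "train vs. poach" game between two firms.
    # payoffs[(a, b)] = (payoff to firm A, payoff to firm B); values are made up.
    payoffs = {
        ("train", "train"): (2, 2),    # both invest, both keep some talent
        ("train", "poach"): (-1, 3),   # A pays for training, B hires the junior away
        ("poach", "train"): (3, -1),
        ("poach", "poach"): (0, 0),    # nobody trains, the pipeline dries up
    }

    options = ("train", "poach")
    for b_choice in options:
        best_a = max(options, key=lambda a: payoffs[(a, b_choice)][0])
        print(f"If the other firm plays {b_choice!r}, firm A's best response is {best_a!r}")

With these assumed payoffs, "poach" is firm A's best response either way: a prisoner's-dilemma-style equilibrium where no single firm wants to fund the junior pipeline, which is the coordination failure a subsidy could break.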
And then we have others claiming that AI is already having such a significant impact on hiring that the effects are clearly visible in the statistics.
They forgot to add in "Aging billionaires spend a trillion dollars on longevity research" which results in "110 year old Senior engineers still working"
Furthermore, this is why the humanities matter: because human relationships matter.
Not to mention I'm the only white person on my team other than the owner/operator. They already brought in bots of sorts from overseas. The constant drive toward cheaper labor and the gutting of the American middle class has been vast compared to whatever suffering the industry will see from junior developers using AI. It's definitely made my job easier. And I really don't care. No one cared about me. I have relatively low pay, no health insurance, and no 401K. When the last person left, management replied to his goodbye email saying he'd be replaced in a week. And then they proceeded to try to hire someone in Mexico City: maintain the same time zone, but pay third-world wages, and likely hold coercive control over them through desperation. They never found anyone.
I have no love for this industry or any of the "woes" it'll have with AI. Overall it's going to lead to lower wages and fewer jobs. For those out there producing "AI slop", I support you. It's hardly what they deserve, but they've earned it.
I didn't have the best networking skills, to be fair, but I spent most of my college years doing remote classes and didn't have much of a chance to network or whatever. I'm thinking about doing grad school so I can have another chance at developing some kind of network or make myself more attractive to employers. My grades were good and I genuinely enjoy computer science, so spending half a year improving my portfolio sounds like a fun time. But going to grad school wouldn't really be about getting employment; I just want to use my brain for something. Just working a job makes me feel completely insane, like I know that I can do so much more. I feel like I'm wasting the best years of my life and there's no place in this market where I can be useful. The only value I have is selling my body or being a human stand-in for a robot at my "real" job.
Maybe this isn't the best place to post this. I have very little hope that I will ever get a job programming, and I'm just sad. What a waste of a life the past 5 years have been.
On a kind of funny note, I would say that doing sex work is genuinely less humiliating than applying to jobs as a new compsci grad. At least I have some signal that I have some value selling myself.
The continued reliance on say, COBOL, and the complete lack of those developers comes to mind.
Even before LLMs, there were periods recently where multiple companies had "senior only" hiring policies. That just inflated what "senior" was until it was basically 5 years of experience.
This time seems a bit different, however. There are both supply-side and demand-side problems. The supply of students is tainted with AI "learning" now. Colleges haven't realized that they absolutely have to crack down effectively on AI, or the signal of their degrees will wither to nothing. The demand side is also low, of course, since the candidates aren't good, and AI seems to be a good substitute for a newly graduated hire, especially if that hire is just going to use the AI badly.
apologise for inflicting this era on them and teach them to be entrepreneurial, teach them how to build, teach them rust on the backend, teach them postgres, teach them about assets maintaining value while money loses its value
tell them to never under any circumstances take on a mortgage, especially not the 50 year variety. tell them to stay at home for as long as possible and save as much as possible and put it into assets: gold, silver, bitcoin, monero
they must escape the permanent underclass, nothing else matters
We're not hiring a lot of rotary phone makers these days.
Who is hiring their own shoe-smith? It's been 30-ish years since my carpenter father last had work boots resoled.
It's almost as if... technology and economy evolve over time.
For all the arguments software people make about freedom to use their property as they see fit, they ignore that non-programmers' use of personal technology is coupled to the opinions of programmers. Programmers ignore how they are middlemen of a sort they often deride as taking away the programmer's freedom! A very hypocritical group, them programmers.
What's so high tech about configuration of machines with lexical constructs as was the norm 60+ years ago? Seems a bit old fashioned.
Programmers are biology and biology has a tendency to be nostalgic, clingy, and self selecting. Which is all programmers are engaged in when they complain others won't need their skills.
As far as I hear, the projects for our customers (finance, insurance, government, etc.) aren't hiring juniors anymore either.
In one of the meetings someone asked: when I leave or retire, we won't have anyone to replace me, because we don't hire juniors. Management replied: that is not our problem to discuss.
Aka a problem for the future, and for someone else.
There is an unbounded amount of opportunity available for those who want to grab hold of it.
If you want to rely on school and get the approval of the corporate machine, you are subject to the whims of their circumstance.
Or, you can go home, put in the work, learn the tech, become the expert, and punch your own ticket. The information is freely available. Your time is your own.
Put. In. The. Work.
new grads will be fed to the meat grinder with no regard; it's a closed shop unless you know someone
The general population is being rapidly sacked as a 'necessary' expense of criminal elites.
No one should be happy about this.
The article is self-serving in identifying the solutions ("do things related to the service we offer, and if that doesn't work, buy our service to help you do them better"), but it is a subject worth talking about, so I will offer my refutation of their analysis and solution.
The first point I'd like to make is that while the hiring market is shrinking, I believe it was long overdue and that the root cause is not "LLMs are takin' our jerbs", but rather the fact that for probably the better part of two decades, the software development field has been plagued by especially unproductive workers. There are a great deal of college graduates who entered the field because they were promised it was the easiest path to a highly lucrative career, who never once wrote a line of code outside of their coursework, who then entered a workforce that values credentialism over merit, who then dragged their teams down by knowing virtually nothing about programming. Productive software engineers are typically compensated within a range of at most a few hundred thousand dollars, but productive software engineers generally create millions in value for their companies, leading to a lot of excess income, some of which can be wasted on inefficient hiring practices without being felt. This was bound for a correction eventually, and LLMs just happened to be the excuse needed for layoffs and reduced hiring of unproductive employees[1].
Therefore, I believe the premise that you need to focus entirely on doing things an LLM can't -- networking with humans -- is deeply faulty. This implies that it is no longer possible to compete with LLMs on engineering merit, and I could not possibly disagree more. Rather than following their path forward, which emphasises only networking, my actual suggestion to prospective junior engineers is: build things. Gain experience on your own. Make a portfolio that will wow someone. Programming is a field that doesn't require apprenticeship. There is not a single other discipline that has as much learning material available as software development, and you can learn by doing, seeing the pain points that crop up in your own code and then finding solutions for them.
Yes, this entails programming as a hobby, doing countless hours of unpaid programming for neither school nor job. If you can't do that much, you will never develop the skills to be a genuinely good programmer -- that applied just as much before this supposed crisis, because the kind of junior engineer who never codes on their own time was not being given the mentorship to turn into a good engineer, but rather was given the guidance to turn them into a gear that was minimally useful and only capable of following rote instructions, often poorly. It is true that the path of the career-only programmer who goes through life without spending their own time doing coding is being closed off. But it was never sustainable anyways. If you don't love programming for its own sake, this field is not likely to reward you going forward. University courses do not teach nearly effectively enough to make even a hireable junior engineer, so you must take your education into your own hands.
[1] Of course, layoff processes are often handled just as incompetently as hiring processes, leading to some productive engineers getting in the crossfire of decisions that should mostly hurt unproductive engineers. I'm sympathetic to people who have struggled with this, but I do believe productive engineers still have a huge edge over unproductive engineers and are highly likely to find success despite the flaws in human resource management.
Good luck with causation/correlation vs the rise of LLM.
It wasn't too long ago that it was common to read threads on HN and other tech fora about universities graduating software engineers seriously lacking coding skills. This was evidenced by often-torturous interview processes that would herd dozens to hundreds of applicants through filters to, among other things, rank them based on their ability to, well, understand and write software.
This process is inefficient, slow and expensive. Companies would much rather be able to trust that a CS degree carries with it a level of competence commensurate with what the degree implies. Sadly, they cannot, still, today, they cannot.
And so, the root cause of the issue isn't AI or LLMs, it's universities churning people through programs and granting degrees that oftentimes mean very little other than "spent at least four years pretending to learn something".
If you are thinking that certain CS-degree-granting universities could be classified as scams, you might be right.
And so, anyone with half a braincell will, today, look at the availability of LLM tools for coding as a way to stop (or reduce) the insanity and be able to get on with business without having to deal with as much of the nonsense.
Nobody here makes a product or offers a service (hardware, software, anything) for the love of the art. We make things to solve problems for people and businesses. That's why you exist. Not to look after a social contract (as a comment suggested). Sorry, that's nonsense. The company making spark plugs makes spark plugs; they are not on this planet to support some imaginary public good. Solving the problem is how they contribute.
And, in order to solve problems, you need people who are capable of deploying the skills necessary to do so. If universities are graduating people who can barely make a contribution to the mission at hand, companies are going to always look for ways to mitigate that blocking element. Today, LLM's are starting to provide that solution.
So it isn't about greed or some other nonsense idealistic view of the universe. If I can't hire capable people, I will gladly give senior engineers more tools to support the work they have to do.
As is often the case, the solution to so many problems today --including this one-- is found in education. Our universities need to be set up to succeed or fail based on the quality of the education they deliver. This has almost never been the case. Which means you have large-scale degree-farming operations granting credentials that can easily be dwarfed by an LLM.
And don't think that this is only a problem at the entry level. I recently worked with a CTO who, to someone with experience, was so utterly unqualified for the job it was just astounding that he had been given the position in the first place. It was clearly a case of him not knowing just how much he didn't know. It didn't take much to make the case for replacing him with a qualified individual or risk damage to the company's products and reputation going forward.
A knowledgeable entry-level professional who also has solid AI-as-a-tool skills is invaluable. Note that first they have to come out of university with real skills. They cannot acquire those after the fact. Not any more.
NOTE: To the inevitable naive socialist/communist-leaning folks in our mix. Love your enthusiasm and innocence, but, no, companies do not exist to make a profit. Try starting one for once in your naive life with that specific mission as your guiding principle and see how far you'll get.
Companies succeed by solving problems for people and other companies. Their clients and customers exchange currency for the value they deliver. The amount they are willing to pay is proportionate to the value of the problem being solved as perceived by the customer --and only the customer.
Company management has to charge more than the mere raw cost of the product or service for a massive range of reasons that I cannot possibly list here. A simple case might be having to spend millions of dollars and devote years (=cost) to creating such solutions. And, responsible companies, will charge enough to be able to support ongoing work, R&D, operations, etc. and have enough funds on hand to survive the inevitable market downturns. Without this, they would have to let half the employees go every M.N years just because of natural business cycles.
So, yeah, before you go off talking about businesses like you've never started or run a non-trivial anything (believe me, it is blatantly obvious when reading your comments), you might want to make an attempt to understand that your stupid Marxist professors or sources had absolutely no clue, were talking out of their asses, never started or ran a business, and everything they pounded into your brains fails the most basic tests with objective, on-the-ground, skin-in-the-game reality.