I would be more excited about this concept if this shared brain wasn't owned by rich, powerful people who will most certainly deploy this power in ways that benefit themselves to the detriment of everyone else.
I agree that we're not all about to be replaced with AGI, that there is still a need for junior engineers, and with several of these points.
None of those arguments matter if the C-suite doesn't care and keeps doing crazy layoffs so they can buy more GPUs. Intelligence and rationality are far less common among that group than following the cargo cult.
The problem is that only a fraction of software developers have the ability and skills to work on the hard problems. A much larger percentage will only be able to work on things like CRUD apps and grunt work.
When these jobs are eliminated, many developers will be out of work.
Yeah, no, that is always promised, but historically it has never been true. On the contrary: every technical revolution has brought great short-term suffering to the majority and, in the long term, served to alienate people from their work.
It has been creative people, translators, writers, editors, artists, musicians, who most fear losing their jobs to generative AI.
What is more fulfilling: creating hand-crafted items, or being a cog in the assembly line? Writing your own code, or micromanaging an AI?
That doesn't mean progress is inherently bad, but it is a political question. Will the productivity gains allow us to work less and enjoy life more, or will they just make rich people richer? Currently the rich are winning, but the wind can change.
Personally, I think this whole shift might actually be better for young people early in their careers. They can change direction more easily, and in a weird way, AI kind of puts everyone back at the starting line. Stuff that used to take years to master, you can now learn—or get help with—in minutes. I’ve had interns solve problems way faster and smarter than me just because they knew how to use AI tools better. That’s been a real wake-up call.
I’m doing my best to treat AI as a teammate. It really does help with productivity. But the world never stops changing, and that’s exhausting sometimes. I try to keep learning, keep practicing, and keep adapting. And yeah, if I ever lose my job because of AI... ok, fine, I’ll have to change and try getting another, maybe different, job. Easy to say, harder to do, but that mindset at least helps me not spiral.
Financial leverage: It is cheap to write code, and experiments are inexpensive. A million dollars in funding for a B2B SaaS buys many more shots on goal than a million dollars in funding for drug research or manufacturing. This increases the probability of ROI and permits aggressive investment.
Operational leverage: Scaling code is cheap as well. It is free to copy. Solving one problem with software well often enables solving immediately adjacent problems very cheaply.
Do LLMs decrease or increase the leverage here?
Writing code is cheaper, a single engineer can now do much more. Does that endanger engineers? Yes, if their job is "take requirements and implement to spec". No, if their job is "solve important business problems at scale". The former are already typically not valued or paid exceptionally highly. The latter are likely to be valued and paid even more than they already are.
Or put another way: if software engineers are going to be hurt by LLMs, who is going to benefit? That assumes a zero-sum game, which I would dispute here. But if not software engineers, then who is better positioned to wield LLMs effectively?
"Shared" as in shareholder?
"Shared" as in piracy?
"Shared" as in monthly subscription?
"Shared" as in sharing the wealth when you lose your job to AI?
I’m looking at this as the landscape of the tools changing, so personally, I just keep looking for ways to use those tools to solve problems and make people's lives easier (by offering new products, etc). Once the perspective is broadened, it feels like an enabler rather than a threat.
That being said, AI and I have written some amazing programs to produce beautiful graphics I use on-the-air. And it's all in Python, a language I can read and not write.
If I can do this, you should be scared.
It went round in circles doubting itself. When I challenged it, it would redo or undo too much of its work instead of focussing on what I'm asking it about. It seemed to be desperate to please me, backing down to my challenging it.
I.e., depending on it turned me into a junior coder: overly complex code, jumping to code without enough thought, etc.
Yes, yes, I'm holding it wrong.
The code they create seems to produce messes that are then also "solved" by AI. Huge sprawling messes. Funny, that. God help us if we need to clean up these messes once AI dies down.
In the short term, yes, but we're already seeing nearly autonomous agents get impressive results. It won't be long until the average person can be that guiding hand, rather than a software engineer who knows how to code by hand and design software. This is good for the world, terrible for the software dev.
I just wish they were forced to publish open weights in return for using copyrighted materials in the "brain".
Back 200 years ago, most people were illiterate. If you could read and write, there was work. You could become part of the machinery of the beginning industrial revolution, actually quite an important cog. Just by knowing how to read and write, someone would need your skills to coordinate stuff, mostly mundane. But it meant you had a way into a business. You might convert your clerkship into accountancy or law, or you would become a manager, knowledgeable about whatever business you were working in.
As time passed, everyone became literate. Knowing how to read and write stopped being the only thing you needed to get on the career ladder.
When I started working, my boss had no degree. He had energy, and he could do arithmetic. This got him a job as a young man running around with slips of paper in the LIFFE pit. Eventually he learned how option trading worked.
He got older and hired me. By this time, you could easily find a highly numerate graduate, and only such people were considered for finance roles. It was enough to have an Oxbridge degree and just sort of be smart enough to figure out coding on the job.
Now, when I look at the new grads, they blow me away. They can already code quite well. They already have internships in the business. They already have an idea of what alpha is, and how to find it. They are well on their way to just being quantitative trading professionals.
We are in an interim period similar to the expansion of literacy. The school system has not ramped up computer literacy in the way it successfully got most kids to be able to read and write.
Until there are lots of people able to code, there will be lots of programming jobs: that is, jobs where the person is in the seat because they can code. Much as in 1825 there were clerking jobs for men who could read and write.
Or so we thought.
Now there is a tool that allows the business side to make code. It's not even that terrible code in my opinion, and it will only get better. It's here, and if you know what the business needs, you can use it to further the business goals.
The great divide that will open up is that developers who got into the business because they could code are now in a bit of a wonderland. They not only know what code is needed, they can implement it without their friends further down the chain.
People who are just finishing a course in how to code face a bit of a struggle. On one hand, it's an important skill. On the other hand, for that skill to pay, you need to jump the gap that was once a stable existence. Coding might not be its own skill any longer; you might need domain knowledge at a whole higher level.
It's not only AI, it's rampant ageism as well: in this field you are "old" by your late 20s. It's a hype-driven, youth-obsessed career.