What software developers actually do is closer to the role of an architect in construction or a design engineer in manufacturing. They design new blueprints for the compilers to churn out. Like any design job, this needs some actual taste and insight into the particular circumstances. That has always been the difficult part of commercial software production and LLMs generally don't help with that.
It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.
Physical goods like clothes or cars have variable costs. The marginal unit always costs > 0, and thus the price to the consumer is always greater than zero. Industrialization lowered this variable cost while simultaneously increasing production capacity, and thus enabled a new segment of "low cost, high volume" products, but it does not eliminate the variable cost. This variable cost (e.g. the cost of a hand-made suit) is the "umbrella" under which a low-cost variant (factory-made clothes) has space to enter the market.
Digital goods have zero marginal cost. Many digital goods do not cost the consumer anything at all! Or they are as cheap as possible to actively maximize users, because their costs are effectively fixed. What is the "low value / low cost" version of Google? Or Netflix, for that matter? The question is nonsensical because there's no space for a low-cost entrant to play in when the price is already free.
In digital goods, consumers tend to choose on quality because price is just not that relevant of a dimension. You see this in the market structure of digital goods. They tend to be winner (or few) take all because the best good can serve everyone. That is a direct result of zero marginal cost.
Even if you accept the premise that AI will make software "industrialized" and thus cheaper to produce, it doesn't change the fact that most software is already free or dirt cheap.
The version of this that might make sense is software that is currently too expensive to make at all, because the market size (e.g. number of consumers × price they would pay) is less than the cost of the software developer's or entrepreneur's time. But by definition those are small markets, nothing like the huge markets that were enabled by the industrialization of physical goods.
Maybe there'll be an enormous leap again, but I just don't quite see how this gets you to 'industrial' software. It made development a lot faster, don't get me wrong, but you still needed the captain driving the ship.
Take this for example:
> Industrial systems reliably create economic pressure toward excess, low quality goods.
Industrial systems allow for low quality goods, but they also deliver quality way beyond what can be achieved in artisanal production. A mass-produced mid-tier car is going to be much better than your artisanal car.
Scale allows you not only to produce more cheaply, but also to take quality control to the extreme.
As a developer for almost 30 years now, if I think about where most of my code went, I would say, quantitatively: into the bin.
I processed a lot of data, dumps, and logs over the years. I collected statistical information, mapped flows, and created models of the things I needed to understand. And this was long before any "big data" thing.
Nothing changed with AI. I keep doing the same things, but maybe the output has colours.
Another common misconception is that it is now easier to compete with big products, because the cost of building those products will go down. Maybe you think you can build your own Office suite and compete with MS Office, or build an SAP with better features and quality. But what went into this software is not just code, but decades of feedback, tuning, and fixing. The industrialization of software cannot provide that.
People use software for specific features, but most software has lots of features people never use or need. A lot of modern software is designed to handle lots of users, so it needs to be scalable, deployable, etc.
I don't need any of that. I just need the tool to do the thing I want it to do. I'm not thinking about end users, I just need to solve my specific problem. Sure there might be better pieces of software out there, which do more things. But the vibe coded thing works quite well for me and I can always fix it by prompting the model.
For example, I've vibe coded a tool where I upload an audio file, the tool transcribes it and splits it into 'scenes' which I can sync to audio via a simple UI and then I can generate images for each scene. Then it exports the video. It's simple, a bit buggy, lacks some features, but it does the job.
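Roughly, the pipeline the model built looks something like this. This is a simplified sketch rather than the actual code; whisper and moviepy here stand in for whatever transcription and export libraries get used, and generate_image is just a placeholder for the image model call:

```python
# Simplified sketch of the pipeline, not the actual tool.
# Assumes openai-whisper for transcription and moviepy (1.x API) for export;
# generate_image() is a placeholder for whichever image model gets called.
import whisper
from moviepy.editor import AudioFileClip, ImageClip, concatenate_videoclips


def transcribe(audio_path):
    """Return timestamped transcript segments for the uploaded audio."""
    model = whisper.load_model("base")
    result = model.transcribe(audio_path)
    return result["segments"]  # each segment has 'start', 'end', 'text'


def split_into_scenes(segments, max_gap=2.0):
    """Group consecutive segments into scenes, breaking on long pauses."""
    scenes = []
    for seg in segments:
        if scenes and seg["start"] - scenes[-1]["end"] < max_gap:
            scenes[-1]["end"] = seg["end"]
            scenes[-1]["text"] += " " + seg["text"]
        else:
            scenes.append(dict(start=seg["start"], end=seg["end"], text=seg["text"]))
    return scenes


def generate_image(prompt):
    """Placeholder: call an image model with the scene text, return a file path."""
    raise NotImplementedError


def export_video(audio_path, scenes, out_path):
    """Show one still image per scene over the original audio track."""
    clips = [
        ImageClip(generate_image(s["text"])).set_duration(s["end"] - s["start"])
        for s in scenes
    ]
    video = concatenate_videoclips(clips, method="compose")
    video = video.set_audio(AudioFileClip(audio_path))
    video.write_videofile(out_path, fps=24)
```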
It would have taken me weeks to get to where I am now, and I haven't written a single line of code manually.
I need the generated videos, not the software. I might eventually turn it into a product which others can use, but I don't focus on that yet, I'm solving my problem. Which simplifies the software a lot.
After I'm finished with this one, I might generate another one, now that I know exactly what I want it to do and what pitfalls to avoid. But yeah, the age of industrial software is upon us. We'll have to adapt.
The whole premise of AI bringing democratization to software development and letting any layperson produce software signals a gross misunderstanding of how software development works and the requirements it should fulfill.
The idea of automation creating a massive amount of software sounds ridiculous. Why would we need that? More games? They can only be consumed at the pace of the player. Agents? They can be reused once they fulfill a task sufficiently.
We're probably going to see a huge amount of customization, where existing software is adapted to a specific use case or user via LLMs, but why would anyone waste energy re-creating the same algorithms over and over again?
The mass production of unprocessed food is not what led to the production of hyper processed food. That would be a strange market dynamic.
Shareholder pressure, aggressive marketing and engineering for super-palatable foods are what led to hyper processed foods.
In fact, getting out of the two quadrant mindset, and seeking the third, is part of the learning process for developing modern industrial products. And where I work, adjacent to a software development department, I think the devs are aware of this as well. They wanted to benefit from further automation -- the thing I think they're coping with is that it seems to be happening so quickly.
I wonder if this will lead to more "forks for one person" where you know of open source software that's close to what you want, except for one thing, so you point a coding agent at it.
High-level languages are about higher abstractions for deterministic processes. LLMs are not necessarily higher abstractions but instead about non-deterministic processes, a fundamentally different thing altogether.
- Tailored suit: This is a high-cost and high-value thing. Both quality and fit are much better than fast-fashion.
In a similar sense, maybe LLMs will produce the frameworks or libraries in the future, akin to the "fabric" used by the tailors. But in the end, craftsmen and craftswomen are the ones architecting and stitching these together.
Verbatim [1]:
> Will traditional software survive?
> Ultraprocessed foods are, of course, not the only game in town. There is a thriving and growing demand for healthy, sustainable production of foodstuffs, largely in response to the harmful effects of industrialisation. Is it possible that software might also resist mechanisation through the growth of an “organic software” movement? If we look at other sectors, we see that even those with the highest levels of industrialisation also still benefit from small-scale, human-led production as part of the spectrum of output.
> For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today's world of fast, disposable, exploitative fashion. And yet handcrafted clothes still exist: from tailored suits to knitted scarves, a place still exists for small-scale, slow production of textile goods, for reasons ranging from customisation of fit, signalling of wealth, durability of product, up to enjoyment of the craft as a pastime.

There is a difference between writing for mainstream software and someone's idea/hope for the future.
Software that is valued high enough will be owned and maintained.
Like most things in our world, ownership/stewardship is, I think, a social issue/question, much like money and world hunger.
The difference I return to again and again isn't tech depth. It's constraints.
Rough framework I'm using lately:
- Consumer software aims at maximizing joy.
- Enterprise software is all about coordination.
- Industrial software operates in the "mess" of the real world, yet it still has to behave predictably.
Industrial stuff appears to be more concerned with:
- failure modes
- long-term maintenance
- predictable behavior vs cleverness
But as soon as software is involved with physical processes, the tolerance for ambiguity narrows quickly.
Curious how others see it: What's your mental line between enterprise and industrial? What constraints have affected your designs? Any instances where "nice abstractions" failed the test of reality?
The important thing is that goods != software. I, as an end user of software, rarely need specialized software. I don't need an entire app generated on the spot to split the bill and track the difference if I have a calculator.
So, yes, we are industrializing software, but this reach that people talk about (I believe) will be severely limited.
What are the constraints with LLMs? Will an Anthropic, Google, OpenAI, etc, constrain how much we can consume? What is the value of any piece of software if anyone can produce everything? The same applies to everything we're suddenly able to produce. What is the value of a book if anyone can generate one? What is the value of a piece of art, if it requires zero skill to generate it?
something "simple" as reverse ETL - a lot of value is locked within that - & you can even see players such as Palantir etc trying to bring unified data view with a fancy name etc
it's also the same reason Workday, Salesforce etc charge a lot of money
Low cost / low value software tagged as disposable usually means the development cost was low but the maintenance cost is high; that's why you get rid of it.
On the other hand, the difference between good and bad traditional software is that, while the development cost is always going to be high, you want the maintenance cost to be low. This is what industrialization is about.
This paper about LLM economics seems relevant:
https://www.nber.org/papers/w34608
Quote: "Fifth, we estimate preliminary short-run price elasticities just above one, suggesting limited scope for Jevons-Paradox effects"
I am not threatened by LLMs. I would like it if I could code purely in requirements. But every time I get frustrated and just do it myself, because I am faster.
This whole article was interesting, but I really like the conclusion. I think the comparison to the externalized costs of industrialization, which we are finally facing without any easy out, is a good one to make. We've been on the same path for a long time in the software world, as evidenced by the persistent relevance of that one XKCD comic.
There's always going to be work to do in our field. How appealing that work is, and how we're treated as we do that work, is a wide open question.
This sounds weird, or wrong. Do anonymous stats need cookies at all?
First, the core of the argument, that 'industrialization' produces low-quality slop, is not true: industrialization is about precisely controlled and repeatable processes. A table cut by a CNC router is likely dimensionally more accurate than one cut by hand; in fact, many industrial processes and machines have trickled back into the toolboxes of master craftsmen, where they increased productivity and quality.
Second, from my experience of working at large enterprises and on smaller teams, the 80-20 rule definitely holds: there's always a core team of a handful of people who lay down the foundations and design and architect most of the code, with the rest usually fixing bugs or building bullet-point features.
I'm not saying the people who fall into the 80% don't contribute, or somehow are lesser devs, but they're mostly not well-positioned in the org to make major contributions, and another invariable aspect is that as features are added and complexity grows, along with legacy code, the effort needed to make a change, or understand and fix a bug grows superlinearly, meaning the 'last 10%' often takes as much or more effort than what came before.
This is hardly an original observation, and in today's ever-ongoing iteration environment, what counts as the last 10% is hard to define, but most modern software development is highly incremental, often is focused on building unneeded features, or sidegrade redesigns.
Oh wait. It is already a thing.
I have some programming ability and a lot of ideas, but would happily hire someone to realize those ideas for me. The idea I have put the most time into took me the better part of a year to sort out all the details of, even with the help of AI; most programmers could probably have done that in a night and, with AI, written the software in a few more. I would have my software for an affordable price, and they could stick it in their personal store so others could buy it. If I am productive with it and show its utility, they will sell more copies of it, so they have an incentive to work with people like me and help me realize my ideas.
Programming is going to become a service instead of an industry, the craft of programming will be for sale instead of software.
I use pointy-clicky software for visualizing relationships and correlations. I use draggy-droppy software for building complex data workflows (including ML elements). I use desktop publishing software rather than LaTeX. I suffer industrial interfaces for (most of) my own software (where I am the chief customer) because it's easy for me to sling simple server-side UIs, but there are standalone servers out there which make creating apps (with RFID, QR, and accelerometer support) just like, yes JUST like, desktop publishing (especially as you get closer to industrial control applications); one of my faves has a "choose your own adventure" dashboard widget so that users can create a dashboard customized just for them, yes, Inception (granted, that widget does need some configuration which requires a keyboard).
Granted, behind every one of those slick interfaces is a drippy gob of data. I have to create a CSV in the correct format for the visualizer, or surrender to their integrated "partner solutions"; or for a different visualizer I have to create some network services which it consumes for "enrichment". My data workflow tool has generic python "actions" so you can create custom tasks (it presents the data to your scripts using pandas). On very rare occasions I have a need to hack on DTP docs to format obscene amounts of repetitive data; but a lot of the time it's back to making a CSV and putting that in a spreadsheet / database which the software can then utilize for quaintly-named "mail merge". The UI software which I refer to integrates with SQL databases and authorization / access management engines, somebody still needs to set those up; and I stumbled across it in the first place because somebody needed a little (surprisingly little) help connecting to an obscure HTTP resource.
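To give a flavour of those generic python "actions": the tool hands your script the upstream step's output as a pandas DataFrame and expects one back, something like the sketch below. The function name and the columns (timestamp, source, event_id, is_error) are made up for illustration.

```python
import pandas as pd


# Rough shape of a custom "action" in that kind of workflow tool; the hook
# name and the column names are hypothetical, the point is DataFrame in,
# DataFrame out.
def run_action(df: pd.DataFrame) -> pd.DataFrame:
    # Normalise the timestamp column, then roll up per day and source so the
    # downstream visualizer gets the tidy CSV-shaped table it expects.
    df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
    df = df.dropna(subset=["timestamp"])
    daily = (
        df.assign(day=lambda d: d["timestamp"].dt.date)
          .groupby(["day", "source"], as_index=False)
          .agg(events=("event_id", "count"), errors=("is_error", "sum"))
    )
    return daily
```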
To short-circuit a bunch of off-track commentary: LLMs are not speaking English to other LLMs to design new LLMs, AFAIK. I have never seen a debate or article about whether it is better for LLMs to use English or Russian for this task. I just don't like the menu offered on this ship Titanic, and I'm uncomfortable with the majority of the passengers being incarcerated belowdecks; I'll book different passage, thanks.
The following is just disingenuous:
>industrialisation of printing processes led to paperback genre fiction
>industrialisation of agriculture led to ultraprocessed junk food
>industrialisation of digital image sensors led to user-generated video
Industrialization of printing was the necessary precondition for mass literacy and mass education. The industrialization of agriculture also ended hunger in all parts of the world which are able to practice it and even allows for export of food into countries which aren't (Without it most of humanity would still be plowing fields in order not to starve). The digital image sensor allows for accurate representations of the world around us.
The framing here is that industrialization degrades quality and makes products into disposable waste. While there is some truth to that, I think it is pretty undeniable that there are massive benefits which came with it. Mass produced products often are of superior quality and superior longevity and often are the only way in which certain products can be made available to large parts of the population.
>This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach.
This just is not true and goes against all available evidence, as well as basic economics.
>For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today’s world of fast, disposable, exploitative fashion.
This is just pure fiction. The author is comparing the highest quality goods at one point in time, which people took immense care of, with the lowest quality stuff people buy today, which is not even close to the mean of the clothing people buy. The truth is that fabrics have become far better, far more durable, and more versatile. The products have become better; what has changed is people's attitude towards their clothing.
Lastly, the author is ignoring the basic economics which separate software from physical goods. Physical goods need to be produced, which is almost always the most expensive part. This is not the case for software, distributing software millions of times is not expensive and only a minuscule part of the total costs. For fabrics industrialization has meant that development costs increased immensely, but per unit production costs fell sharply. What we are seeing with software is a slashing of development costs.
One of the things that happened around 2010, when we decided to effect a massive corporate change away from both legacy and proprietary platforms (on the one hand, away from AIX & Progress, and on the other hand, away from .Net/SQL Server), was a set of necessary decisions about the fundamental architecture of systems, and which -- if any -- third party libraries we would use to accelerate software development going forward.
On the back end side (mission critical OLTP & data input screens moving from Progress 4GL to Java+PostgreSQL) it was fairly straightforward: pick lean options and as few external tools as possible in order to ensure the dev team all completely understand the codebase, even if it made developing new features more time consuming sometimes.
On the front end, though, where the system config was done, as well as all the reporting and business analytics, it was less straightforward. There were multiple camps in the team, with some devs wanting to lean on 3rd party stuff as much as possible, others wanting to go all-in on TDD and using 3rd party frameworks and libraries only for UI items (stuff like Telerik, jQuery, etc), and a few having strong opinions about one thing but not others.
What I found was that in an organization with primarily junior engineers, many of whom were offshore, the best approach was not to focus on ideally "crafted" code. (I literally ran a test with a senior architect once where he & I documented the business requirements completely and he translated the reqs into functional tests, then handed over the tests to the offshore team to write code to pass. They mostly didn't even know what the code was for or what the overall system did, but they were competent enough to write code to pass tests.) This ensured the senior architect received something that helped him string everything together, but it also meant we ended up with a really convoluted codebase that was challenging to holistically interpret if you hadn't been on the team from the beginning. I had another architect, a lead in one of the offshore teams, who felt very strongly that code should be as simple as possible: descriptive naming, single-function classes, etc. I let him run with his paradigm on a different project, to see what would happen. In his case, he didn't focus on TDD and instead just on clearly written requirements docs. But his developers had a mix of talents & experience, and the checked-in code was all over the place. Because of how atomically abstract everything was, almost nobody understood how pieces of the system interrelated.
Both of these experiments led to a set of conclusions and an approach as we moved forward: clearly written business requirements, followed by technical specifications, are critical, and so is a set of coding standards the whole group understands and has confidence to follow. We set up an XP system to coach junior devs who were less experienced, ran regular show & tell sessions where individuals could talk about their work, and moved from a waterfall planning process to an iterative model. All of this sounds like common sense now that it's been standard in the tech industry for an entire generation, but it was not obvious or accepted in IT "Enterprise Apps" departments in low-margin industries until far more recently.
I left that role in 2015 to join a hyperscaler, and only recently (this year) have I moved back to a product company. What I've noticed now is that the collaborative nature of software engineering has never been better ... yet we're back to a point where many engineers don't fully understand what they're doing, either because of a heavy reliance on code they didn't write (common 3P libraries) or because of the compartmentalization of product orgs, where small teams don't always know what other teams are doing, or why. The more recent adoption of LLM-accelerated development means even fewer individuals can explain the resulting codebases. While software development may be faster than ever, I fear that as an industry we're moving back toward the early aughts, when the graybeard artisans had mostly retired and their replacements were fumbling around trying to figure out how to do things faster & cheaper and decidedly un-artisanally.