What's fascinating is that this value elevation seems to have gone largely unchallenged, despite being in essence an arbitrary value choice. Other choices were possible, and, I hope, still are, despite what the bigcorps declare must be the case in order to maximize shareholder returns.
I think really high-quality code can be created via coding agents. Not in one prompt, but through an orchestration of planning, implementing, validating, and reviewing.
It's still engineering work. The code still matters. It's just a different tool for writing the code.
I'd compare the difference between manually coding and operating a coding agent to the difference between a handsaw and a chainsaw - the end result is the same but the method is very different.
Rumor has it there were a few elite crafters among the lot. Software wizards who pondered systems and architecture over a $10 espresso macchiato.
When writing software for yourself, there is a bias towards implementing just the features you want and never mind the rest. Sometimes the result can be pretty sloppy, but it works.
However, code health is a choice. You just need to know what to ask for. A coding agent can be used as a power washer to tidy up a project. This won't result in great art, but like raking leaves or cleaning your steps or plowing a driveway, it can be satisfying.
Just as you wouldn't use a power washer to clean a painting, maybe there's some code that's too delicate to use a coding agent on? But for a project that has good tests and isn't that delicate, which I believe includes most web apps, nobody's going to want to pay for you to do it by hand anymore. It would be like paying someone to clear the snow in a parking lot with a shovel rather than hiring someone with a plow.
We seem to take everything for granted now and forget what real engineering is like.
This review is itself 13 years old:
https://www.theguardian.com/books/2012/mar/25/turings-cathed...
The standard: Forth words should be a few lines of code with a straightforward stack effect. Top level of a program might be 5 words.
An LLM will generate some subroutines and one big word of 20-50 lines of nested IF..THEN..ELSE and DO..WHILE, just as if it were writing C.
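For concreteness, here's a minimal sketch of that factored style, using hypothetical words made up for illustration: each definition is a line or two with a simple stack effect, and the top-level word just composes them.

  \ hypothetical example words, factored Forth-style
  : f>c        ( f -- c )     32 - 5 9 */ ;   \ Fahrenheit to Celsius
  : in-range?  ( c -- flag )  0 100 within ;  \ 0 <= c < 100 ?
  : report     ( flag -- )    if ." ok" else ." out of range" then ;
  : check-temp ( f -- )       f>c in-range? report ;

Typing "72 check-temp" prints "ok" (72 F is 22 C, integer math). An LLM left to its own devices tends to inline all of that into one big check-temp word instead.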
I remember when I first started programming 20 years ago, there was time to craft good-quality code. Then came more and more pushes to get code out faster, and no one really cared about quality; bugs became part of the cost of doing business. I think GenAI for code fits well into this more recent paradigm, and comparing it with the hand-crafted code of yore is a bit disingenuous, as appealing as that is, because most code hasn't been that good for a long time.
I am sad to admit it, but AI is just fitting in where poor coding practices already existed, and flourishing in that local maximum.
A nice example is guitar building: there's a whole bunch of luthiers who stick to traditional methods to build guitars, or even limit themselves to Japanese woodworking tools.
But that is not the only way to build great guitars. Excellent luthiers can build high-quality guitars with state-of-the-art tools. For example, Ulrich Teuffel uses all sorts of high tech, including CAD systems and CNC machines, to craft beautiful guitars: https://www.youtube.com/watch?v=GLZOxwmcFVo
Unfortunately, craftsmanship does not come cheap, so most customers will turn to industrially created products. Same for software.
That's the problem.
I think one promising direction for a shift is that humans do NOT like to talk to bots, especially not about anything important. It's biological. We evolved to learn from and interact with other humans, preferably the same group over a long time, so we really get to understand/mirror/like/support each other.
Unfortunately for the general populace, most technological improvements in information technology over the past five decades have led to a loss of political control and lessened their leverage for political change.
With AI, this change is going to be accelerated a hundredfold.
With current AI slop, and more importantly with AI-generated content that is almost indistinguishable from reality, the populace is slowly learning to reject what it sees and hears from mass media.
AI has muddied the pool so much that none of the fish (us) can see the whole pool. What this will lead to is political figures and bad actors creating isolation among people far more easily, almost with no effort at all.
No event will create a mass uprising, because no event will be believed by the masses at large. It will be easy to generate an alternative reality using the same AI.
Now the political class and the billionaire class are free to act with impunity, because the last check on their power, the ability of mass media to form public opinion and inspire the masses to demand change or accountability, has eroded past the point of no return. (They have already captured the institutions of public power.)
I fear for the future of humanity.
Edit: There are already troubling signs from the billionaire class on this front. There is a narrative of "ensuring guardrails" for AI, giving the populace the idea that once that is done, AI is acceptable. This is like saying, "better put a sheath on the knife so that no one can cut with it, but still use it as a prop in a movie."
They are creating this narrative that AI is inevitable.
They are fear-mongering that AI is going to take jobs, which it will, but this also goads the capable ones to get on the bandwagon and advance AI further.
I have recently started exploring AI coding -- note that I said AI coding and not vibe-coding, because that is for the brain-dead.
By AI coding, I mean I know the inputs, outputs, and structures the code should have, plus the necessary context to write it. I then articulate the requirements in English as best I can and feed them to agents.
Needless to say, the code is pathetic; it chooses to implement meaningless abstractions even after I explicitly provide the design and plan to follow.
I don’t understand how we, as a collective species, agreed to believe the criminally wrong lies of tech CEOs: that instead of implementing a reliable system by hand, we should convey our ideas and instructions in an “ambiguous,” “inconsistent,” and “context-dependent” language (English), and pass that through a probabilistic system to generate the reliable system.
As for lies and bad code: they didn't appear with AI. Humans lied and produced bad code before AI.
How does the author empirically know AI does not understand? And if it does not understand right now, is a machine fundamentally unable to understand? Is understanding an exclusively human ability? Is it because machines lack a soul? It sounds quite dualistic (Descartes' view that mind and body are fundamentally different).
Don't get me wrong: I think that right now AI is worse at understanding humans than other humans (or even dogs) are in many contexts, because it has no access to non-verbal signals. But in the context of building software, it is good enough, and I don't see why a machine should not be able to understand humans.
Especially with my rules:
- Prefer simple, boring solutions
- Before adding complexity to work around a constraint, ask if the constraint needs to exist.
- Remember: The best code is often the code you don't write.
AI is just the next example, and the internet is particularly prone to filtering the tackiest stuff to the top.
Always remember that the broadly positive aspects remain; they're simply obscured by the dust clouds kicked up by the bandwagon.
Use tools to expand your capability.
> AI code [..] may also free up a space for engineers seeking to restore a genuine sense of craft and creative expression
This resonates with me, as someone who joined the industry circa 2013 and discovered that most big tech jobs were essentially glorified plumbing.
In the 2000s, the web felt more fun, more unique, more unhinged. Websites were simple, and Flash was rampant, but it felt like the ratio of creators to consumers was higher than now.
With Claude Code/Codex, I've built a bunch of things that usually would die at a domain name purchase or init commit. Now I actually have the bandwidth to ship them!
This ease of development also means we'll see an explosion of slopware, which we're already starting to see with App Store submissions up 60% over the last year [0].
My hope is that, with the increase of slop, we'll also see an increase in craft. Even if the proportion drops, the scale should make up for it.
We sit in prefab homes, cherishing the cathedrals of yesteryear, often forgetting that we've built skyscrapers the ancient architects could never dream of.
More software is good. Computers finally work the way we always expected them to!
[0]https://www.a16z.news/p/charts-of-the-week-the-almighty-cons...
I've been using AI coding tools heavily for the past year. They're genuinely useful for the "plumbing" - glue code, boilerplate, test scaffolding. But where they consistently fail is reasoning about system-level concerns: authorization boundaries, failure modes, state consistency across services.
The article mentions AI works best on "well-defined prompts for already often-solved problems." This is accurate. The challenge is that in production, the hard problems are rarely well-defined - they emerge from the interaction between your code and reality: rate limits you didn't anticipate, edge cases in user behavior, security assumptions that don't hold.
Craft isn't about writing beautiful code. It's about having developed judgment for which corners you can't cut - something that comes from having been burned by the consequences.
Popular music tends to be generic. Popular content is mostly brainrot these days. Popular software is often a bloated mess because most users’ lives don’t revolve around software. They use software to get something done and move on.
I never understood the appeal of “craft” in software. Early computer pioneers were extremely limited by the tech of their time, so the software they hacked together felt artsy and crafty. Modern software feels industrial because it is industrial - it’s built in software factories.
Industrial software engineers don’t get paid to do art. There are research groups that do moonshot experiments, and you can be part of that if it’s your thing. But lamenting the lack of craft in industrial software is kind of pointless. Imagine if we’d stopped at crafty, handmade auto engines and never mass-produced them at scale. We don’t lament “crafty engines” anymore. If you want that, go buy a supercar.
Point is: AI is just another tool in the toolbox. It’s like Bash, except calling it that won’t pull in billions of dollars in investment. So “visionaries” call it ghost in the machine, singularity, overlord, and whatnot. It produces mediocre work and saves time writing proletariat software that powers the world. Crafty code doesn’t pay the bills.
But I’m not saying we shouldn’t seek out fun in computing. We absolutely should. It’s just that criticizing AI for not being able to produce art is an old thing. The goalpost keeps shifting, and these tools keep crushing it.
I don’t use AI to produce craft, because I don’t really do craft in software - I have other hobbies for that. But I absolutely, proudly use it to generate mediocre code that touches millions of people’s lives in some way.