This, but also for code. I just don't trust new code, especially generated code; I need time to sit with it. I can't make the "if it passes all the tests" crowd understand and I don't even want to. There are things you think of to worry about and test for as you spend time with a system. If I'm going to ship it and support it, it will take as long as it will take.
AI is a general-purpose tool, but that doesn't mean best practices and wisdom are generalizable. Web dev is different from compilers, which is different from embedded, and all the differences of opinion in the comments never explain who does what.
That said, I would take this up a notch:
> If you ask AI to write a document for you, you might get 80% of the deep quality you’d get if you wrote it yourself for 5% of the effort. But, now you’ve also only done 5% of the thinking.
Writing _is_ the thinking. It's a critical input in developing good taste. I think we all ought to consider a maintenance dose: write your own code without assistance on whatever interval makes sense to you, or those muscles will atrophy. Best practices are a moving train, not something you learned once and are done with.
On the verification loop: I think there’s so much potential here. AI is pretty good at autonomously working on tasks that have a well-defined, easy-to-process verification hook.
A lot of software tasks are “migrate X to Y” and this is a perfect job for AI.
The workflow is generally straightforward - map the old thing to the new thing and verify that the new thing works the same way. Most of this can be automated using AI.
Wanna migrate a codebase from C to Rust? I definitely think it should be possible autonomously if the codebase is small enough. You do have to ask the AI to intelligently come up with extensive ways to verify that the two behave the same: maybe a UI check, sample input/output checks on the API, and functional checks.
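The "verify they behave the same" step above can be sketched as a differential test harness: feed the same sample inputs to the old and new builds and flag any divergence. A minimal sketch (the binary paths and sample inputs are placeholders, not anything from the comment):

```python
import subprocess

def run(binary: str, stdin_data: str) -> tuple[int, str]:
    """Run a binary with the given stdin; capture (exit code, stdout)."""
    proc = subprocess.run(
        [binary], input=stdin_data, capture_output=True, text=True
    )
    return proc.returncode, proc.stdout

def differential_test(old_bin: str, new_bin: str, samples: list[str]) -> list[str]:
    """Return the sample inputs on which the two implementations diverge."""
    return [s for s in samples if run(old_bin, s) != run(new_bin, s)]

# e.g. differential_test("./legacy_c_app", "./target/release/rust_app",
#                        ["", "hello", "42\n"])
```

The same shape extends to API checks (compare HTTP responses instead of stdout) and UI checks (compare rendered screenshots), which is why this kind of migration is a good fit for an autonomous loop.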
* The interface is near identical across bots
* I can switch bots whenever I like. No integration points or vendor lock-in.
* It's the same risk as any big-tech website.
* I really don't need more tooling in my life.
> Will AI replace my job?
> If you consider your job to be “typing code into an editor”, AI will replace it (in some senses, it already has). On the other hand, if you consider your job to be “to use software to build products and/or solve problems”, your job is just going to change and get more interesting.
Waiting until patterns stabilize, the UX improves, failure modes become clearer, and community best practices emerge tends to give a much better long-term payoff.
This is better because my own tests act as a forcing function to learn and understand what the AI has done. Only after that primary testing might I ask it to run checks of its own.
Real quote:
> "Hence their value stems from the discipline and the thinking the writer is forced to impose upon himself as he identifies and deals with trouble spots in his presentation."
I mean seriously?
As a former local banker in Japan who spent decades appraising the intangible assets of businesses that have survived for centuries, I’ve learned that true mastery is found in stability, not novelty. In an era of rapid AI acceleration, the real risk is gambling your institutional reputation on unproven, volatile tools.
By 2026, when every “How” is a cheap commodity, the only thing that commands a premium is the “Why”—the core of human judgment. Staying a step behind the hype allows you to keep your hands on the steering wheel while the rest of the market is consumed by the noise. Stability is the ultimate luxury.