I find myself wondering what to do with this article and the incredibly well-condensed collection of sentiments therein. Do I drop a furtive link in that #ai-enthusiasts Slack channel, in full view of everybody, including that CTO of mine who is REALLY excited about this stuff, and isn't quite ready to issue an AI usage mandate? Or do I keep this secreted away, like a bible in my left jacket pocket in case one of "their" sharpshooters attempts to assassinate me with an AI silver bullet, as though I were some wild and reckless heathen that seeks to abandon all good sense in the selfish pursuit of personal pride?
I don't mean to be hyperbolic, but I never envisioned this moment of feeling bizarre, feckless and iconoclastic for wanting to do what feels like the reasonable thing. The circular finances propping this stuff up don't add up. Do I really want to be left dependent on a thing that only really feels like it just flashed into precarious existence, that will almost without a doubt be propped up, classed as "too big to fail" by one particular nation-state, whose actual intent and interests are to subjugate its subjects into a quivering oblivion when it's finally acknowledged the whole thing is in arrears? No, goddammit, I shouldn't have to accept this barrage against and wholesale weakening of my strongest assets: my mind and my discipline. At least that's how I feel about it anyway.
I do wonder about the prospects of any Etsy-like outcome for largely hand-crafted software, though. While you can find personal stylistic expression in the craft, I'm not sure how apparent the nuances of crafting code are to users of the product beyond the requirements of a UX design and vision. It's hard not to imagine generation industrializing a lot of this part of the craft of making software.
For me, I think the important thing not to lose sight of as we use generation more and more in software is our care for the workpiece. It feels like care and deep understanding are set up to become ever more valuable rarities as we become less and less intimately involved, qualities we will have to be intentional about in order to keep.
I feel like there are some parallels here to industrial designers and their desire to hold on to obsessing over and understanding the details in the face of industrialized tooling, while being very much removed from the intimate feeling of crafting every millimetre. Deeply caring is still meaningful and valuable even if it isn't minimally required.
Maybe the arguing is really over whether it's higher-status to enjoy longform content, or to criticize it for not being more efficient? By identifying the argument, I've revealed it as silly, and clearly proven myself to be higher status than either side. The arguing may stop now. You're welcome.
There's a flaw in the Milli Vanilli argument. The band had no input into their songs. They 'performed' them by lip-syncing on stage, but all of the music and lyrics were someone else's. Milli Vanilli had no part in the creative process.
That's not technically true of AI content. There's some tiny little seed of a creative starting point in the form of a prompt needed for AI. When someone makes something with Claude or Nano Banana it's based on their idea, with their prompt, and their taste selecting whether the output is an acceptable artefact of what they wanted to make. I don't think you can just disregard that. They might not have wielded the IDE or camera or whatever, and you might believe that prompting and selecting which output you like has no value, but you can't claim there's no input or creativity required from the author. There is.
> Companies value velocity and new launches and shipping first at all costs because of course they do; it’s table stakes. Speed of delivery is basically the number one corporate value of every organization whether they admit to it or not.
Yeah, this one is again one of the causes of where we are today (alongside profit extraction, or perhaps because of it). It used to be the case that you would find companies that would offer quality at a slightly higher price, and people would be more than willing to pay for it. Now the feeling is that this is all marketing-driven and there is no 'higher quality', because everyone gave up and went after speed of delivery. And well, as the old saying goes, that's valuable when you're catching fleas.
I read it all, and found myself engaged throughout. Not to say that it was all riveting; some spots were certainly drier than others, but it felt 'real'. Maybe they did use AI (I somehow doubt that, given the content), but even if they did, they went over everything in a way that retained a voice that felt authentic.
I hate that many of the articles I read now all feel like they have the same half-hearted attempt at grabbing your attention without ever actually clearly saying what they mean.
As for the content, I had actually just been told by management this last week that I need to become AI 'fluent' as part of future performance evaluations and I have been deeply conflicted about it. I do think AI has value to add, but I don't think it's something that should be forced and so this article resonated with me.
It's a long read, and not for everyone, but I recommend it as a way of hearing another human's opinion and deciding for yourself if it has value.
"I’m not arguing that this technology should be unilaterally destroyed; I am arguing that we are collectively using it in the dumbest possible way, causing the most self-inflicted injury, and maximizing the amount of angst and suffering we’ll all have to contend with. I am angry at generative AI because it seems to be making us think and act like complete idiots."
Made me smile.
I know this is probably a deliberate simplification as part of a rhetorical flourish, but one of my favourite parts about semiconductors is the fact that we don't dig up the rocks, we grow them to order. The thought fills me with childlike wonder...
The author has made the correct call. There's a pretty deep irony that all the top-level comments at the time of this writing are about how the article is too long. It's quite clearly not trying to succinctly convince you of a point, it's meant to be a piece of genuinely human writing, and enjoyed (or not!) on the basis of that.
I remember people saying this about emails vs postal mail.
Over sixteen thousand words about how the author doesn’t really use language models very much but might in the future
So you don't have to:
"you don’t have to embrace a trend, tool, or narrative simply because others say you should — especially if it doesn’t resonate with you or align with your values"
An important new twist to add to the great AI versus NO AI discussion.