> Landing projects for Set Studio has been extremely difficult, especially as we won’t work on product marketing for AI stuff, from a moral standpoint, but the vast majority of enquiries have been for exactly that
The market is speaking. Long-term you’ll find out who’s wrong, but the market can usually stay irrational for much longer than you can stay in business.
I think everyone in the programming education business is feeling the struggle right now. In my opinion this business died 2 years ago – https://swizec.com/blog/the-programming-tutorial-seo-industr...
That collapsed during the COVID lockdowns. My financial services client cut loose all consultants and killed all 'non-essential' projects. Even though mine (which they had already approved) would have saved them 400K a year, they did not care! Top down, the word came to cut everyone -- so they did.
This trend is very much a top-down push. Inorganic. People with skills and experience are viewed by HR and their AI software as risky to leave in place and unlikely to respond to whatever pressures they like to apply.
Since then it's been more of the same as far as consulting goes.
I've come to the conclusion I'm better served by working on smaller projects I want to build and not chasing big consulting dollars. I'm happier (now) but it took a while.
An unexpected benefit of all the pain is that I like making things again... but I am using Claude Code and Gemini. They're amazing tools if you already have experience and know what you want out of them -- otherwise, in the hands of the masses, they mainly produce crap.
I don't want to write openly about the financial side of things here, but let's just say I don't have enough money to comfortably retire or stop working, and course sales over the last 2-3 years have dropped to less than 5% of what they were in 2015-2021.
It went from "I'm super happy, this is my job with contracting on the side as a perfect technical circle of life" to "time to get a full-time job".
Nothing changed on my end. I have kept putting out free blog posts and videos for the last 10 years. It's just that traffic has dropped to roughly a twentieth of what it used to be. Traffic dictates sales, and that's how I think I ended up in this situation.
It does suck to wake up most days knowing you have at least five courses' worth of content in your head that you could make but can't spend the time on because your time is allocated elsewhere. It usually takes 2-3 full-time months to create a decent-sized course, from planning to done. Then there's ongoing maintenance. None of this is a problem if it generates income (it's a fun process), but it's a problem given the amount of time it takes.
We should have more posts like this. It should be okay to be worried, to admit that we are having difficulties. It might reach someone else who otherwise feels alone in a sea of successful hustlers. It might also just get someone the help they need or form a community around solving the problem.
I also appreciate their resolve. We rarely hear from people being uncompromising on principles that have a clear price. Some people would rather ride their business into the ground than sell out. I say I would, but I don’t know if I would really have the guts.
If all of "AI stuff" is a "no" for you, then I think you just signed yourself out of working in most industries to some important degree going forward.
This is also not to say that service providers should not have any moral standards. I just don't understand the expectation in this particular case. You ignore what the market wants and where a lot, if not most, of the new capital is turning up. What's the idea? You are a service provider, not a market maker. If you refuse to serve the market that exists, you don't have a market.
Regardless, I really like their aesthetics (which we need more of in the world) and do hope that they find a way to make it work for themselves.
I started TextQuery[1] with the same moralistic stance. Not with respect to using AI or not, but the belief that most of the software industry suffers from a rot that places more importance on making money and forcing subscriptions than on making something beautiful and detail-focused. I poured time into optimizing selections, perfecting autocomplete, and wrestling with Monaco's thin documentation. However, I failed to make it a sustainable business. My motivation ran out, and what I thought would be a fun multi-year journey collapsed into burnout and a dead-end project.
I have to say my time would have been better spent building something sustainable, making more money, and only then optimizing the details. It was naïve to obsess over subtleties that only a handful of users would ever notice.
There’s nothing wrong with taking pride in your work, but you can’t ignore what the market actually values, because that's what will make you money, and that's what will keep your business and motivation alive.
I think this is the crux of the entire problem for the author. The author is certain, not just hesitant, that any contribution they would make to a project involving AI equals a contribution to some imagined evil (oddly, without explicitly naming what they envision, which makes it harder to respond to). I have my personal qualms, but I run those through my internal ethics to see if there is a conflict. Unless the author predicts a 'prime intellect' type of catastrophe, I think the note is either shifting blame or just justifying bad outcomes with a moralistic 'I did the right thing' while not explaining the assumptions in place.
"To run your business with your personal romance of how things should be versus how they are is literally the great vulnerability of business."
Since then I pivoted to AI and Gen AI startups -- money is tight and I don't have health insurance, but at least I have a job…
Market has changed -> we disagree -> we still disagree -> business is bad.
It is indeed hard to swim against the current. People have different principles and I respect that; I just rarely have this much difficulty understanding them, and rarely see such a clear impact on the bottom line.
In this case, running a studio without using or promoting AI becomes a kind of sub-game that can be “won” on principle, even if it means losing the actual game that determines whether the business survives. The studio is turning down all AI-related work, and it’s not surprising that the business is now struggling.
I’m not saying the underlying principle is right or wrong, nor do I know the internal dynamics and opinions of their team. But in this case the cost of holding that stance doesn’t fall just on the owner, it also falls on the people who work there.
We Brits simply don't have the same American attitude towards business. A lot of Americans can't understand that chasing riches at any cost is not a particularly European trait. (We understand how things are in the US. It's not a matter of just needing to "get it" and seeing the light.)
I have a family member that produces training courses for salespeople; she's doing fantastic.
This reminds me of some similar startup advice: don't sell to musicians. They don't have any money, and they're well-versed in scrappy research to fill their needs.
Finally, if you're against AI, you might have missed how good a learning tool LLMs can be. The ability to ask _any_ question, rather than being stuck on video rails, is a huge time-saver.
We said that WordPress would kill front-end work, but years later people still employ developers to fix WordPress messes.
The same thing will happen with AI-generated websites.
They have a right to do business with whomever they wish. I'm not suggesting that they change this. However, they need to face the current reality. What value-add can they provide in areas not impacted by AI?
I'd much rather see these kinds of posts on the front page. They're well thought out and I appreciate the honesty.
I think that, when you're busy following the market, you lose what works for you. For example, most business communication happens through push-based traffic: you get assigned work and you have X amount of time to solve it all. If you don't, there's some extremely tedious reflection meeting that leads nowhere. Why not do pull-based work, where you get done what you get done?
Is the issue here that customers aren't informed about when a feature will be implemented? Because the alternative is promising date X and delaying it three times because customer B is more important.
I intentionally ignored the biggest invention of the 21st century out of strange personal beliefs and now my business is going bankrupt
Any white-collar field—high-skill or not—that can be solved logically will eventually face the same pressure. The deeper issue is that society still has no coherent response to a structural problem: skills that take 10+ years to master can now be copied by an AI almost overnight.
People talk about “reskilling” and “personal responsibility,” but those terms hide the fact that surviving the AI era doesn’t just mean learning to use AI tools in your current job. It’s not that simple.
I don’t have a definitive answer either. I’m just trying, every day, to use AI in my work well enough to stay ahead of the wave.
The market is literally telling them what it wants, and potential customers are asking them for work, but they are declining it from "a moral standpoint"
and instead blaming "a combination of limping economies, tariffs, even more political instability and a severe cost of living crisis".
This is a failure of leadership at the company. Adapt or die: your bank account doesn't care about your moral red lines.
I fundamentally disagree with this stance. Writing off a whole category of technologies because of some perceived immorality in the training process, regardless of how that training was done, seems irrational.
"Moral" is mentioned 91 times at last count.
Where is that coming from? I understand AI is a large part of the discussion. But then where is /that/ coming from? And what do people mean by "moral"?
EDIT: Well, he mentions "moral" in the first paragraph. The rest is pity posting, so to answer my own question: morals are one of the few generally interesting things in the post. But in the last year I've noticed a lot more talk about "morals" on HN. "Our morals", "he's not moral", etc. Anyone else?
From the client's perspective, it's their job to set the principles (or lack thereof) and your job to follow their instructions.
That doesn't mean it's the wrong thing to do though. Ethics are important, but recognise that it may just be for the sake of your "soul".
The equivalent of that comic where the cyclist intentionally spoke-jams themselves and then acts surprised when they hit the dirt.
But since the author puts moral high horse jockeying above money, they've gotten what they paid for - an opportunity to pretend they're a victim and morally righteous.
Par for the course
I hope things turn around for them; it seems like they do good work.
Can someone explain this?
I hope things with AI will settle down soon, that applications will emerge that actually make sense, and that some sort of new balance will be established. Right now it's a nightmare. Everyone wants everything with AI.
What am I missing?
Although there’s a ton of hype in “AI” right now (and most products are over-promising and under-delivering), this seems like a strange hill to die on.
imo LLMs are (currently) good at 3 things:
1. Education
2. Structuring unstructured data
3. Turning natural language into code
From this viewpoint, it seems there is a lot of opportunity both to help new clients and to create more compelling courses for your students.
No need to buy the hype, but no reason to die from it either.
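A minimal sketch of point 2 above (structuring unstructured data), assuming the OpenAI Python SDK; the model name, prompt, and output fields are illustrative placeholders, not a prescription:

    # Hypothetical example: turning a free-form support email into structured JSON.
    import json
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    email = "Hi, my order #4821 arrived damaged. Please send a replacement to 12 Oak St."

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Extract order_id, issue, and requested_action. Reply with JSON only."},
            {"role": "user", "content": email},
        ],
    )

    # Real code would validate the reply before trusting it.
    print(json.loads(resp.choices[0].message.content))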
I don't use AI tools in my own work (programming and system admin). I won't work for Meta, Palantir, Microsoft, and some others because I have to take a moral stand somewhere.
If a customer wants to use AI or sell AI (whatever that means), I will work with them. But I won't use AI to get the work done, not out of any moral qualm but because I think of AI-generated code as junk and a waste of my time.
At this point I can make more money fixing AI-generated, vibe-coded crap than I could coaxing Claude to write it. End-user programming creates more opportunity for senior programmers but will deprive the industry of talented juniors. Short-term thinking will hurt businesses in a few years, but no one counting their stock options today cares about a talent shortage a decade away.
I looked at the sites linked from the article. Nice work. Even so, I think hand-crafted front-end work turned into a commodity some time ago, and now the onslaught of AI slop will kill it off. Those of us in the business of web sites and apps can appreciate mastery of HTML and CSS and JavaScript, beautiful designs and user-oriented interfaces. Sadly, most business owners don't care that much and lack the perspective to tell good work from bad. Most users don't care either. My evidence: 90% of public web sites. No one thinks WordPress got the market share it has because of technical excellence or how it enables beautiful designs and UI. Before LLMs could crank out web sites we had an army of amateur designers and business owners doing it with WordPress, paying $10/hr or less on Upwork and Fiverr.
You will continue to lose business if you ignore all the 'AI stuff'. AI is here to stay, and putting your head in the sand will only leave you further behind.
I've known people over the years that took stands on various things like JavaScript frameworks becoming popular (and they refused to use them) and the end result was less work and eventually being pushed out of the industry.
That's horrifying.
Sounds like a self-inflicted wound. No kids, I assume?
Two fundamental laws of nature: the strong prey on the weak, and the fittest survive.
Why is it, then, that those who survive are not the strong preying on the weak, but rather the "fittest"?
Next year's AI developments may be even more astonishing, continuing to kill off large companies and small teams that cannot adapt to the market. Only by constantly adapting can we survive this fierce competition.
Ironically, while ChatGPT isn’t a great writer, I was even more annoyed by the tone of this article and the incredible overuse of italics for emphasis.