The bottleneck was never typing. It was judgment. Tailwind is crystallized judgment. AI can consume it endlessly. Producing the next version requires the loop that creates metis, and that loop isn't in the training data.
It's a simple convenience utility belt that LLMs can already automate.
Both open-source and open-core need to be re-evaluated as labor value plummets.
I also disagree with the "why". Tailwind is extremely useful with LLMs because it puts styling inline in the HTML, rather than maintaining an external, typically massive and convoluted, CSS file.
It's for the same reason that static typing will become standard with LLMs: it replaces implicit, semantically dense code with explicit, semantically precise code.
In both cases an LLM can build a strong understanding of a single file/module without needing to search for its meaning externally. A sketch of what that looks like is below.
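A minimal sketch of that locality, assuming a hypothetical React + TypeScript project with Tailwind configured (the component name and props are made up for illustration): the styling sits on the element and the contract sits on the signature, so everything an LLM needs is in the one file it's reading.

```tsx
import * as React from "react";

// Explicit types: the component's contract is stated here,
// not inferred from call sites elsewhere in the codebase.
type ButtonProps = {
  label: string;
  onClick: () => void;
};

// Tailwind utilities on className: the visual intent is inline,
// not hidden behind a ".btn-primary" rule in a large external stylesheet.
export function PrimaryButton({ label, onClick }: ButtonProps) {
  return (
    <button
      className="rounded-lg bg-blue-600 px-4 py-2 text-white hover:bg-blue-700"
      onClick={onClick}
    >
      {label}
    </button>
  );
}
```

Nothing about the button's appearance or API lives outside this file, which is exactly the kind of self-contained context the point above is about.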
Regardless of what happens to the company (my personal opinion is that they'll come out of this stronger than before), Tailwind as OSS probably isn't going anywhere for the foreseeable future.
LLMs are extremely adept at turning "what Tailwind does" into something "you don't have to pay for."
Yes, generative AI destroys some code-related business models. Absolutely fine by me; I'd rather work towards more people being able to be more creative with code than see some company put code tools behind paywalls or whatnot.