I think if people want a revshare on things then perhaps they should release under a revshare license. Providing things under open licenses and then pulling a bait-and-switch, saying "oh, the license doesn't actually mean that, you're not supposed to be doing that", doesn't sit right with me. Just be upfront and open with things.
The point of the Free Software licenses is that you can go profit off the software; you just have certain obligations back. I think those are pretty good standards. And, in fact, given The Revshare License that everyone seems to lean towards, I think that coming up with the GPL or MIT must have taken some exceptional people. Good for them.
In his follow-up post he talks about open sourcing his old games as a gift, and how he doesn't much care how people receive that gift, just that they do.
He doesn't acknowledge that Anthropic, OpenAI, etc, are profiting while the original authors are not.
The original authors most of the time didn't write the software to profit. But that doesn't mean they don't care if other people profit from their work.
It's odd to me that he doesn't acknowledge this.
It seems like Carmack, like a lot of tech people, has forgotten to ask the question: who stands to benefit if we devalue the US services economy broadly? Who stands to lose? It seems like a lot of these people are assuming AI will be a universal good. It is easy to feel that way when you are independently wealthy and won't feel the fallout.
Even a small % of layoffs of the US white collar work force will crash the economy, as our economy is extremely levered. This is what happened in 2008: like 7% of mortgages failed, and this caused a cascade of failures we are still feeling today.
Copyleft licenses are generally intended, afaict, to protect the commons and ensure people have access to the source. AI systems seem to hide that. And they contribute nothing back.
Maybe they need updating, IANAL. But I’d be hesitant to believe that everyone should be as excited as Carmack is.
He can easily afford to be altruistic in this regard.
But Carmack isn't wired for empathy; he never has been.
Training an AI on GPL code and then having it generate equivalent code that is released under a closed source license seems like a good way to destroy the copy-left FOSS ecosystem.
MIT asks for credit. GPL asks for credit and GPL'ing of things built atop it. The Unlicense is a free gift, but it is a minority.
AI reproduces code while removing credit and copyleft from it and this is the problem.
- OSS is valuable for decentralizing power and influence
- AI as it is being developed is likely to centralize it
Tools like CC already push a workflow where you're separated from the code and treat the model as a 'wishing well'. I think the fact that we get the source is just admission that these models are not good enough to really take our jobs (yet).
I wonder how much of a gift AI companies think their models (and even the outputs of their models) are, considering their weights are proprietary and their training methods even more so.
From my PoV, pretending you can feed outputs, test suites, or APIs from existing code into an AI and have it "rewrite" the code so you can call it your own is just theft. If instead you want to simply make it public domain, and this for some reason becomes acceptable, then it is the end of code as IP. The end of potentially any IP, which, by the way, I know plenty of people who would be happy about (the "IP is theft" crowd), but which I think is unfair on those who had no real opportunity to build any equity from their work.
Precisely because I strongly believe in the potential of generative AI to eventually carry out entire projects with little technical guidance, I think it's important to establish the ownership of both what already exists and what is achieved by humans with AI augmentation. This is a much more immediate concern than "runaway AI" or any form of singularity. As of today, generative AI has proven capable of replicating the results of established projects with improvements (establishing how much is just parroted from the replicated project itself and similar ones is academic in practice).
meanwhile, in the trenches, rent and bills are approaching 2/3 of a paycheck and food the other 1/3, while at the same time the value of our knowledge and experience is going to zero (in the eyes of the managerial class)
'ai training magnifies the gift' ... sure thing ai training magnifies a lot of things
Edit: I'm also thinking of what he did rewriting all of Symbolics code for LISP machines
(similar to the person who accidentally hacked every vacuum of a certain manufacturer while trying to gain access to his own robot vacuum? https://www.theguardian.com/lifeandstyle/2026/feb/24/acciden...)
So we have this foundation, this anchor, which is copyright law, that gives us any power to have a say about whether code should be accessible. Without it, the licenses are empty words: no weight, no remedy. My concern is less that open source code gets used by commercial interests; I would rather they use libraries that are maintained, especially in security contexts. My concern is that we move toward only having devices we can keep as long as the company supports them and/or is solvent. If we lose the foundation that everything was built on (copyright law), it becomes impossible to audit or support things on our own. Everything becomes a rental/subscription.
I don't often just come out and make predictions, but this is one I think we're moving toward as the sea becomes more muddied by regurgitated works. The major AI companies are unabashedly pirating works, and there are powerful rights-holders that could be sending armies of lawyers after them, like the big publishing houses... but is it happening? Or are they sitting back and letting the tech companies do R&D for what will become their new business models going forward?
I don't think that's all anti-AI activists care about though. Honestly, I would say most activists don't talk about the use of OSS? The most prominent anti-AI sentiment seems to come from creatives. Artists, musicians, designers, etc.
They didn't publish their works with the same notion as OSS developers, but it was scraped up by corporations all the same. In many cases, these works were protected by copyright law and used anyways.
To me that feels like the equivalent of training on "private repos", which Carmack would call a violation [1].
That’s great for John, but not everyone’s open source projects are meant as a gift to the world for anyone and everyone to use. That he cannot understand that others think differently than he does is disappointing.
But hey, we're failing to see how AI is magnifying the value of "our gift". See? Little Timmy can now build a website for his dog with 0 effort just by prompting ChatGPT! Isn't that mind blowing? I don't mind if he's Carmack or fucking Leonardo da Vinci, he's legit stupid.
Other FOSS developers, not so much. They are the ones who are exploited.
And it is a form of cheating to take a gift and profit off of it while keeping all the profits for yourself; it goes against the spirit of the thing. Until now that was fine, because there was a sense that it made more things possible and created jobs, and these projects were improved through reciprocal contribution.
AI vacuumed everything up for itself and turned around and said now I'm going to replace you.
Most of these communities are being destroyed before our eyes by AI. Anyone in the industry who pretends this isn't happening, or seems confused about why some people are upset about this, is being highly disingenuous.
Being at the top of Maslow's pyramid, and knowing there's very little chance you'll slide down, is another thing that changes people's attitudes. The side effect is becoming numb to the majority who aren't filthy rich but contribute way more to OSS.
Maybe something has materially changed?
Says yet another person who's hilariously rich, financially invested in the success of AI and isn't materially affected by AI displacing them.
People who call themselves anti-AI activists are largely reacting to the opacity of large models and legitimate concerns about concentration of power. That is a reasonable thing to worry about. The answer to that is not to stop building AI. It is to build it more openly.
Carmack has been consistent on this. He builds things. He wants the tools to be available. Hard to argue with that position from a craft perspective.
Really don't see why that should change anything. Surely you'd want your gift to the Microsoft corporation to appreciate in value! Why would we ever withhold this boon from somebody on the basis that they gifted their source exclusively to microslop!?
I can't quite figure out what "it" refers to in "it would strengthen our communities". It's probably obvious, but I still can't work it out (the GPL maybe?)
Would I have any right to be pissed off? No. Once I shared the candies, I have surrendered all entitlements over them.
Furthermore, let's say I put some legal text on the candy wrapper that said "if you get sick from eating it, I'm not responsible. If you sell it, you have to pay me 10%. If you cook something with the candy and resell that product, you have to share your new recipe with the world." and a bunch of other ridiculous things. But I put it out there for anyone to take; reading the wrapper and accepting the terms was not a condition the public needed to meet before being able to obtain the candy. Not only that: if you put out poisonous food for the public, what you put on the wrapper doesn't absolve you of responsibility. Being able to offer things for free to others doesn't grant you rights to control their future commercial activity, unless they specifically agreed to that. It is also outlandish and ridiculous to claim you can have a say in the confidentiality of someone's recipe simply because they modified your candy before using it as an ingredient in that recipe.
---
# LICENSE
I am not responsible for how you interpret this comment. By reading this, you fully accept that I am not responsible for any libel, or for any economic or other harm caused to you or any entity you represent.
If this comment is used to train AI or used as part of any technology that profits commercially in any way by transforming this comment, the individuals or incorporated entities implementing that technology agree to disclose all transformations made to this comment, the underlying technology used for that transformation, and agree to pay 10% of their net profits from all commercial activity to me, the author of this comment.
I can understand his stance on AI given this perspective. I have a harder time empathizing with his frustrations. Did he also have a hard time coming to terms with the need for the AGPL?
I think this debate is mainly about the value of human labor. I guess when you're a millionaire, it's much easier to be excited about human labor losing value.
https://youtu.be/ucXYWG0vqqk?t=1889
I find him speaking really soothing.
I don't ask anyone to share my ideals but conflating these two is dishonest.
It shields him from the need to truly hustle.. and the world really needs his hustle right now.
"AI training on the code magnifies the value of the gift. I am enthusiastic about it!"
Si tacuisses ...
Regarding OSS, I'll say what I already said a few days ago: OSS people should take care of their financials first, and then do OSS without anxiety. Also, if you do OSS, expect it to be abused in any imaginable and unimaginable way. The "license" is a joke when enough dollars are involved. If you hate that, don't do OSS. No one forces you to do it. I appreciate what you did, but please take care of yourselves first.
Actually, now that I think about it, every successful OSS person that I look up to took care of their financials first. Many of them also did it Carmack's way -- get a cool project, release it, don't linger, move to the next one while others improve it. Maybe you should do it, too.
They really did put a lot of things out in the open back then but I don't think that can be compared to current day.
Doom and Quake 1 / 2 / 3 were on the cusp of what computing could do (a new gaming experience) while also being wildly fun. Low competition, unique games, and no AI is a MUCH different world than today, where there's high competition, not-so-unique games, and AI digesting everything you put out into the world only for it to be sold to someone else to become your competitor.
I'm not convinced what worked for id back then would work today. I'm convinced they would figure out what would work today but I'm almost certain it would be different.
I've seen nothing but personal negative outcomes from AI over the last few years. I had a whole business selling tech courses for 10 years that has evaporated into nothing. I've open sourced everything I've done since day 1, with thousands of stars on some projects and people writing in saying nice things, but I never made millions, not even close. Selling courses helped me keep the lights on, but that has gone away.
It's easy to say open source contributions are a gift and deep down I do believe that, but when you don't have infinite money like Carmack and DHH the whole "middle class" of open source contributors have gotten their life flipped upside down from AI. We're being forced out of doing this because it's hard to spend a material amount of time on this sort of thing when you need income at the same time to survive in this world.
- Sharing/working on something for free with the hope that others like it and maybe contribute back.
- Sharing something for free so that a giant corporation can make several trillion dollars and use my passion to train a machine for (including, but not limited to) drone striking a school.
Is a better comment.
And GPL'd code is not open source, it's free software. The license implies the code cannot find its way into non-GPL codebases, and you can't profit*1 from the code. (But you can profit from services on top, e.g. support services, or paid feature development.)
Now the question is, is that intersection set all GPL developers?
*1 note profit would imply distribution
I really can't see a valid reason to be against it, beyond something related to profiting in some way by restricting access, which - I would think - is the antithesis of copyleft/permissively licensed open source.
Personally I think AI code should always be open source to at least make it the most "ethical"; others can see and edit it as they see fit.
Generally though I've noticed that the spirit of open source (community-owned code) has slowly been morphing in ways that undermine open source, whether that be folks getting more entitled towards open source projects or seeing certain open source licenses (specifically the GPL) as a tool to build an anti-capitalist moat.
This is despite the fact that almost every open source license accepted by the OSI or FSF explicitly states there is no warranty and that commercialization of said project is allowed; it's just that most people don't contribute money, let alone code (look at the xz debacle: even now the dev isn't given sufficient commercial and social support to maintain it).
To be fair to Carmack, his vision of open source is one shared by a lot of the "free means anarcho-free" developer camp and as such if said code is used in ai or to bomb children then so be it.
I guess this is one of the major tensions of open source: what open source ought to be.
A drive towards ethical code or a drive towards anarcho-free, because stating "free as in freedom" is too vague; freedom carries the connotation of both the freedom TO DO something and the freedom to be EXEMPT from something.
In the end I believe that freedom to be exempt is more important than the freedom to do.
This is because bad actors usually benefit more from exploiting freedom-to than freedom-from, which is why, beyond community building, the (A|L)GPL helps ensure, both for users and for developers, that the code written protects us both.
Whether or not AI is part of that... well, going back to my point, AI code should most likely almost always be open-sourced if one wants to "lift the burden".
Whether you agree with them or not the free software / copyleft advocates mean something very different from what Carmack is getting at and always have before or after AI. It has always been an anti-corporate position and it's not difficult to reconcile in my mind at all?
That said, I'm personally a free software advocate, and in favour of the GPL as a license but I use "AI" (LLMs) (critically). To help make [A]GPL software. I kinda feel by copylefting the output, in some sense I'm helping to right the wrong.
It sounds like he understands the problem perfectly. Is he not capable of thinking through how a non-millionaire would think about this? Sheesh.
I respect Carmack so much more now. I always scratched my head over why he made Quake GPL. It was such a waste. Now it doesn't matter anymore. I'm so thankful copyleft is finally losing its teeth. It served its purpose 30 years ago; we don't need it anymore.
It is far healthier to see it as a collaboration. The author publishes the software with freedoms that allow anyone to not only use the software, but crucially to modify it and, hopefully, to publish their changes as well so that the entire community can benefit, not just the original author or those who modify it. It encourages people to not keep software to themselves, which is in great part the problem with proprietary software. Additionally, copyleft licenses ensure that those freedoms are propagated, so that malicious people don't abuse the system, i.e. avoiding the paradox of tolerance.
Far be it from me to question the wisdom of someone like Carmack, but he's not exactly an authority on open source. While id has released many of their games over the years, this is often years after the games have ceased to be commercially relevant. I guess it makes sense that someone sees open source as a "gift" they give to the world after they've extracted the value they needed from it. I have little interest in what he has to say about "AI", as well.
Hey John, where can I find the open source projects released by your "AI" company?
Ah, there's physical_atari[1]. Somehow I doubt this is the next industry breakthrough, but I won't look a gift horse in the mouth.
Fine for him, but it's totally reasonable for people to want to use the GPL and not have it sneakily bypassed using AI.
This is demonstrably incorrect given how LLMs are built, and he should retire instead of trolling people that still care about workmanship. =3
"A Day in the Life of an Ensh*ttificator"