- OpenAI is just a wrapper around NVIDIA, which is just a wrapper around TSMC, which is just a wrapper around ASML, which is just a wrapper around Zeiss optics, which is just a wrapper around EUV photons, which are just wrappers around quarks, which are just wrappers around quantum fields...
A Large Language Model is just a Large Hadron Model with better marketing.
- > But Cursor and other such tools depend almost entirely on accessing Anthropic, OpenAI and Gemini models, until open-source open-weight and in-house models match or exceed frontier models in quality.
I'm not sure I agree with this, because even though Cursor is paying north of 100% of its revenue to Anthropic, Anthropic is selling inference at a loss. So if Cursor builds and hosts its own models, it still has the marginal costs > marginal revenues problem.
The way out for Cursor could be a much smaller, self-hosted model that focuses on code rather than the world. That could bring inference costs below marginal revenue.
by mentalgear
1 subcomment
- > But I think the insight lies between these positions. Even if a new application starts as a wrapper, it can endure if it lives where work is done, writes to proprietary systems of record, builds proprietary data and learns from usage, and/or captures distribution before incumbents bundle the feature.
Basically the same as MS and social media did: build a proprietary silo around data and amass enough of it that moving away from the first provider becomes too big an inconvenience.
It's good that the EU now has laws to ensure data interoperability, export, and ownership.
by keiferski
1 subcomment
- Marketing, UI, and brand matter a lot, especially when all of the products are functionally "the same" to the average consumer, who doesn't care about technical details, benchmarks, etc. It reminds me of this great scene from Mad Men:
> This is the greatest advertising opportunity since the invention of cereal. We have six identical companies making six identical products. We can say anything we want.
https://youtu.be/8SsnkXH2mQY?si=SWPOsGBel1yh3kMd&t=198
by _fat_santa
1 subcomment
- Currently working on a SaaS app that could be called an "AI Wrapper". One thing I picked up on is that once you start using AI tools programmatically, you can do far more complex things than you can with ChatGPT or Claude.
One thing we've leaned heavily into is using LangGraph for agentic workflows, and it has really opened the door to cool ways you can use AI. These days, the way I tell AI "Wrappers" apart from "Tools" is the underlying paradigm. Most "wrappers" just copy the ChatGPT/Claude paradigm, where you have a conversation with an agent; the "tools" take the ability to generate content and plug it into a broader workflow.
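A minimal sketch of that kind of workflow, assuming LangGraph's StateGraph API; the node names, state fields, and both node functions are hypothetical, and the actual LLM calls are stubbed out:

```python
# Toy two-step pipeline: generate a draft, then review it.
# In a real workflow each node would call an LLM instead of the stubs below.
from typing import TypedDict

from langgraph.graph import END, START, StateGraph


class PipelineState(TypedDict):
    topic: str
    draft: str
    review: str


def generate_draft(state: PipelineState) -> dict:
    # Stand-in for an LLM call that drafts content from the topic.
    return {"draft": f"Draft about {state['topic']}"}


def review_draft(state: PipelineState) -> dict:
    # Stand-in for a second LLM call that critiques the draft.
    return {"review": f"Review of: {state['draft']}"}


graph = StateGraph(PipelineState)
graph.add_node("generate", generate_draft)
graph.add_node("review", review_draft)
graph.add_edge(START, "generate")
graph.add_edge("generate", "review")
graph.add_edge("review", END)

app = graph.compile()
result = app.invoke({"topic": "AI wrappers", "draft": "", "review": ""})
print(result["review"])
```

The point isn't the graph library; it's that the model call becomes one node in a pipeline you control, rather than the whole product.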
by lubujackson
2 subcomments
- Almost every startup is a wrapper of some sort, and has been for a while. The reason a startup can start up at all is that it has some baked-in competency from using new and underutilized tools. In the dot-com boom, that was the internet itself.
Now it's AI. Only after doing this for 20+ years do I really appreciate that the arduous winnowing of process and product that happens over time is the bulk of the value (and the moat, when none other exists).
- Software is 10x more valuable than inference tokens because tokens by themselves do nothing for the user, just as a database request by itself does nothing.
Software is what makes inference valuable, because it builds a workflow that turns tokens and data into practical benefits.
Look at the payment plans for Lovable, Figma Make, and Claude Code. None of them charge by the token; they charge in obfuscated 'credits'. We don't know the current credit economics, but it is certain that the credit markup will increase, probably eventually reaching 10x the token cost. Users will gladly pay it because, again, tokens do nothing for them. It is products like Claude Code and Figma Make that make them productive.
by darepublic
0 subcomments
- As others have mentioned -- I think wrapper is a fair term. It was not trivial, and took untold man-hours of research and labour, to go from Nvidia GPUs to modern LLMs. Some of the AI products really do feel like minimal engineering around calls to OpenAI (or Claude, or what have you).
by TrackerFF
1 subcomment
- What do we call companies that spin up open-source ML models, and basically just sell access to said models?
Examples include audio/stem separation and object segmentation.
They're not wrappers, but whatever sits one step deeper down in complexity.
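In practice that business is often just an open checkpoint hosted behind a metered HTTP endpoint. A minimal sketch, assuming FastAPI for serving; separate_stems here is a hypothetical stand-in for whichever open-source separation model is actually being resold:

```python
# Toy "open model as a paid API" service; the model call is stubbed out.
from fastapi import FastAPI, File, UploadFile

app = FastAPI()


def separate_stems(audio: bytes) -> dict[str, bytes]:
    # Hypothetical stand-in for an open-source stem-separation model.
    return {"vocals": audio, "accompaniment": audio}


@app.post("/v1/separate")
async def separate(file: UploadFile = File(...)) -> dict:
    audio = await file.read()
    stems = separate_stems(audio)
    # A real service would meter this call and bill the customer here.
    return {"stems": sorted(stems)}
```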
- I recently framed this as "agent labs" vs "model labs" - https://latent.space/p/agent-labs - it's definitely far from proven or given that they are a lasting business model, but I think the dynamic is at least more evident now than it was a year ago, and even that is notable as we are slowly figuring out what the new AI economy looks like.
by kayart_dev
0 subcomments
- It’s hard to count how many ffmpeg wrappers are out there.