Isn't familiarity with the language even more the case with an LLM? The language they do best with is the one with the largest corpus in the training set.
But that is the "fear" side of the enterprise sales equation... The "greed" side of it is for the buyer to make the long / short hedge.
The exec who gets the value of the working product can potentially come out shining next year, when their peers are furiously backpedalling. This consummate exec does it by name-associating with their "main bet", which is optically great in the immediate term but totally out of their control (because the big-corp vendor will drag its feet, like every SAP integration failure they've seen), while getting a sense of agency from an off-books skunkworks project that actually works and saves the day.
A fine needle to thread for the upstart, but better than standing outside the game.
In the same article the author mentioned a few expert systems from the past that were quite obviously successful.
> on the promise printed on its marketing
Ah, _that_ promise. That promise is never fulfilled anywhere, nor is it expected to be.
The insight here is that this also still applies to huge enterprise contracts where supposedly more rational decision making should apply.
Imagine a model with a reliable 100M context window. Then all of a sudden you can.
> The information the intelligent answer needs was never in the wiki in the first place.
Oh well.
One should not underestimate a "compression primitive with a chat interface". For certain tasks it is a superpower.
That's why VCs look favorably on startups that go through the motions of setting up a partner-led sales channel. An established partner taking on maintenance contracts bridges the lifecycle gap between the two realities.
But no, corporate is bad, I guess.
As the article notes, the alternatives from the large companies suck. So this is like buying fire insurance from a company that promptly sets fire to your house. You are buying the insurance while knowing you will need it because the disaster is already happening.
This is correct and agreeable to everyone, but then after some waffle they write this:
> Structure, for the first time, can be produced from content instead of demanded from people
These quotes are very much at odds. Where is this structure and content supposed to come from, if you just said that nobody makes it? Nowhere in that waffle is it explained clearly how this is actually supposed to work. If you want to sell AI and not just grift, this is the part people are hung up on. Elsewhere in the article are stats on the hallucination rates of the bigger offerings, and yet there's nothing to convince anyone this will do better other than a pinky promise.