Carpenters, plumbers, masons, and electricians work on houses 3 to 300 years old, navigate the range of legacy styles and tech they encounter, and predictably get good outcomes.
Only C has, so far, given us that level of serviceability. C99, baby, why pay more?
When there’s an alternative that can compete with that sort of real-world durability, C will get displaced.
I could write a whole essay about why, but now isn’t the time. I’m just going to enjoy the fact that TFA and the author don’t get it.
- Unspecified default type sizes. It should have had i8, u16, i32, u64, f32, f64 from the beginning.
- Pointers that are not restrict by default (i.e., an alias keyword should have been added for the pointers that genuinely alias). Performance matters: most benchmarks that show something beating C or C++ come down to aliasing pointers. C++26 still doesn't have a standardised restrict keyword. (A sketch of both points follows this list.)
There are more, but I understand the logic/usability/history behind them. The above points should have been addressed in the '80s.
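A minimal C99 sketch of both points, assuming nothing beyond the standard library (parse_i32 and scale are hypothetical names): <stdint.h> gives the fixed-width types as opt-in names rather than defaults, and restrict is the opt-in non-aliasing promise rather than the default the list above argues for.

    #include <stddef.h>
    #include <stdint.h>

    /* C99's retrofit: exact-width types, opt-in by name. */
    int32_t parse_i32(const uint8_t *buf);  /* hypothetical prototype: the widths live in the contract */

    /* restrict promises the compiler that out and in never alias,
       so it can vectorise the loop as aggressively as Fortran would. */
    void scale(float *restrict out, const float *restrict in, float k, size_t n) {
        for (size_t i = 0; i < n; i++)
            out[i] = k * in[i];
    }

Without restrict, the compiler must assume a store through out could change *in and reload it on every iteration.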
It's understandable how we got here, but it's an entirely legitimate question - could things be better if we had an explicitly designed interoperability interface? Given my experiences with cgo, I'd be pretty solidly on the "Fuck yes" side of things.
(Of course, any such interface would probably end up being designed by committee and embodying chunks of an ALGOL ABI or something instead, so this may not be the worst possible world. But that doesn't mean we have to like it.)
[1] I absolutely buy the argument that HTTP probably wins out for out-of-process communication.
There's an argument for full type info at an API, but that gets complicated across languages. Things that do that degenerate into CORBA. Size info, though, is meaningful at the machine level, and ought to be there.
Apple originally had Pascal APIs for the Mac, which did carry along size info. But they caved and went with C APIs.
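To make the size-info point concrete, here's a sketch of a hypothetical C API boundary (checksum_v1 and checksum_v2 are made-up names): the first prototype leaks platform-dependent widths into the contract, the second pins them down.

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical API. long is 32 bits on 64-bit Windows but 64 bits on
       Linux/x86-64, so this contract shifts underneath its callers: */
    long checksum_v1(const char *buf, unsigned long len);

    /* Fixed-width types carry the size info in the interface itself: */
    uint64_t checksum_v2(const uint8_t *buf, size_t len) {
        uint64_t sum = 0;
        for (size_t i = 0; i < len; i++)
            sum += buf[i];
        return sum;
    }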
Good read, though. Thinking about C not just as a language but as a protocol is a different perspective, and a useful one for the mental model.
There has to be an ABI that everyone agrees on; otherwise there wouldn't be any interoperability. And if we didn't have the SystemV ABI, what would we use instead? Prepare for a long debate as every language author, operating system designer, and platform under the sun argues for their respective demands and proposals. And as sure as the sun rises in the East, someone, somewhere, would write an article such as this one decrying that blessed ABI.
SystemV shouldn't be the be-all and end-all, IMO. But progress should be incremental, because a lingua franca loses its primary feature and utility when we all return to our own fiefdoms and stop talking to one another in the common tongue.
It's a pain in the metaphorical butt. But it's better, IMO, than the alternatives. It's kind of neat that SystemV works as well as it does, let alone at all.
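One concrete thing that agreement buys (an illustrative sketch, struct name made up): under the x86-64 System V ABI, every conforming compiler lays this struct out identically, which is exactly what lets separately compiled code, and other languages, share it.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    struct wire {
        int32_t a;   /* offset 0, then 4 bytes of padding */
        int64_t b;   /* offset 8: int64_t has 8-byte alignment under x86-64 System V */
    };

    int main(void) {
        printf("offsetof(b) = %zu, sizeof = %zu\n",
               offsetof(struct wire, b), sizeof(struct wire));  /* prints 8, 16 */
        return 0;
    }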
The real protocol in action here is symbol lookup at link time plus the hardware call ABI.
You could always call Rust functions directly, but you'd have to know what symbol to look for and how to craft their parameters, for example (see the sketch below).
If this is well defined, then it's possible. If it's poorly defined or implementation-specific (C++), then yeah, it's a shit show that isn't solvable.
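A minimal sketch of that, assuming a hypothetical Rust crate that exports rust_add with #[no_mangle] pub extern "C". On the C side, the entire contract is the symbol name plus the platform's C calling convention.

    #include <stdint.h>
    #include <stdio.h>

    /* The (hypothetical) Rust side:
         #[no_mangle]
         pub extern "C" fn rust_add(a: i32, b: i32) -> i32 { a + b }
       Built as a static or shared library, it exports the unmangled
       symbol rust_add using the platform's C ABI. */
    extern int32_t rust_add(int32_t a, int32_t b);

    int main(void) {
        printf("%d\n", rust_add(2, 3));  /* resolved by the linker */
        return 0;
    }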
The whole world shouldn't "need to be fixed" because you won't spend the time to learn something.
Rust doesn't even have a stable internal ABI; that's why you have to recompile everything all the time.
That's exactly my case. For my programming language I wrote a C header conversion tool using libclang. And even with this library's help it wasn't easy; I found a lot of caveats trying to convert headers like <windows.h>.
With C++ it's the same: within the Haiku codebase it's halfway understandable, but the full spec is enough to drive you mad within days.
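For reference, the skeleton of such a tool using libclang's C API looks roughly like this (the header path is a placeholder; link with -lclang):

    #include <clang-c/Index.h>
    #include <stdio.h>

    /* Print every function declaration found in the translation unit. */
    static enum CXChildVisitResult visit(CXCursor c, CXCursor parent, CXClientData data) {
        (void)parent; (void)data;
        if (clang_getCursorKind(c) == CXCursor_FunctionDecl) {
            CXString name = clang_getCursorSpelling(c);
            CXString type = clang_getTypeSpelling(clang_getCursorType(c));
            printf("%s : %s\n", clang_getCString(name), clang_getCString(type));
            clang_disposeString(name);
            clang_disposeString(type);
        }
        return CXChildVisit_Recurse;
    }

    int main(void) {
        CXIndex idx = clang_createIndex(0, 0);
        CXTranslationUnit tu = clang_parseTranslationUnit(
            idx, "header.h", NULL, 0, NULL, 0, CXTranslationUnit_None);
        if (tu) {
            clang_visitChildren(clang_getTranslationUnitCursor(tu), visit, NULL);
            clang_disposeTranslationUnit(tu);
        }
        clang_disposeIndex(idx);
        return 0;
    }

The caveats start right past this skeleton: macros, bitfields, packed structs, and calling-convention annotations don't fall out of a simple cursor walk.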
E.g., here, from memory:
> ...you want to read 32 bits from a file but OH NOOES, long is 64 bits! The language! The impossibility!
But when you read something to deserialize some format, you just need to know the width from the format's schema or from domain knowledge. Simple and straightforward, like that! You do not do some "reflection" on what the language standard provides and then expect someone to send you exactly that!!
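A minimal sketch of exactly that: the field width comes from the format, not from whatever long happens to be on the platform.

    #include <stdint.h>
    #include <stdio.h>

    /* Read one little-endian 32-bit field, because the format says the
       field is 32 bits; the platform's long is irrelevant. */
    int read_u32_le(FILE *f, uint32_t *out) {
        unsigned char b[4];
        if (fread(b, 1, 4, f) != 4) return -1;
        *out = (uint32_t)b[0] | ((uint32_t)b[1] << 8)
             | ((uint32_t)b[2] << 16) | ((uint32_t)b[3] << 24);
        return 0;
    }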
So that anti-C "movement" is mostly based on brainless examples.
Not saying C is perfect.
But it is very good, and I bet IBM and other big corps will keep selling things written and actively developed in C/C++, plus hefty consulting fees on top.
In the meantime, the proles have been advised to move to CPU-cycle-eating inferior languages, and to layer upon layer of cycle-burning infra in the cloud, with zero privacy and guaranteed data leaks.
Oh, btw, that famous Java "bean" is just an object built from, usually, language-provided "basic types"... How should that poor programmer from the article know what to read from disk when he only has the types Java provides?? How? Or maybe he should use some domain knowledge or a schema for the problem he's trying to solve??
And in "scripting language" with automatic int's - how to even know how many bits runtime/vm actually use ? Maybe some reflection to check type ? But again how that even helps if there is no knowledge in brain how many bits should be read ?? But calling some cycle burning reflection or virtual and as much as posible indirect things is what fat tigers love the moust :)
We didn't do it to annoy you or to foist bad APIs on you. We did it because it was the best language for writing machine code at the time. By miles. Not understanding why this is true will lead you to make all the same mistakes the languages "bested" by C made.