We're nowhere near there yet, but we're closer than we've ever been, and things keep moving in the wrong direction.
I would also suggest there is another user base: people who have been using computers since before GUIs existed, who are fed up with fighting malware and welcome the protection of a sandboxed, protected system, but who don't understand the importance of keeping the option to escape the sandbox. These users might not feel the loss of being unable to install a kext on macOS without booting into Recovery Mode. But they will notice the loss when, at some point, we can't run anything unsigned on any platform.
Google and Microsoft are slowly moving towards the Apple model because it works, at least as far as decreasing support costs goes.
When the day comes that there isn't any hardware we can purchase on which we can install OpenBSD/Linux/whatever we want, it will be too late. We have to push back before then somehow.
There are more: AROS, GNU Hurd, and others.
You can always contribute code, maintain an app, or report a bug.
You can buy hardware to run AOSP, like a Raspberry Pi or a RISC-V board.
We are the consumers; we have the wallets.
After that, certified, locked-down Big Tech "Personal Computing" will be the only choice on the menu.
So you’ll still be able to write code and scripts and play on the side on your laptop, but if you want to access your bank's webpage (or really, anything served from someone else's server: streaming media, the news, porn, whatever), you’ll be forced into Chrome + a laptop with a TPM + authentication through a smartphone app.
Not ideal.
Apart from the viruses, none of the above is true any more. Apple doesn't care if you're getting screwed over by an app, and neither does Google. If they can increase their profits by taking away our freedom and/or control over "our" devices, then it WILL happen, as sure as death and taxes.
Linux.
Protecting 1 million grannies is an entirely different risk class than the security implications of stopping everyone from using their devices as they see fit.
Protecting 1 million grannies means everyone loses the ability to install apps that:
- allow encrypted chat
- allow use of privacy-respecting software
- download art/games/entertainment that unelected parties deem inappropriate
- organize protests and track agents of hostile governments
- oppose the monopolistic hold of controlling parties
Using Linux is also not a real choice. To access my bank and health services in my country, I need a mobile device that is remote-attested by either Apple or Google, which are American companies. Hell, it's getting closer to reality that playing online video games requires remote attestation, either to "prevent" cheating or for age verification.

Thus the risk widens to the sovereign control a nation has over its own services. A US president could attempt to force Google and Apple to shut off an entire nation's access to its banks and health services. Merely the threat could give them leverage in whatever negotiations they might be in. For some nations, I imagine the controlling power may eventually be China.
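To make the mechanism concrete, here's a rough sketch of what such an attestation gate looks like from the service's side. This is an illustration only, not any real bank's code: the verdict field names follow Google's documented Play Integrity verdict format, the function names are made up, and a real service would first have to verify the token's signature against Google's keys.

    // Simplified sketch (TypeScript) of a server-side attestation gate.
    // `verdict` is assumed to be the already-decrypted payload of a
    // Play Integrity token.
    interface IntegrityVerdict {
      deviceIntegrity?: {
        deviceRecognitionVerdict?: string[];
      };
    }

    // Only devices that pass Google's checks carry this label; a rooted
    // phone, a LineageOS build, or a Linux laptop never will, no matter
    // what its owner does.
    function isAllowedDevice(verdict: IntegrityVerdict): boolean {
      const labels = verdict.deviceIntegrity?.deviceRecognitionVerdict ?? [];
      return labels.includes("MEETS_DEVICE_INTEGRITY");
    }

    function handleLogin(verdict: IntegrityVerdict): Response {
      if (!isAllowedDevice(verdict)) {
        // The policy lives server-side: nothing you configure on your
        // own machine can change which branch you land in.
        return new Response("Unsupported device", { status: 403 });
      }
      return new Response("Welcome", { status: 200 });
    }

The point is that the allow/deny decision rests on hardware and keys you don't control, which is exactly why "just use Linux" doesn't get you into your bank.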
I think the real regulatory solution here is to break up monopoly practices. While the EU's DMA is all well and good in some ways, the EU is also pushing Chat Control... In a more fragmented market it becomes impossible for a bank or health service to mandate specific devices for access (they would lose potential customers), so you could theoretically move to a device that doesn't do draconian-style remote attestation that breaks the moment you go off the ranch. We need surgically precise regulatory tools rather than sweeping legislation, tools that keep alternatives like Linux or FreeBSD actually viable. A fragmented market also makes it much harder for that same legislative body to enforce insane ideas like Chat Control.
The answer is not to protect users from themselves. The answer is more freedom, with a legal framework that gives all users more choices while helping victims obtain restitution.
There exists no path where a publicly traded company doesn't eventually view its customers as subjects. Every business school on the planet teaches its students strategies and tactics that squeeze customers in pursuit of maximizing revenue. And those strategies and tactics often come at the expense of creativity, ethics, and community. Just last week, people's beds didn't work because the company that makes them architected things such that it has absolute control.
Only a reasonably altruistic private company might buck the trend. But publicly traded companies are allowed, by the government(s), to use their scale in a predatory fashion to prevent competition. They bundle and bleed and leverage every step of the way. They not only contribute to the politicians that do their bidding, they are frequently asked to write the laws and regulations they're expected to follow. Magically, this has the effect of raising the cost for competitors to enter the markets they dominate. And so, the odds of an altruistic private company emerging from that muck are low.
Worse still, many of the elected officials (and bureaucrats) actively own stock in the very companies they are responsible for regulating. Widespread corruption and perversion of the market are the inevitable result.
I'm trying to do a better job of redirecting my money to places that better reflect my values. It's not even a drop in the bucket, but it's a lever where I feel like I have a measure of control.
Probably won't help, but it is something.
Not for tablets or game consoles though.
The killer app for jailbreaking is usually running unlicensed games.
I remember seeing that KDE and GNOME already have their own "stores"; we need to keep a close eye on Linux.
What would you include?
Computers nowadays are so weird.
We all now live with the blowback from that decision. Most people, even here on HN, don't realize that actually secure computing is now a possibility.
This general insecurity means that anything exposed to the raw internet will be compromised, and therefore significant resources must be expended to manage it and to recover after any incidents.
It's no wonder that most people don't want to actually run their own servers. Thus we give up control, and this... situation... is the result.
This is historically inaccurate. All console games were originally produced in-house by the console manufacturers, but then four Atari programmers got wind that the games they wrote made tens of millions of dollars for Atari while the programmers were paid only a relatively small salary. When Atari management refused to give them a cut, they left and formed Activision, which thus became the original third-party console game developer. Atari sued Activision for theft of trade secrets, because the Activision founders were all former Atari programmers. The case was settled, with Atari getting a cut of Activision's revenue but otherwise allowing Activision to continue developing console games. I suspect that was because Atari considered the four programmers irreplaceable (a realization that came too late, after they had already quit).
The licensing fee business model was a product of this unique set of circumstances. The article author's narrative makes it sound like consoles switched from open to closed, but that's not true. The consoles (like the iPhone) switched from totally closed to having a third-party platform, after the value of third-party developers was shown.
> Consumers loved having access to a library of clean and functional apps, built right into the device.
How can you say they're "built right into the device" when you have to download them? Moreover, you were originally able to buy iPhone apps in iTunes for Mac, and manage your iPhone via USB.
> Meanwhile, they didn’t really care that they couldn’t run whatever kooky app some random on the Internet had dreamed up.
I'm not sure how you can say consumers didn't really care. Some people have always cared. It's a tradeoff, though: you would have had to care enough not to buy an iPhone altogether, which is not the same as not caring at all. Also, remember that for its first year, the iPhone didn't even have third-party apps.
> At the time, this approach largely stayed within the console gaming world. It didn’t spread to actual computers because computers were tools. You didn’t buy a PC to consume content someone else curated for you.
I would say this was largely due to Steve Wozniak, who insisted that the Apple II be an open platform. If Steve Jobs, who always expressed contempt for third-party developers, had originally had his way, the whole computing industry might have been very different. Jobs considered them "freeloaders", which is of course ridiculous (VisiCalc, for example, was responsible for much of the Apple II's success), but that was his view.
None of what was written in the rest of the article after this statement has any bearing on it. Sure, they mentioned the "Microsoft Store", but aside from that, you still have the freedom to run whatever software you want on your own desktop computer, laptop, or server (Linux, Windows, or Macintosh) ... nothing has changed about this. I, for one, like the increased security on mobile devices. As far as gaming goes, I am not a gamer, so I just do not care.
> Apple sold the walled garden as a feature. It wasn’t ashamed or hiding the fact—it was proud of it... The iPhone’s locked-down nature wasn’t a restriction; it was a selling point.
Please, write as a human; I promise you it's good enough. I'd much rather read something that's a bit clunky but human-written than something very polished that leaves me wondering what the author was actually trying to say.
Respect your reader, but most importantly, respect yourself as a writer too.
I don’t like that governments are forcing companies to open their environments up to random code. I wish they would instead put legislation in place requiring transparent vetting processes and allowing different kinds of apps.
In general I think software engineers get away with things no real engineering job gets away with, and it baffles me.
As much as I want to agree with this author (and do, to an extent) they are also providing the exact and honestly-pretty-good reasons for why this is happening: computers have breached containment, and they did it a long time ago. Computers are not just for us weird nerds anymore and they haven't been for some time; they're tools for a larger, more complicated, more diverse userbase, many of whom are simply not interested in learning how to computer. They just want shit to work, reliably. Random software on the Internet is not a path to reliability if you also don't know how your thing actually works.
I mourn this too, but let's not pretend it happened simply because corporations are evil (though they are certainly that).
I am allowed to own multiple computers. Many people do. I've got a Linux handheld, a Windows desktop, an iPhone, and a MacBook, all with varying degrees of freedom and function. I don't feel constrained right now.
HDCP is an example of the other thing in my mind. It adds zero value to anyone's experience. Any potential value add is hypothetical. You can't survey a person after they watch an unprotected film and receive a meaningful signal. It's pure downside for the customer. There's no such thing as competitive Netflix lobbies.
If I want to run arbitrary code, I'll do it on my Windows box or fire up a Linux VM in the cloud somewhere. I don't need weird problems on my phone. If you are trying to touch all platforms at once, try using the goddamn web. I've been able to avoid Apple enterprise distribution hell with a little bit of SPA magic and Intune configuration for business customers. For B2C I just don't see it anymore. You need to follow the rules if you want to be in the curated environments.