(I'm aware that Battlefield series and League of Legends won't work due to draconian anti-cheat -- but nobody in my group cares to play those I guess.)
(cue superiority complex) I've been using desktop Linux for over 10 years. It's great for literally everything. Gaming is admittedly like 8/10 for compatibility, but I just use a VM with PCIe passthrough to hand a GPU to a Windows guest for games, CAD, etc. Seriously, ez.
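For the curious, the host side of that boils down to two pieces. A minimal sketch, assuming an Intel box and GRUB; the PCI IDs below are examples, substitute your own from lspci -nn:

    # /etc/default/grub: enable the IOMMU (use amd_iommu=on on AMD)
    GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"

    # /etc/modprobe.d/vfio.conf: have vfio-pci claim the GPU (video +
    # audio function) before the host driver can; IDs here are examples
    options vfio-pci ids=10de:2484,10de:228b

    # rebuild the initramfs, reboot, then verify:
    lspci -nnk -d 10de:2484   # expect "Kernel driver in use: vfio-pci"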
Never had issues with NVIDIA GFX with any of the desktop cards. Laptops... sure they glitch out.
Originally Wine, then Proton, and now Bazzite make it super easy to game natively. The only issues I ever had with games came from the bundled kernel-level anti-cheat: it just wasn't available for Linux, so the games didn't start. Anyone familiar with those knows it's not a Linux thing, it's a publisher/anti-cheat-vendor thing. Just lazy devs, really.
(cue opinionated anti-corporate ideology) I like to keep Microsoft chained up in a VM where it belongs so it can't do its shady crap. Also, with a VM you get shared folders and clipboard sync. Super handy, actually.
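(With libvirt/virt-manager the shared-folder piece is a virtiofs export in the domain XML. A sketch only: virtiofs also wants shared memory backing enabled, and a Windows guest needs the virtio-win drivers; the path and the "hostshare" tag are made up:)

    <!-- export a host directory to the guest via virtiofs -->
    <filesystem type="mount" accessmode="passthrough">
      <driver type="virtiofs"/>
      <source dir="/home/me/vmshare"/>
      <target dir="hostshare"/>
    </filesystem>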
Weirdly enough, macOS in a VM is a huge PITA and doesn't work well.
- F1 2024 didn't load due to anti-cheat
- Dragon's Dogma 2 and Resident Evil 4 had non-functional raytracing
- Cyberpunk 2077 with raytracing on consistently crashes when reloading a save game
- Dying Light 2 occasionally freezes for a whole minute
- Starfield takes 25 minutes to compile shaders on first run, and framerates for Nvidia are halved compared to Windows
- Black Myth: Wukong judders badly on Nvidia cards
- Baldur's Gate 3 Linux build is a slideshow on Nvidia cards, and the Windows build fails for some AMD cards
If you research these games in discussion forums, you can find configuration tweaks that might fix the issues. ProtonDB's rating is not a perfect indicator (Black Myth: Wukong is rated "Platinum").
And while Steve says measurements from Linux and Windows are not directly comparable, I compared anyway and saw that Linux suffers a 10-30% drop in average FPS across the board compared to Windows, depending on the game and video card.
Tried running Worms: instant crash, no error message.
Tried running Among Us: instant crash; had to add cryptic arguments to the command line to get it to run (see the sketch below).
Tried running Parkitect: crashes after 5 minutes.
These three games are extremely simple, graphically speaking, and they don't use any complicated anti-cheat measures. This shouldn't be hard, yet it is.
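For reference, the Among Us "cryptic arguments" went into Steam's per-game Launch Options. I no longer have the exact flags, so treat these as illustrative of the shape rather than a known fix:

    # Steam > game Properties > Launch Options; %command% is Steam's
    # placeholder for the game itself, the rest are example knobs
    PROTON_USE_WINED3D=1 %command% -force-glcore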
Oh and I'm using Arch (BTW), the exact distro SteamOS is based on.
And of course, as always, those for whom it works will tell you you're doing-it-wrong™.
I'd say it pretty much "just works" except less popular apps are a bit more work to install. On occasion you have to compile apps from source, but it's usually relatively straightforward and on the upside you get the latest version :)
For anyone who is a developer professionally I'd say the pros outweigh the cons at this point for your work machine.
(My customer demographic is seniors & casual users).
1. 10bpp color depth is not supported on RGB monitors, which are the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (See: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...).
2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.
The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.
Prior to that, Windows was better on laptops, thanks to proprietary drivers and working ACPI. But it was pretty poor in terms of reliability, the included software was incredibly bare-bones, and the experience of finding and installing software was awful (especially if you didn't have an unlimited credit card for "big professional solutions").
Every time the year of the Linux desktop arrives, I'm baffled, since not much has changed on this end.
It's MY system and I do whatever the heck I want, from playing boring stuff to weird prototyping. I get no, like literally zero, anti-features. I'm not "scared" that an update will limit my agency. I'm just zen, and that is priceless.
Also, quite importantly, it works wonderfully with all my other devices and peripherals. I switch between Bluetooth headsets easily; I switch monitors, video projectors, XR devices, CV camera inputs; I share files with KDE Connect; I receive SMS notifications from my (deGoogled) Android phone and reply from my desktop; I get a notification when my Steam Deck is nearly out of battery; etc. ALL my devices play nicely with each other.
So yes, Linux is good now. It has been for a while, but it's been even better for the last few years.
Debian is a breath of fresh air in comparison. Totally quiet and snappy.
The experience is slowly getting better. There is nothing I haven't been able to get working, though sometimes only with tricks or adjustments.
I think the best bonus is using LLMs in deep-research mode to wade through all the blog posts, Reddit threads, etc. and discover the aforementioned tricks. Before, you had to do that yourself, and it sucked. Now I get 3 good ideas from Claude, ranked by how likely each is to work => 99% of games I get running in 5 minutes with a shell command or two. Lutris is also pretty good.
Omarchy on my laptop has finally made computers fun for me again, it's so great and nostalgic. Happy to be back after my brief work-mandated adventure into MacOS.
However, despite really, really wanting to switch (and having it installed on my laptop), I keep finding things that don't quite work right, and they're preventing me from switching some of my machines. On my living room PC, which my TV is connected to, the DVR software that runs my TV tuner card doesn't quite work right (despite having a native Linux installer), and I couldn't get channels to come through as clearly or as easily. I spent a couple of hours troubleshooting and gave up.
My work PC needs the Dropbox app (which has a Linux installer), but it also needs the "online-only" functionality so that I can see and browse the entire (very large) Dropbox directory without having it all stored locally. This feature has been requested on the Linux version of the app for years, and Dropbox appears unlikely to add it anytime soon.
Both of these are pretty niche issues that I don't expect to affect the vast majority of users. (The Dropbox one in particular shouldn't be an issue at all if my org didn't insist on using Dropbox in a way it is very much not intended to be used, and for which better solutions exist; but I gave up on that fight a long time ago.) And like I said, I've had Linux on my laptop for a couple of years now without any issue, and I love it.
I am curious how many "edge cases" like mine exist out there, though. Maybe a lot of people each have some edge case like this, even though almost no two people share the same one.
After a particularly busy OSS event, a non-programmer friend asked me: why do the Linux people seem so needy for everyone to make the same choices they make? Trying to answer that question changed my perspective on the entire community. And here we are; after all these years the same question still applies.
Why are we so needy for ALL users and use cases to be Linux-based and Linux-centric once we make that choice ourselves? What is it about Linux? The BSD people don't seem to suffer from this, and I've never heard anyone advocate migrating to OSX despite it being superior for specific use cases (like music production).
IMO if you're a creator, operating systems are tools; use the tool that fits the task.
Ubuntu seems to be slowly getting worse.
- Firefox seems to be able to freeze both itself and, sometimes, the whole system. Usually while typing text into a large text box.
- Recently, printing didn't work for two days. A pushed update installed a version of the CUPS daemon that reported a syntax error in the cupsd.conf file. A few days later the problem went away, after much discussion on forums about workarounds.
- Can't use more than half of memory before the OOM killer kicks in. The default rule of the OOM-killer daemon is that if a process holds more than half of memory for a minute, it gets killed. Rust builds get killed. Firefox gets killed. This is a huge pain on an 8GB machine. Yes, I could edit some config file and stop this (see the sketch below), but that tends to interfere with config-file updates from Ubuntu and from the GUI tools.
None of these problems existed a year ago.
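(For what it's worth: the daemon in question on recent Ubuntu is systemd-oomd, and the config fix is a drop-in like the sketch below. Key names are from man oomd.conf; the drop-in file name is arbitrary.)

    # /etc/systemd/oomd.conf.d/relax.conf
    [OOM]
    # only act on sustained, severe memory pressure
    DefaultMemoryPressureLimit=80%
    DefaultMemoryPressureDurationSec=120s

    # or turn the daemon off outright:
    #   sudo systemctl disable --now systemd-oomd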
E.g. three weeks ago NVIDIA pushed bad drivers that broke my desktop after a reboot. I never got into GNOME at all; I had to switch to a console (Ctrl-Alt-F3 etc.) and roll back to an earlier version. Automatic rollback of bad drivers would have saved this.
Are Radeon drivers less shit?
If I have an issue with an application, or if I want an application, I must use the terminal. I can't imagine a Mac user bothering to learn it. Linux is for people who want to maximize the use of their computer without being spied on and without weird background processes. Linux won't die, but it won't catch Windows or Mac in the next 5 decades. People are too lazy for it; forget about learning. I bet you $100 that 99% of people on the street have never even seen Linux, or heard of it. It's not because of marketing; it's because the people who tried it returned to Windows or Mac after deciding that installing a driver or an application was too hard to learn.
One big plus with Linux, it's more amenable to AI assistance - just copy & paste shell commands, rather than follow GUI step-by-steps. And Linux has been in the world long enough to be deeply in the LLM training corpuses.
The Linux world is amazing for its experimentation and collaboration. But the fragmentation makes it hard for even technical people like me who just want to get work done to embrace it for the desktop.
Ubuntu LTS is probably the right choice. But it's just one more thing I have to go research.
I've installed Windows on all the PCs I've built for home and work over the last 20 years or so, until my latest in October. It was the ads in the lock screen that pushed me over the edge. Why should I pay for a license for that? Double-dipping fools. Am happily running Bazzite now.
Further, because of Linux's security model, kernel-level anti-cheats are basically impossible. As someone who hates cheating, I don't play online games anymore without kernel-level anti-cheat. They're not perfect, but they're much better than anything available on Linux.
Further, I use a Fanatec racing wheel. Most peripherals just aren't supported on Linux. It's chicken-and-egg, and hardware manufacturers aren't going to bother until Linux reaches critical mass. That is decades away still.
I've been using Linux full-time (no other OSes at all) for nearly 20 years. Went through all my university education using only Linux. It's problem free if you use it like a grandma would (don't mess with the base system) and even if you mess with it, most things are easily reversible.
That being said, I have noticed that the newfound interest in Linux seems to be a result of big tech being SO abusive towards its customers that even normies are suddenly into computing "freedom".
It's a slow moving evergreen topic perfect for a scheduled release while the author is on holiday. This is just filler content that could have been written at any point in the last 10 years with minor changes.
The first time I switched from the generic kernel to the asus one, it was magic. I know asus-linux exists, and following the instructions would probably have ended in a working system, but with Bazzite I ran only one command and everything worked. It still feels weird not to monkey around with package installations (which was a dangerous path anyway, usually ending in more work for me), but this is a tradeoff I can live with. The software I use has, luckily, already moved to Flatpak, so everything was a breeze. Also, the fact that I can switch back to a working state with one keypress is a stress reliever.
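(That one keypress is just picking the previous deployment in the boot menu; from a running system, the equivalent on rpm-ostree-based images like Bazzite is roughly:)

    # boot back into the previous known-good image; -r reboots right away
    rpm-ostree rollback -r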
I agree. Linux is good now, for the common user. I still can't see immutable distros working for every scenario, but for gaming/home use this is an approach I can easily recommend to friends and family who just want a computer that works without messing with a console.
One thing that can be annoying is how quickly things have moved in the Linux gaming space over the past 5 years. I have been a part of conversations with coworkers who talk about how Linux gaming was in 2019 or 2020. I feel like anyone familiar with Linux will know the feeling of how quickly things can improve while documentation and public information cannot keep up.
Windows 7 was nice, but since then, I struggle to think of anything. Since 7, it feels like it's just been upsell after upsell, features that are about Microsoft selling me on something or annoying the crap out of me rather than providing actual utility.
I fully switched to GNU/Linux back then and have never looked back. Initially I was quite evangelical but got tired of it and gave up probably around 10 years ago thinking "oh well, their loss". But slowly more and more of the world has switched over, first servers, then developer workstations and now finally just "normal" users.
Similarly, I've always been hugely invested into my tools and have a strong distaste for manual labour. I often watch how others work and can't believe how slow and inefficient it is. Typing up repetitive syntax every time, copy/pasting huge blocks of code, performing repetitive actions when booting their PC etc. I simply haven't been doing this for my whole career, I've been writing scripts, using clever editors, using programming languages to their fullest etc.
I think this is why LLMs don't seem like such a huge breakthrough to me as they do to others. I wasn't doing this stuff manually before; that's ridiculous. I don't need to generate globs of code because I already know how to get the computer to do it for me, and I know how to do that in a sustainable and maintainable way too. It's sad that LLMs are giving people their first real sense of control, when it's been available for a very long time now, and in a way that you can actually own, rather than paying for a service that might be taken away at any moment.
And it mostly works! At least for my games library. The only game I wasn't able to get to work so far is Space Marine 2, but on ProtonDB people report they got it to work.
As for the rest: I've been an exclusive Linux user on the desktop for ~20 years now, no regrets.
So far all the games I want to play run really well, with no noticeable performance difference. If anything, they feel faster, but it could be placebo because the DE is more responsive.
I tried Cinnamon, and while it was pleasantly customizable, the single-threadedness of the UI killed it for me. It was too easy to do the wrong thing and lock the UI thread, including with several desktop or tray Spices from the official repo.
I'm switching to KDE. Seems peppier.
The biggest hardware challenge I've faced is my Logitech mouse, which is a huge jump from the old days of fighting with Wi-Fi and sound support. Sound is a bit messy: you get a plethora of audio devices that would be hidden under Windows (like digital and analog options for each device), and occasionally a game's digital-vs-analog compatibility will be flaky, but I'll take it.
The biggest hassle, IMHO, is still installing non-repo software. So many packages offer a Flatpak and a Snap and build-from-source instructions where you have to figure out the local package names for each dependency, plus one .deb for each version of Debian and its derivatives, and it's just so tedious to figure out which is the right one.
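(When a Flatpak exists, it at least sidesteps the per-distro .deb matching entirely; the app ID below is a made-up placeholder:)

    # one-time remote setup, then install by app ID; same on every distro
    flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.flatpakrepo
    flatpak install flathub org.example.SomeApp   # hypothetical ID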
Ubuntu’s default desktop felt unstable in a macOS VM. Dual-booting on a couple of HP laptops slowed to a crawl after installing a few desktop apps, apparently because they pulled in background services. What surprised me was how quickly the system became unpleasant to use without any obvious “you just broke X” moment.
My current guess: not Linux in general, but heavy defaults (GNOME, Snap, systemd timers), desktop apps dragging in daemons, and OEM firmware / power-management quirks that don’t play well with Linux. Server Linux holds up because everything stays explicit. Desktop distros hide complexity and don’t give much visibility when things start to rot.
Does this line up with others’ experience? If yes, what actually works long-term? Minimal bases, immutable distros, avoiding certain package systems, strict service hygiene, specific hardware?
Oh, and also anti-cheat games forcing me to use Windows. It makes me sick to my stomach booting into Windows 11 every couple of months and having to watch my PC's performance tank while it downloads updates, runs Windows Defender scans, etc. for 30 minutes.
I opted to install Linux in a VM under Hyper-V on Windows to avoid hassles with the dual GPUs in my ThinkPad P52, but this comes with several other hassles I'd like to avoid. (Like no GPU access in Linux at all...)
Linux still suffers from the same fragmentation issue. Oh, you want to play games? Use distro X. Average web browsing and office work? Distro Y. Programming? Distro Z. Of course each of them can do what the others can, but the community has decided that's the way it is.
Yesterday I read a Reddit thread about a user sharing his issue with Pop!_OS, and most (if not all) comments said he was using the wrong distro. He was using the latest release (not the nightly build), which is a reasonable thing to do as a new user.
Not sure if Linux Mint has changed this, but I remember having to add the "non-free" repo to use the official Nvidia driver. Not a big deal for people who know what they are doing, but still, it's unnecessary friction.
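(On plain Debian that dance looks roughly like this; a sketch, with the codename and component list depending on your release:)

    # /etc/apt/sources.list: add the non-free components
    deb http://deb.debian.org/debian bookworm main contrib non-free non-free-firmware

    # then pull the packaged driver
    sudo apt update && sudo apt install nvidia-driver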
So I installed Fedora on my work machine and find that I can still get all of my work done. Well except the parts that require testing accessibility on Windows screen readers or helping with Windows-related issues.
The only thing I miss now are the many addons made for NVDA, especially the ones for image descriptions. But if I can get something to work with Wayland, I could probably vibe code some of them. Thank goodness for Claude Code.
The success measurements are quite strange. How am I supposed to think Linux is finally good when 96.8% of users don't care to adopt it? I can't think of anything else with that high a rejection rate. The vast majority do not consider it good enough to use over Windows.
Can I get a laptop to sleep after closing the lid yet?
Not that long ago the answer to these questions was mostly no (or sort of yes... but very painfully)
On Windows all of this just works.
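(For what it's worth, the knob that is supposed to make lid-close suspend work lives in logind; whether it actually suspends reliably on a given laptop is another story:)

    # /etc/systemd/logind.conf
    [Login]
    HandleLidSwitch=suspend
    HandleLidSwitchExternalPower=suspend

    # apply the change:
    sudo systemctl restart systemd-logind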
On one hand we have Steam, which has made thousands of games available on an easy-to-use platform based on Arch.
For developers we have Omarchy, which makes the experience much more streamlined, pleasant, and productive. I moved both my desktop and my laptop to Omarchy and kept one Mac laptop. It's a really good experience; not everything is perfect, but when I switch to the Mac after Omarchy, I keep discovering how hard the Mac is to use and how many clicks it takes to do something simple.
I think both Microsoft and Apple need some serious competition. And again, this is coming from Arch, which turned out to be more stable and serious than Ubuntu.
People dual boot SSD OS for very good reasons, as kernel permutation is not necessarily progress in FOSS. Linux is usable by regular users these days, but "Good" is relative to your use case. YMMV =3
The ONLY thing I'm still having trouble with under Linux is Steam VR on the HTC Vive. It works. Barely.
People are going to be forced to leave it more and more unless they make a drastic about-turn.
I can't move until the software I've invested in moves too. There are no Linux alternatives.
Wayland took a decade to become mostly usable, and it still has rough edges; the Flatpak sandbox is really rough, and most things feel designed by amateurs.
Still, Windows destroying itself has made the gap closer than ever; right now is a great chance to gain market share, funding, and professionals.
What Linux distros are recommended for gaming? All my games are on Steam. Dual-monitor setup, with the game on the main screen and a browser on the second one.
Windows for gaming, Ubuntu as desktop default, Arch Linux on Laptop, MacOS on the other Laptop.
Games in general are not the problem; that one game I want to play on a Saturday evening when I have time is the problem: the one I haven't tried out yet.
I think it will probably change at some point, but until then I just can't use it.
Despite this, Linux as an ecosystem has numerous problems. The "Wayland is the future" thing annoys me a lot. The Wayland protocol was released in 2008; that's almost 20 years ago now. I don't feel Wayland is ever going to win a "Linux desktop of the year" award. Things that work on xorg-server still do not work on Wayland, and probably never will. I am not saying Wayland is useless; I ran it for a while on KDE (though I don't actually use KDE, I typically use IceWM), but it is just annoying how important things like the GUI on Linux simply suck. In many ways Linux is a server operating system, not really a desktop one. I use it as a desktop, but the design philosophy caters much more to the server or compute-workstation objective.
Also, GTK... this thing keeps getting worse with every new release. I have no idea what they are doing, but I have an old GTK2-based editor, and it consistently works better than the GTK3 or GTK4 ported versions. It was a huge mistake to downgrade and nerf GTK into a GNOME-only toolkit. Don't even get me started on GNOME...
Linux is not good. Some hardware support is still reverse-engineering-based, or rests on a few individuals' best-effort activity. Linux needs manufacturers' first-hand commitment to quality open source to be truly good.
Linux is not good. Some software lacks feature parity across operating systems, with Linux as the software kingdom's poor Cinderella. Linux needs software feature parity to be truly good.
Linux is not good, because too many mainstream new PCs come with some other operating system pre-installed (and paid for) even if you won't need it. Linux needs freedom of choice from the first power-on, like a stub to download whatever OS you want (and pay for) or to boot from removable media, to become truly good.
Linux is not good, because there is still "stuff" that requires some specific non-Linux software running under some specific non-Linux operating system to do useful things. We need manufacturers to ditch this for Linux to become truly good.
I have been a happy user of Linux on my primary PC for 20+ years now. But I still have to fight for my freedom every now and again because of one or more of the above points.
Now let's jump back to gaming.
Linux is not good, because the game industry thinks proprietary platforms and operating systems are better for their business. There is only one platform fully supporting Linux, and too few titles. Gaming Linux barely hits 5% of the market share, basically the same as desktop Linux, while server Linux is beyond 75%.
I think the reasons could be two-fold.
On one hand, Linux is not perceived by the industry as being as attractive as the proprietary platforms. Maybe the industry can squeeze much more money from the latter.
On the other hand, it could be that most of the development resources are NOT ORIENTED towards gaming and desktop, so these markets simply lag behind.
Of course, I could be totally wrong: these are my humble opinions with some facts backing them.
Live Long, Linux!
Why are you holding computing to the same standard?
I'm curious as a non-gamer. The article seems to be entirely about Windows gaming.
IMO the next important unblocker for Linux adoption is the Adobe suite. In a post-mobile world one can use a tablet or phone for almost any media consumption, but production is still in the realm of the desktop UX, and photo/video/creative work is the most common form of output. An Adobe CC option for Linux would enable that set of "power users". And regardless of their actual percentage of desktop users, just about every YouTuber or streamer talking about technology is by definition a content creator, so opening Linux up to them would have a big effect on adoption.
And yes I've tried most of the Linux alternatives, like GIMP, Inkscape, DaVinci, RawTherapee, etc. They're mostly /fine/ but it's one of the weaker software categories in FOSS-alternatives IMO. It also adds an unnecessary learning curve. Gamers would laugh if they were told that Linux gaming was great, they just have to learn and play an entirely different set of games.
I've used Mint in the past and loved it, until I spent a day trying to get scanner drivers to work. Don't know if that's changed now; that was 4 years ago.
Yes, you can get this stuff working, but if you enjoy doing other things in life, have a job, and don't live alone, it is SSSOOOOO much easier to get a Mac mini. Or even Windows 11, if that's your thing.
It's funny they would choose this phrasing.
This is exactly the way I described my decision to abandon windoze, and switch to linux, over 20 years ago...
We've reached a point where Microsoft greed and carelessness is degrading Windows from all angles. With the constant forced Copilot, forced sign-ups, annoying pop-ups and ads, it is figuratively unusable; in the case of machines stuck on Windows 10 it is literally unusable.
They are now banking entirely on a captive market of Enterprise customers who have invested too much to quit. The enshittification is feature complete.
The only sane ways: either a 'correct' set of native ELF/Linux binaries, or Proton at $0 (namely only free-to-play, with 0 cents in any microtransactions).
And if you are running Chrome, and something starts taking a lot of memory, say goodbye to the entire app without any niceties.
(Yes, this is a mere pet peeve, but it has been causing me so much pain over the past year, and it's a far inferior way to deal with memory limits compared to what came before it. I don't know why anybody would have taken OOM logic from systemd services and applied it to user-launched processes.)
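(One workaround I've seen, assuming systemd-oomd is indeed the killer: start the browser in its own scope with oomd management set back to auto. ManagedOOMMemoryPressure is a real unit property, but whether this actually exempts Chrome on a given setup is worth verifying:)

    # run Chrome in a dedicated user scope that oomd should leave alone
    systemd-run --user --scope -p ManagedOOMMemoryPressure=auto google-chrome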
You know this is a psyop when they can't resist mentioning Microsoft's product
Horrendous UX.
Not up close due to the vast number of inconsistencies.
This could only be fixed by a user experience built from the ground up by a single company.
We need to address the problems rather than pretending it's already great.
Shutting down your laptop and having to wait five minutes for systemd to shut down because of some timeout, when you need to catch your flight, is just one of those reasons you end up going back to Windows.
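(That wait is almost always a stop-job timeout; the default cap can be lowered with a drop-in. A sketch; the file name is arbitrary:)

    # /etc/systemd/system.conf.d/stop-timeout.conf
    [Manager]
    # don't let one hung service hold shutdown for the default 90+ seconds
    DefaultTimeoutStopSec=15s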
I also play a decent amount of Flight Simulator 2024 and losing that is almost a non-starter for switching.
1. Look at commercial desktop OSes (Windows, MacOS). They spend hundreds of millions to develop and maintain the OS, do updates, quality assurance testing, working with hundreds of thousands of hardware vendors and enterprises, etc, just to try to not break things constantly. This is with "an ecosystem" that is one stack developed by one company. And even they can't get it right. Several Linux-Desktop companies have tried to mimic the commercial companies by not only custom-tailoring their own stack and doing QA, but sometimes even partnering on certified hardware. They're spending serious cash to try to do it right. But still there's plenty of bugs (go look at their issue trackers, community forums, package updates, etc) and no significant benefit over the competition.
2. There is no incentive for Linux to have consistency, quality, or a good UX. The incentive is to create free software that a developer wants. The entire ethos of the OSS community is, and has always been, I want the software, I make the software, you're welcome to use it too. And that's fine, for developers! But that's not how you make something regular people can use reliably and enjoyably. It's a hodge-podge of different solutions glued together. Which works up to a point, but then...
3. Eventually the Linux desktop reaches a point where it doesn't work. The extra buttons on the new mouse you bought don't work. Or the expensive webcam you bought can't be controlled because it requires a custom app only shipped on Windows/Mac. Or your graphics card vendor uses proprietary firmware blobs causing bugs only on Linux, for unknown reasons. Or your speakers sound like crap because they need a custom firmware blob loaded by the commercial OSes. Or your touchscreen can't be enabled/disabled because Wayland doesn't support the X extensions that used to make that work with xrandr. Or you need to look up obscure bootloader flags, edit the bootloader, and restart to enable/disable some obscure fix for your hardware (LCD low-power settings, ACPI, disk controller, or any of a thousand other issues). Or, quite simply, the software you try to install just doesn't work; random errors pop up, things don't work, and you don't know why. In any of these cases, your only hope is... to go on Reddit and ask strangers for help. There's no customer support line. Your ISP ain't gonna help you. The Geek Squad just shrugs. You are on your own.
And this is the most frustrating part... the extremely nerdy core fan-group, like those on HN or Reddit, who are lucky enough not to be experiencing the problems unique to Linux, gaslight you and tell you all your problems are imagined or your fault.
I switched full time to Linux in 2022, when Elden Ring launched and had better performance in its first week on Linux than on Windows. I personally went with KDE Plasma powered by Arch. The first thing I noticed was how polished it was immediately. So I'd push the title of the article one step further: Linux has been good for a while.
In the four years since switching over from a dual setup of Windows (for gaming) and Mac (for programming, web browsing, and everything else), I have had an almost entirely better experience in every single area. I still use macOS daily for work, and it constantly drives me mad. For every task I have thrown at it (from gaming to programming to game dev to photo editing), Linux just works.
On Mac it's more like, "there's an app for that". I have third-party package managers on Mac. I use a third-party app just to display whether my internet connection is using Ethernet. It yells at me to delete the CSV file I created, and you practically need an instruction manual for the Settings app, with instructions that have changed three times in three years, just to open the file, add Bluetooth to the menu bar, etc. It even had a permanent red badge on Settings about not being signed into an Apple ID; once I signed in, it became a permanent red badge about paying for AppleCare. My parents have made comments about how they're worried that, as they get older, they won't be able to keep up with the constant updates and changes to macOS and iOS.
I don't have much to say about Windows besides good riddance. It was far less confusing to use than macOS but was filled with too much bloat and too many pop-up notifications.
The final thing I'll mention is that the first time my girlfriend used my computer, she sat down, opened the browser, and completed her task. She thought that she was using Windows and was able to navigate the new interface without having to spend any time learning anything. For her regular use case of using the PC for an internet browser, Linux just worked. She even asked me afterwards to install it on her laptop to replace Windows! I can't believe we're in a world where that's asked by someone non-technical who just wants a computer to get out of their way so that they can perform their tasks.
As many have pointed out, the biggest factor is obviously the enshittification of Microsoft. Valve has crept up in gaming. And I think it's understated how incredibly nice the tiling WMs are. They really do offer an experience that is impossible to replicate on Mac or Windows, both aesthetically and functionally.
Linux, I think, rewards the power user. Microsoft and Apple couldn't give a crap about their power users. Apple has seemed to devolve into "Name That Product Line" fanboy fantasy land and has lost all but the most diehard fans. Microsoft is just outright hostile.
I'm interested to see what direction app development goes in. I think TUIs will continue to rise in popularity. They are snappier and overall a much better experience; in addition, they work over SSH. There is now an entire overclass of power users who are very comfortable moving around different servers in a shell. I don't think people are going to want to settle for AI SaaS cloudslop after they get a taste of local-first, and when they realize that running a homelab is basically just Linux, I think all bets are off as far as which direction "personal computing" goes. Also standing firmly in the way of total SSH app freedom are the iPhone and Android, which keep pushing that almost tangible utopia of amazing software frustratingly far out of reach.
It doesn't seem like there is a clear winner in the noob-friendly distro category; they all seem pretty good. The gaming distros seem really effective. I finally installed Omarchy, having thought "I don't need it, I can rice my own Arch", etc., and I must say the experience has been wonderful.
I'm pretty comfortable at the "cutting edge" (read: with all my stuff broken), so my own tastes in OS have moved from Arch to the systemd-free Artix, or OpenBSD. I don't really see the more traditional "advanced" Linuxes like Slackware or Gentoo pulling much weight. I've heard interesting things about users building declarative Nix environments, and I think that's an interesting path. Personally, I hope we see some new, non-Unix operating systems that are more data- and database-oriented than file-oriented. For now, OpenBSD feels very comfortable: I have a prayer of understanding what's on my system, and I learn things by using it, the latter of which is also a feature of Arch. The emphasis on clean and concise code is really quite good, and serves as a good reminder that for all the "memory safe" features of the new languages, it's tough to beat truly great C developers for code quality. If you're going to stick with Unix, you might as well go for the best.
More and more I find myself wanting to integrate "personal computing" into my workflow, whether that's apps made for me and me alone, Emacs Lisp, custom Vim plugins, or homelab stuff. I look with envy at the Smalltalks of the world, like Marvelous Toolkit, the Forths, or the Clojure-based Easel. I really crave fluency: the ability for code to just pour out, with none of the hesitation or system-knowledge gaps that come from Stack Overflow or LLM use. I want mastery. I've also become much more tactical about which code I want to maintain. I've really tried to curb "not invented here" syndrome, because eventually you realize you aren't going to be able to maintain it all. Really, I just want a fun programming environment where I can read Don Knuth and make wireframe graphics demos!
Instead of distro upgrades, spend 3 minutes disabling the newest AI feature using regedit.
But, as the author rightly notes: It's more about a "feeling." Well then, good luck.
Please revert this submission to use the correct title.