I wish Linux was more mature. Even as a systems and network administrator with 10+ years of experience working with both Linux and Windows in an enterprise environment, my private desktop Linux installs still occasionally bork themselves for no good reason and require a reinstall. Linux just doesn't like it when you do stuff with it.
The only thing borking my system is Nvidia not keeping up with openSUSE Tumbleweed kernels.
But I haven't encountered such issues on distros with fixed releases, such as Debian or Fedora. In my experience, unless you modify system stuff it's very reliable.
My uptime is 60 days, and that's with running updates. In my experience, the people with the worst Linux experience are those who are skilled with Windows, because they keep trying to do things the Windows way.
The last time I used Windows as my OS was Windows 2000. I went through multiple things (BeOS, SUSE Linux (before it was openSUSE, I think), Red Hat Linux, FreeBSD, Ubuntu...) until I landed on macOS.
But all the bullshit Apple did to unify tablets with laptops, their lack of thoroughness with Git, OpenGL, etc., and all their problems with package distribution and their "App Store" made me switch back to Linux.
I searched for the most Linux-friendly laptop on the market and bought a ThinkPad X1 Carbon.
Then I spent the first month trying to make my microphone work and stop my audio from crackling, learning a ton about ALSA/PulseAudio in the process.
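For anyone fighting the same crackling: one common culprit is PulseAudio's timer-based scheduling, which can be disabled in its config. A minimal sketch, assuming a stock /etc/pulse/default.pa (back it up first):

```
# /etc/pulse/default.pa
# Replace the existing module-udev-detect line with this one to disable
# timer-based scheduling (tsched), a frequent cause of crackling audio:
load-module module-udev-detect tsched=0
```

Then restart the daemon with `pulseaudio -k` and check whether the crackle is gone.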
IMO Linux works well when you ace the hardware choice.
I thought ThinkPads worked perfectly with Linux. I bought a fairly new HP laptop and it worked fine out of the box. The only issues were battery life, which I fixed by installing auto-cpufreq (it seems to work better than TLP); the fingerprint scanner, which uses a proprietary system instead of doing it the standard way; and screen rotation: it doesn't detect when I flip the display around (it's one of those convertibles you can use like a tablet), even though it does deactivate the keyboard when I do. Everything else works perfectly fine.
That last line is the condition for making Linux work. Like Hackintoshes, it's very hardware-specific, and switching over to Linux means an average user has to make concessions.
E.g., Nvidia users have to concede that some of the features normally available to them on Windows will not work on Linux, and accept inferior driver support.
I second this. People usually recommend Ubuntu to beginners, which I can somewhat understand because it's super easy to get started with. But the downside is that you'll most likely stay a beginner and never learn the absolute basics of a Linux-based OS, because, well, most of the time you don't have to. Then you make a beginner's mistake once, and there you go.
I don't get why people even recommend Ubuntu anymore. There's other beginner friendly distros like Mint that don't have a company behind them that develops proprietary software no one wants and then tries to get everyone to use it.
That is not true. Getting into literally any Linux distro, unless it is a Fisher Price kiosk, allows you to learn plenty. My entry point was the legendary Ubuntu 16.04 LTS 6 years ago. If some elitist or horrible teacher or idiot told me to use Arch, Gentoo, or their favourite BS distro, I would have stayed a Windows NPC forever. Clearly, if you look at my knowledge, I would not have deserved it.
Ubuntu has the greatest community support, supports all kinds of install packages, and has extremely good security and update support. It is the best entry-point distro, and even Mint is harder than Ubuntu, since any distro that requires you to do anything undocumented or barely documented is a turn-off for beginners.
I have a Linux/Windows computing guide aimed at transitioning and/or dual-boot usage, based on almost two decades of experience: https://lemmy.ml/post/511377
They are, if they recommend things to beginners without considering their perspective. Their own perspective on matters simply does not matter. This is why elitists and horrible teachers exist.
That's me spending 30 minutes trying to figure out how to change hotkeys in Windows, being told that I need to install an "application", realizing said application can moonlight as a keylogger so I end up uninstalling the whole thing and using proton/VMs instead.
Either that or requiring some esoteric registry changes that are gibberish but supposed to do what I'm looking for.
I must be lucky. I've been using Linux (Debian, then Ubuntu, then PCLinuxOS, then back to Kubuntu) since approximately 2002. I don't remember ever having to reinstall my OS because an application borked on install or otherwise. Reboot, maybe, but it was normally fixable. I have been annoyed at my favorite apps disappearing in a new release and having to change my workflow, but that's about it.
Even all the pain I had to go through to get X11 working correctly in the early days didn't require reinstalls.
I've got a server running for 15 years straight with minimum changes beyond security patching.
For desktop use, though, it can be a mixed bag: the casual users I've convinced to switch to Linux have been over the moon with it; their computers "just work".
Power users, though, have an incredibly hard time, because they try to match functionality from other OSes one-for-one but don't want to rely heavily on terminals and settings files.
The problem for this last group is that the people developing the desktop are mostly users who are themselves comfortable with terminals.
In my own experience, the problems I've had with desktop Linux are mostly drivers (I spent a week learning how ALSA/PulseAudio work).
My second, and most common, problem is updates that break some functionality.
If I catch it right away, no problem, I can revert it; but if it's something I only use occasionally, then I'll spend some quality time debugging.
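Reverting usually doesn't require a reinstall, either. A hedged sketch for Arch-family distros, which keep previously installed versions in the local package cache; the package name and version below are purely hypothetical examples, so substitute your own:

```shell
# See which old versions of the broken package are still in the cache
# ('pipewire' and the version string here are hypothetical examples)
ls /var/cache/pacman/pkg/ | grep '^pipewire-'

# Reinstall the last known-good version directly from the cache
sudo pacman -U /var/cache/pacman/pkg/pipewire-1.0.3-1-x86_64.pkg.tar.zst
```

Other distro families have their own equivalents (e.g. apt and dnf both support installing a specific older version when one is still available).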
I haven't tried Linux on the desktop in years, but I'd like to explain why power users might prefer not to learn the command line: they don't want to learn, memorize, and understand the commands, because that would take time away from other things they'd rather spend it on. I'm not sure why a GUI doesn't carry the same friction, but it doesn't.
I might get lynched for this reply, but building a feature for a GUI rather than a command line is much harder and more labor-intensive, because it adds an extra layer that is very, very thin in a CLI.
We could blame the GTK vs. Qt rivalry, but I think it's more a matter of a user coding something they need and doing it in whatever way is less work and more comfortable for them.
Consider that a wide range of Linux developers prefer tiling desktops that rely only on the keyboard, not the mouse. There's even a Linux window manager called Ratpoison.
Again, if someone enjoys it or wants to do it that way, more power to them. But if someone wants a common interface for most things, without learning the specifics of each tool and its commands, they'll want a good GUI, and if one isn't available they'll end up complaining about it.
I do agree with you, in case it didn't sound like it. But the problem, IMO, is the lack of investment in GUIs (notwithstanding all the amazing work the Plasma and GNOME teams are doing).
If public entities moved their MS license money to buy Linux desktop OS support instead, that would probably solve this issue, while creating another 3 :).
This used to be me but mostly because I would experiment a little too much, never without reason.
Except for a few Arch updates over a decade ago, when they changed the default from HAL to udev, and a Gentoo setup with WAY too specific USE flags, I don't think I can remember any failure like this. I've honestly had more issues with Windows nuking itself on a major update.
Mostly using Debian and Fedora these days, and it's been smooth sailing for quite some time.
How to tell you're using Arch without saying it. Don't use a rolling release on your own if you aren't willing to pay the maintenance cost.
Edit: no, I'm not an Ubuntu user.
"my private desktop Linux installs still occasionally bork themselves for no good reason and require a reinstall"
Edit: oh, you aren't even the OP. But I see I triggered you. And you've repeated the same thing you're saying in the parallel comment? Are you going through every reply to this specific comment?
I use Ubuntu 22.04 on my laptop and EndeavourOS (which is based on Arch) on my desktop. I spend WAY more time troubleshooting random shit on my Ubuntu machine compared to EndeavourOS.
I use EndeavourOS, which is Arch-based. It has a great, very easy installer, it just worked after installing it, and it has worked ever since. Arch isn't that hard anymore.
What's the "maintenance cost"? Arch had a pretty big setup cost, mostly because I wanted to configure it to my liking, but I haven't had to do any maintenance since. My Arch server took little time to set up as well.
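For what it's worth, the "maintenance cost" on an Arch-family system usually amounts to a short routine like the one below. This is a sketch, not gospel; `pacdiff` and `paccache` come from the pacman-contrib package, so that's an assumption about your setup:

```shell
# Full system upgrade; Arch does not support partial upgrades
sudo pacman -Syu

# Merge any *.pacnew config files that updates left behind
pacdiff

# Prune old packages from the cache (keeps the 3 most recent versions)
sudo paccache -r

# List orphaned dependencies; review them before removing anything
pacman -Qtdq
```

Done every week or two, plus a glance at the Arch news feed before big upgrades, this is typically all the upkeep a rolling release asks for.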