Enough with the fan wars. Let's be perfectly honest for once. Windows, Linux, MacOS - they all suck. Sometimes in similar ways, sometimes in different ways. But they all suck.
Windows users - I get you, you use it because it sorta works 40% of the time and sucks in the way you understand.
Linux users - I get you, you know all the arcane incantations you need to quickly install, update, and troubleshoot your OS in a terminal window. It works - once you run the custom bash script that applies every change you need to get everything exactly how you like it. But again, it sucks in the way you understand.
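To make that concrete, here's a minimal sketch of the kind of post-install script being described - the package names and tweaks are just placeholders, and it assumes a Debian/Ubuntu-based distro running GNOME:

```bash
#!/usr/bin/env bash
# Hypothetical post-install script of the kind described above.
# Package names and settings are placeholders, not recommendations.
set -euo pipefail

sudo apt update && sudo apt full-upgrade -y   # assumes a Debian/Ubuntu-based distro
sudo apt install -y git vim htop              # whatever personal tool picks you have

# The "get everything exactly how you like it" part:
gsettings set org.gnome.desktop.interface enable-animations false   # assumes GNOME
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
```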
MacOS users - well I don't really get you. You know what you've done.
We deserve better than this, guys. We deserve an OS that just works, is easy to use, easy to configure, doesn't require an IT degree to use, and that we can recommend to our grandma without a second thought.
I've worked exclusively with Linux servers since 2002 and used Linux exclusively on the desktop since 2004, and I've come to the point where I pretty much refuse to touch Windows for fear it will infect me somehow.
I know most people don't know any better but it's insanity to me that anyone still pays money for windows. It's a scam, no other words for it.
Don't even get me started on Windows servers. It's just sad to see how much money is spent on a company that has so little focus on quality.
Even the online services suck. Dear God Microsoft, would it kill you to understand that people might have *gasp* TWO tabs open with your Teams "app"?
Windows requirements: a sprawling list of unsupported hardware based on an arbitrary requirement for a security chip that doesn't actually improve security at all.
I like Linux a lot, but saying you can't understand why someone would run Windows on a server just shows a lack of knowledge. Linux is great in a lot of server applications in the application realm. However, it doesn't get close to the power of Active Directory and Group Policy for Windows device management. Besides that, a lot of people are more comfortable with a UI for managing DHCP, DNS, etc. in an SMB environment. Even if they prefer a command line for those tools, PowerShell allows those people to coexist with those who prefer a GUI. Under certain circumstances (mainly ones where a business is forgoing AD for AAD), Linux can be the right choice. Pretending that there's no place for Windows Server, though, is asinine.
There's this thing I notice. If Windows asks you to learn something or put up with some BS, it's seen as the cost of doing business, reasonable, or simply not even noticed. If Linux requires you to learn something, like reading one article about which distro might work best for you, it's seen as an insurmountable difficulty or an absurd ask.
I upgraded my Intel system to AMD today. And I didn't have to reinstall a damn thing, because my existing Linux installation Just Worked™. It really is to the point that I could never imagine going back to Windows.
I was flirting with Linux for 20 years. There was always something that put me off and I went back to Windows. Recently I installed Ubuntu with KDE Plasma and I'm not going back. It just works and is heaps faster on older hardware. The old driver issues are gone, compatibility is awesome. The only issue is getting used to new software names.
You know, I've been using Linux on desktops and laptops for like 20 years now. I can count on one hand the number of times I've had hardware support issues. Outside of a fingerprint scanner, I've been able to solve all of those issues.
Meanwhile, my adventures across the years dealing with Windows drivers led me to finally say "fuck it" earlier this year and nuke the Windows install on my gaming rig in favor of Nobara.
I'll take Linux hardware support over Microsoft any day of the week.
Linux does support more CPU architectures (x86, Arm, PowerPC, RISC-V), while Windows only supports x86 and some Arm CPUs, so technically Linux supports more CPUs. But Windows does support more GPUs and plug-and-play devices (controllers, external sound cards...).
They have a point. I'm in the market for a new laptop and I have, so far, returned two of them.
First, I tried a Huawei Matebook 16. I was foolish, but I thought it was "easy". No NVidia, no dGPU at all - just parts that looked very standard. That was based on the info I had gathered from a few years of Linux usage: "Basically avoid NVidia and you're good". It was anything but. Broken suspend, WiFi was horrible, random deadlocks, extreme slowness at times (as if the Ryzen 7 wasn't Ryzen 7-ing) that made it less smooth than my 5-year-old Intel laptop, and a broken audio codec (Senary Audio) that didn't work at all on the live image and worked erratically on the installed system using generic hd-audio drivers.
I had a ~€1500 budget, but I raised it to buy a €1700 ThinkPad P16s AMD. No dGPU to speak of, sold with Linux preloaded, boasting Canonical and Red Hat hardware certifications.
I had:
Broken standby on Linux
GPU bugs and screen flickering on Linux
Various hangs and crashes
Malfunctioning WiFi and a non-working 6E mode. I dug, and apparently the soldered WiFi adapter does not have any kind of Linux support at all, but the kernel uses a quirk to load the firmware of an older Qualcomm card that's kinda similar onto it and gets it to work in WiFi 6 compatibility mode.
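For the curious, here's roughly how you'd check which driver and firmware the kernel actually binds to a WiFi adapter like that - note that "ath11k" is only a guess at the Qualcomm driver name, and the firmware path may differ on your system:

```bash
# Rough sketch of checking which driver/firmware the kernel binds to a WiFi adapter.
# "ath11k" is only a guess at the Qualcomm driver involved; go by what lspci reports.
lspci -nnk | grep -iA3 network               # adapter model plus the kernel driver in use
sudo dmesg | grep -iE 'ath1[12]k|firmware'   # firmware load and quirk messages, if any
ls /lib/firmware/ath11k/ 2>/dev/null         # firmware blobs from linux-firmware (path is a guess)
```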
Boggles my mind that the 2 biggest enterprise Linux vendors took this laptop, ran a "thorough hardware certification process" on it, and let it pass. Is this a pass? How long did they try it? Did they even try suspending?
Of course, that was a return. But when I think about new laptops and Windows 11, basically anything works. You don't have to pay attention to anything: suspend will work, WiFi will work, audio and speakers as well, if you need fractional scaling you aren't in for a world of pain, and if you want an NVidia dGPU, it does work.
Furthermore, the Windows 11 compatible CPU list is completely arbitrary, since you can still sideload Windows 11 on "unsupported" hardware and it will run with a far higher success rate than Linux on a random laptop you buy in a store now. Like, it has been confirmed to run well on ancient Intel CPUs with screens below the minimum resolution. It's basically a skin over 10 and there are no significant kernel modifications.
To be clear: I don't like Windows, but I hate this post as a consumer of bleeding-edge hardware because it sweeps the problem under the rug - most new hardware is Windows-centric, and Linux-supported options are few and far between. Nowadays not even the manufacturer declaring Linux support is enough. This friend of mine got a Dell XPS 13 Plus Developer Edition, and if he uses ANY ISO except the default Dell-customized Ubuntu 20.04, audio doesn't work at all! And my other friend with a Dell XPS 13 Developer Edition gets various GPU artifacts on the screen on anything except the corresponding Dell-customized Ubuntu 20.04 image. It's such a minefield.
I have effectively added €500 to my budget, to now reach an outrageous €2000 for a premium Linux laptop with no significant trade-offs (mostly, I want a good screen and good performance). I am considering taking a shot in the dark and pre-ordering the Framework 16, moving away from traditional laptop makers entirely and hoping that a fully customizable laptop from a company long committed to Linux support will be different.
To be fair, Nvidia support on Linux has historically been quite poor, with users having to manually install drivers (something the average person shouldn't have to think about). Though even that has gotten much better recently, with Debian now allowing proprietary drivers and firmware in its official images.
More important IMO is the fact that Linux re-detects hardware on every boot! Try moving a Windows hard drive to completely new hardware and getting it to boot. Not a chance...
Active Directory and its integration with services such as DNS and DHCP is pretty great though. I wish Microsoft focused less on cloud and improved the user (or rather admin) experience of their server tools; they are quite awful in some cases.
I know hardware compatibility has massively improved, but back when I was messing with Linux in high school compatibility was a huge issue.
I managed to end up with two laptops and some desktop hardware that were truly difficult to get running. It's like I somehow found a list of incompatible hardware and chose the worst options.
The most frustrating were an evil Broadcom (I think) wireless card and an AMD switchable graphics card (they did actually make a few).
That graphics card wasn't supported for very long and was a bother even in Windows.
Edit to add: I was just saying that to point out why some people might have that opinion, even if it isn't valid anymore. I'm actually thinking of jumping back on the Linux bandwagon.
People say that Nvidia just doesn't work right on Linux. I'd never know that except for everyone saying it. My desktop has Nvidia and every Linux distro I've tried on it has been perfectly fine. Yes, for gaming too.
The only real hardware problems I come across these days with Linux is WiFi cards being shit. As far as I'm concerned, carefully selecting hardware is a problem for the *BSDs at this point. Am I missing something?
I've used Linux on my private laptop for the past few years, never had any major issues. Work desktop is running Ubuntu, no major problems except for the odd bit of poorly maintained software (niche science things, so that's not really a Linux issue). Laptop breaks, I get a Windows 11 laptop from work...and I've had so many problems. Updates keep breaking everything, and I've had to do a factory reset more than once since the recovery after those updates also always failed. Wish I had my good old Linux laptop back :(
Windows 11 and its goddamn picky-ass CPU requirements... What the actual fuck, Microsoft? Did someone over there drink a tall glass of stupid juice and think, "Hey, let's royally piss off a chunk of our user base just because we can?" This is tech elitism at its absolute shittiest.
It’s like Microsoft's throwing a party, and instead of a guest list, they've got some half-baked, cockamamie CPU blacklist. "Oh, you're rocking a perfectly functional CPU from a few years ago? Tough titties! Go fuck yourself with a USB stick!"
This isn't progress; it’s goddamn techno-discrimination. It's like being invited to a buffet and then being told you can only eat if your fork is from the latest silverware collection. I mean, who's making these decisions over there? A drunk leprechaun playing darts with a list of CPUs?
Look, I get wanting to advance, to push the boundaries of what's possible. But this? It's like serving someone a gourmet meal and then punching them in the gut for not having the right kind of fucking taste buds.
Windows 11, with its bizarre-ass CPU criteria, is a masterclass in how to cock up a product launch. Dear Microsoft, next time you decide to drop a steaming turd of a decision on your users, at least have the decency to hand out some goddamn air fresheners, because this shit STINKS.
Isn't the CPU requirement solely down to a new feature Windows 11 was going to use, and you can just use Windows 10 while it's still in support? Plus Windows 10 knows this and won't even try to update your PC to Windows 11?
It's not a really strong argument when most hardware drivers are made with Windows in mind first, and maybe someone will write a Linux driver if they're interested. I mean, Linux went for years having to do some hack-and-slash solution for Broadcom drivers until they were finally added in. That affected at least 2 laptops in my lifetime.
I will say that currently, I think Linux is in a good spot. But you can't just pretend the issue absolutely doesn't exist because your specific setup works.
I have found Linux to have excellent HW support for all older hardware. Only notable exception is fingerprint readers. Granted, it's been years since I tried gaming.
I officially switched my desktop and server to Linux. If I could switch my work computer I would. I bought a MacBook Air recently because I didn’t know Linux laptops were getting so popular. But I like the Mac and can still do some Linux like stuff in the terminal.
I'd love to switch to Linux. I've used Linux off and on for almost two decades now. At one point I was triple booting Windows XP, Windows 7, and Fedora. The one thing holding me back is, strangely enough, game compatibility. I know Proton has made huge strides as I've seen it first hand on the Steam Deck, a lovely little machine. The problem is, I have a huge library, and while I'm okay with slightly less than ideal performance here and there on the Deck (40hz mode anyone?), I absolutely refuse to lose any performance due to running Linux. Benchmarks still show some titles losing 5-15% performance when running through Proton.
Don't get me wrong. I love FOSS. I donate and try to spread the word as much as I can when I find a passion project, and find it particularly useful. Even though this may seem to go against what I previously said, I'm debating on switching to Linux when Windows 10 loses support. I do not want to enable fTPM on my motherboard or update my BIOS if I don't have to. My PC is stable, no thank you. I feel like I'll have to troubleshoot whether I choose Linux or Windows 11. Ugh.
There aren't a lot of things stupid people can say that would genuinely frustrate me, but when you make uneducated, fact-free statements and then decide to fanboy about something in the same sentence, that genuinely frustrates me.
I try using Linux on my desktop PC from time to time. Whenever I buy a new rig, I try Linux, since I want to reinstall the system anyway. It never worked. I always tried with brand new hardware -> something is not properly supported -> install current Windows. Rinse and repeat every 4 or 5 years whenever I get my hands on a new desktop or laptop. That never changed for the last 20 years.
I'll probably transition my AMD 8350 build over to Linux when Win10 stops being supported. As opposed to my mom's FX-8370 build, which I'll probably just have to replace with a new Windows 11 system, as there's no way I'm expecting her (an elderly woman) to learn anything other than Windows. Especially since she's reliant on Windows-only apps.
The actual hardware she's using will probably be converted to a Linux Desktop, but I'll have to migrate her data to a new mini Windows 11 PC or something.
Linux is just all around snappier for me than Windows is. I never have to wait, but on Windows there are always delays opening windows, and for some reason it will keep trying to generate thumbnails.
I really hate using Windows. I'm a worse worker because of it. I'm just waiting for the M3 Macs to switch.
Sadly, my work stuff does not work on Linux. So I have a second computer for most of my work.
Well, I did have the issue of horrible range with my Qualcomm WiFi drivers under Linux, leading to shitty WiFi range overall.
E.g. with the laptop just below the WiFi router, it only shows 90% signal.
I tried to daily Linux on my laptop but gave up because it didn't support the fingerprint reader or the speakers. Windows 11 drains the battery faster and feels sluggish more often.
Been meaning to transition to a distro with a focus on being pretty low maintenance yet not too top-heavy, leaving Windows to a VM on my Proxmox server. Haven't gotten around to it yet, since I'd need to get the server a dedicated graphics card.
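For what it's worth, the passthrough side of that plan looks roughly like this on Proxmox - a hypothetical sketch, with the VM id and PCI address as placeholders:

```bash
# Hypothetical sketch of GPU passthrough for a Windows VM on Proxmox (run as root
# on the host). The VM id (100) and PCI address (01:00.0) are placeholders.
# 1. Enable IOMMU on the kernel command line, e.g. in /etc/default/grub:
#      GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on"   # amd_iommu=on for AMD
#    then: update-grub && reboot
# 2. Make sure the VFIO modules load so the host doesn't grab the card:
printf 'vfio\nvfio_iommu_type1\nvfio_pci\n' >> /etc/modules
update-initramfs -u
# 3. Hand the GPU to the VM (q35 machine type recommended when using pcie=1):
qm set 100 --hostpci0 0000:01:00.0,pcie=1
```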
Could just use WINE, I suppose, but I'm assuming it's still as rough as it was last time I messed with it twelve years ago.
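For reference, the basic WINE flow these days is roughly the following - the installer name and app path are placeholders:

```bash
# Rough sketch of running a Windows program under WINE today;
# "setup.exe" and the app path are placeholders.
sudo apt install wine                      # or your distro's equivalent package
export WINEPREFIX="$HOME/.wine-myapp"      # optional: keep the app in its own prefix
winecfg                                    # one-time setup: Windows version, drives, etc.
wine ./setup.exe                           # run the Windows installer
wine "$WINEPREFIX/drive_c/Program Files/MyApp/myapp.exe"   # launch the installed app
```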
I use Windows for work and gaming, MacOS for app development (mostly because I can code for iOS and Android in one environment), and ChromeOS for my daily browsing.
I just enjoy how Chrome always works when I need to just browse the internet or buy something online without issue.
Linux will never be anywhere close to plug and play for anything in the way Windows is, whether we're talking games, applications, AD, etc... At least not for a very, very long time. Windows has about 40 years of development and is tried and true by the masses worldwide. You don't have to be a master level 1337 h4xor to do anything in Windows, while you can't do about 70% of what you can do on Windows with Linux without being an advanced power user.
Linux is great for some stuff, but unless there are massive upgrades to where you can just hit "install" and something installs and works without fucking around in the terminal, it will never see widespread adoption. Hell, half of my users can't even figure out how to use a goddamn Mac, and that's much more user friendly than any Linux distro. You guys are delusional if you think otherwise.
Also, I've yet to see a single Linux distro that is aesthetically pleasing on anywhere near the level of OSX or Windows 11... Or Windows 10... Or hell, 7, 8, and Vista lmao. Looks like a potato OS that was mocked up for some shitty low-budget SyFy channel movie. Every single distro I've ever seen. Even the ones that supposedly are "so nice looking bro I swear it looks better than 11 bro please why isn't anyone switching to Linux don't you guys want to learn a programming language to play games seriously bro it's so easy it just works bro broooo."