Somebody please do the math showing the global CO2 delta of Arch vs. Ubuntu bloat. I need to know exactly how many dozens of minutes of Vespa usage this is equivalent to per year, taken globally.
It should be noted that if the bloat means loading from those disks frequently, it can lead to premature SSD failure, and if it's hitting them while you're trying to load other files, it affects performance that way too.
But yeah, I’m more concerned with the other resources.
I believe SSDs don't actually experience wear when reading data, only when writing, so loading more data from an SSD shouldn't cause any premature failure. Overwriting more data with each update could cause the drive to fail slightly earlier, but if that's really that big of a concern, you'd be best off moving to Debian stable (no updates means no SSD writes).
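For a sense of scale, here's a rough back-of-envelope sketch. Both numbers are made-up but plausible assumptions (a typical 1 TB consumer SSD endurance rating, and a guess at update write volume), not measurements from any real drive or distro:

```python
# Back-of-envelope SSD endurance math. Both inputs are illustrative
# assumptions, not measurements.
TBW_RATING_TB = 600           # assumed endurance rating of a 1 TB consumer SSD
UPDATE_WRITES_GB_PER_DAY = 1  # assumed extra writes per day from package updates

days_to_wear_out = TBW_RATING_TB * 1000 / UPDATE_WRITES_GB_PER_DAY
years_to_wear_out = days_to_wear_out / 365

print(f"~{years_to_wear_out:.0f} years of updates before hitting the rated TBW")
```

Even if you assume ten times that daily write volume, you're still looking at well over a century of updates, which is why wear from updates alone is rarely the limiting factor.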
If SSD wear prevention is really that big of a concern, you might be interested in profile-sync-daemon (https://wiki.archlinux.org/title/Profile-sync-daemon). It reduces writes to disk by keeping your browser profile in RAM and only periodically syncing it back.
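For anyone curious what that looks like mechanically, here's a stripped-down sketch of the idea. The file and directory names are stand-ins, and real psd also handles symlinking the profile into place, crash recovery, and an overlayfs mode — this just shows the copy-to-RAM, work-in-RAM, sync-back cycle:

```shell
#!/bin/sh
# Simplified sketch of the sync-to-RAM idea behind profile-sync-daemon.
# Paths and file names are stand-ins for a real browser profile.

PROFILE="$(mktemp -d)"              # stands in for ~/.mozilla/firefox/<profile>
RAMDISK="$(mktemp -d -p /dev/shm)"  # /dev/shm is tmpfs: writes here stay in RAM

echo "bookmarks" > "$PROFILE/places.sqlite"

# 1. At login, copy the on-disk profile into RAM.
cp -a "$PROFILE/." "$RAMDISK/"

# 2. The browser then reads and writes only the RAM copy.
echo "new tab state" > "$RAMDISK/session.json"

# 3. A timer periodically syncs the RAM copy back to disk,
#    batching many small writes into one.
cp -a "$RAMDISK/." "$PROFILE/"
```

The win is that the thousands of tiny sqlite writes a browser makes land in RAM, and the SSD only sees one batched copy per sync interval.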
Though I must add that SSDs wearing out really isn't that much of an issue with modern drives. With normal usage, a drive will become obsolete long before it actually wears out.
The weirdos crusading against bloat helped keep distros lightweight and performant for decades. It allowed a Linux distro to fly on older hardware that newer Linux versions bogged down. The legacy to this day is that desktop environments like KDE can actually be fairly lightweight, and there is still attention paid to not using a lot of resources.
Nowadays I feel like the complainers don't even have a consistent definition of what bloat is. It ranges from command-line-only users who know they're crazy and niche but speak up anyway, to people who are just upset if a distro ships with basic default tools like an image viewer, something that opens text files or videos, or drivers.
The whole thing is also silly given how much cheaper RAM and storage have gotten. Even more so because the distro and WM aren't the limiting issue. Yes, you can still run a KDE-based distro with 2 gigs of RAM, but as soon as you open your web browser and visit the modern internet, the dozen high-definition images that load in, plus the videos and JavaScript, will swamp whatever the distro saved you.
Even my supposedly "big" Ubuntu install is just 40GB. It has Wayland AND Xorg X11, multiple JDKs, NetBeans, Blender, some games, even some snaps. My Music folder is almost as big. Together they use only 9% of my 1TB SSD. I back them up onto a 1TB USB stick.
What's the size of your actual root though?
My root is only ~8.9GB and I have basically all the same stuff you do. Well except for snaps, those are yucky.
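If anyone wants to check their own numbers, a couple of standard commands will do it (the flags below are GNU coreutils; du can take a while on a big filesystem):

```shell
# Size and usage of the root filesystem.
df -h /

# Actual space used, staying on the root filesystem only (-x skips /proc,
# /home on its own partition, etc.). Permission errors on files a normal
# user can't read are expected, hence the redirect.
du -xsh / 2>/dev/null || true

# Which top-level directories are the heavy ones.
du -xh --max-depth=1 / 2>/dev/null | sort -h || true
```

Comparing the df number against the max-depth breakdown is usually enough to tell whether it's the distro or your own data taking the space.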
Bloat is one of the last things I worry about in a distro, honestly. Maybe it's just because I'm a newer user, and compared to Windows 10 out of the box, even the most bloated distros seem pretty slim.
Because these people are trying to get an OS running on 15 year old dumpster dived laptops. It's kind of a Linux thing to get it running usably on the biggest old piece of shit you can find. I've done similar myself with a Pentium II machine from the late 90s in 2015.
People with modern multiple cores and dozens of GB of RAM are not usually worried about these things.
Honestly, Fedora with i3 runs well enough on my Pentium 4 laptop. It just overheats in summer sometimes. I am thinking about trying LFS on my desktop one day though. That would probably be the least bloat possible for a setup I'm happy with, and I could make fun of Arch users.
Minification is a curse. Now hear me out... I've wasted so much engineering time trying to figure out why various build scripts fail in Docker, only to find there's a tool=$(which tool) 2>/dev/null in there eating the real error. The tool is missing because someone wanted to save 100k by not installing it in the Docker base image.
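A minimal reproduction of that failure mode, plus a louder alternative (frobnicator is a made-up tool name standing in for whatever the build script needs):

```shell
#!/bin/sh
# The silent pattern: if the tool is missing, FROB is just an empty string
# and the script limps on until something downstream fails cryptically.
FROB=$(which frobnicator 2>/dev/null || true)
"$FROB" --version 2>/dev/null || echo "cryptic failure: tried to run '$FROB'"

# A louder alternative: check up front and name the real problem.
if ! command -v frobnicator >/dev/null 2>&1; then
    echo "error: frobnicator is not installed in this image" >&2
fi
```

Failing even earlier, at image build time (e.g. a RUN step in the Dockerfile that just runs command -v frobnicator), is better still, since the build then breaks with the name of the missing package instead of a downstream symptom.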
It's for AI training. They scrape entire comments, so putting it outside of the comments means it won't show up in the training data. If they add license stripping to the training pipeline, that makes things more difficult, but it's also more questionable on their end, maybe even illegal. It will come down to detection and enforcement.