Also, games went from writing the most cleverly optimized code you've ever seen to squeeze every last drop of compute out of a 6502 CPU, all while fitting on a ROM cartridge, to not giving a single shit about any sort of efficiency, blowing up the install size with unused and duplicated assets, and literally making fun of anyone without the latest, highest-end computer for being poor.
Ah, back when game development was managed by game developers who were gamers themselves and prioritized quality over min-maxing shareholder profits...
Or another way to look at it: it was the market-takeover phase of capitalism, where capitalists are willing to operate at a loss to corner the market and create their own monopolies (see Nintendo, Google, Facebook, Amazon, etc.). But once market growth stalls out, they switch to the milking phase, or enshittification phase, of capitalism, where they prioritize profits over everything else.
I know it's a one-of-a-kind game, but it still amazes me that RollerCoaster Tycoon was released in 1999, a game where you could have hundreds of NPCs on screen at a time, unique events and sound effects for each of those NPCs, physics simulations of roller coasters and rides, terrain manipulation, and it was all runnable on pretty basic hardware at the time. Today's AAA games could never. I'm glad some indie games are still carrying the torch for small, efficient games that people can play on any hardware though.
It's a different world now though. I could go into detail of the differences, but suffice to say you cannot compare them.
Having said that, Windows lately seems to just be slow on very modern systems for no reason I can ascertain.
I swapped back to Linux as my primary OS a few weeks ago and it's just so snappy in terms of UI responsiveness. It's not better in every way. But for sure I never sit waiting for Windows to decide to show me the context menu for an item in Explorer.
Anyway, in short, the main reason for the difference between old and new computer systems is the necessary abstraction.
That's complete nonsense I'm afraid. While abstractions are necessary, the bloat of modern software absolutely isn't. A lot of the bloat isn't fundamental, but a result of things growing through accretion, and people papering over legacy designs instead of starting fresh.
The selection pressures of the industry do not favor efficiency. Software developers are able to write inefficient software and rely on hardware getting faster. Meanwhile, hardware manufacturers benefit from bloated software because it creates demand for new hardware.
Phones are a perfect example of this in action. Most of the essential apps on the phone haven't changed in any significant way in over a decade. Yet, they continue getting less and less performant without any visible benefit for the user. Imagine if instead, hardware stayed the same and people focused on optimizing software to be more efficient over the past decade.
Except it's not nonsense. I've worked in development through both eras. You need to develop in an abstracted way because there are so many variations on hardware to deal with.
There is bloat, for sure. A lot of it is because it's usually much better to use an existing library than reinvent the wheel, and the library needs to cover many more use cases than your own. I ran into this myself: I used an HTTP library to work with releases on Forgejo, had it working generally, but then saw there was a dedicated library for it. The boilerplate to make that library work was more than what I wrote to just make the web requests directly.
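To give a rough idea of what the direct approach looked like, here's a minimal sketch, assuming Forgejo's Gitea-compatible REST API; the host, owner, repo, and token are hypothetical placeholders, not my actual setup:

```python
# Minimal sketch: list releases via Forgejo's Gitea-compatible REST API.
# Host, token, owner, and repo below are hypothetical placeholders.
import requests

BASE = "https://forgejo.example.com/api/v1"
TOKEN = "your-access-token"  # hypothetical; substitute a real token

def list_releases(owner: str, repo: str) -> list[dict]:
    """Return the releases of a repository as parsed JSON."""
    resp = requests.get(
        f"{BASE}/repos/{owner}/{repo}/releases",
        headers={"Authorization": f"token {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for release in list_releases("someone", "some-project"):
        print(release["tag_name"], release["name"])
```

A couple of calls like that covered everything I needed, which is why the dedicated client library's setup felt like more work than it saved.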
But that's mostly size. The bloat in terms of speed is mostly in the operating system and hardware abstraction, I think, not in libraries by and large.
I'm also going to say that papering over legacy systems doesn't always make things slower. Where I work, I've worked on our legacy system for decades, and on the current product for probably the past 5-10 years. We still sell both. The legacy system is not the slower one.
Completely unconnected to OP, but oh fuck do I hate that Microsoft Excel couldn't open two documents side by side before like 2017. They all opened in one instance of the app unless you launched another as admin, and it even screamed at you that it couldn't open two files with the same name. W?T?F?
With 16 GB of RAM and 102% CPU, the computer shows you a UI on any underlying hardware, any monitor/TV/whatever, handles a mouse, keyboard, and sound, handles any hardware interrupt, probably fetches and sends stuff over the internet, and scans your disk to index files so you can search almost instantly through gigabytes of storage, whether it's USB sticks, SSDs, hard drives, or NVMe drives. And probably a lot of other stuff I'm forgetting.
Meanwhile, the other thingy with 4 KB of RAM did college math problems. Impressive for the time, yes, but that's it.
Yes, nowadays there is a lot of inefficiency, but that comparison does not, and never did, make sense.
You are being practical. I would say a fair amount of RAM for achieving all those tasks is 512 MB. I just checked my Gentoo box with XFCE and the Bluetooth & PulseAudio crap running, no tuning: merely 700 MB of RAM in use.
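If anyone wants to reproduce that kind of measurement without eyeballing a system monitor, here's a minimal sketch, assuming the third-party psutil package is installed; the numbers will obviously differ per machine:

```python
# Minimal sketch: report overall memory use, roughly what `free` shows.
# Assumes psutil is installed (pip install psutil).
import psutil

mem = psutil.virtual_memory()
print(f"total:     {mem.total / 2**20:.0f} MiB")
print(f"used:      {mem.used / 2**20:.0f} MiB")
print(f"available: {mem.available / 2**20:.0f} MiB")
```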
Sure, then you can start LibreOffice Calc and go up to around 1 GB of RAM, and still close to 0% CPU anyway.
My point wasn't about exact numbers, because obviously the ones in the image are made up, unless that Excel file is a monster of macros, VBA scripts, and connections to numerous data lakes.
We had most of this with Windows 7 and probably XP as well. Those used a fraction of the RAM, disk space, and CPU time for largely the same effect as today.