It's interesting how much technology has slowed down. Back in the 80s and 90s, a 5-year-old game looked horribly outdated. Now we're getting close to some 20-year-old games still looking pretty decent.
Technology has slowed down, but there are also diminishing returns in what you can do with a game's graphics, etc.
The original Halo ran at 480p on the Xbox. 4K UHD has 27 times as many pixels as that. By comparison, the pixel-count increase from the NES to Halo was only about 5.35 times.
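Rough numbers, for anyone curious (assuming the NES's visible NTSC picture of 256×224 and 640×480 for the Xbox):

```python
# Back-of-the-envelope pixel counts; NES visible area assumed 256x224 (NTSC).
nes = 256 * 224          # ~57K pixels
xbox_480p = 640 * 480    # ~307K pixels
uhd_4k = 3840 * 2160     # ~8.3M pixels

print(f"NES -> Halo 480p:    {xbox_480p / nes:.2f}x the pixels")    # ~5.36x
print(f"Halo 480p -> 4K UHD: {uhd_4k / xbox_480p:.0f}x the pixels") # 27x
```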
PC games nowadays often run smoothly at hundreds of frames per second, but the difference between 21 and 30 FPS is far more noticeable than the difference between 231 and 240 FPS. (Looking at you, OoT.)
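The frame-time arithmetic shows why the low-end jump matters so much more:

```python
# Milliseconds per frame: the same 9 FPS jump saves ~14 ms per frame
# at the low end but only ~0.16 ms at the high end.
def frame_ms(fps):
    return 1000.0 / fps

print(f"21 -> 30 FPS:   {frame_ms(21) - frame_ms(30):.2f} ms saved per frame")
print(f"231 -> 240 FPS: {frame_ms(231) - frame_ms(240):.2f} ms saved per frame")
```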
Render distances are much larger with less obvious compromise on LoD.
Stuff like ray-tracing is of some graphical benefit but is hugely computationally taxing, and there's nothing you can do about that. It's just more diminishing returns.
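To put rough numbers on "hugely taxing": the ray count scales with resolution × frame rate × samples × bounces. This is just an illustrative estimate with assumed parameters, not a measurement of any real engine:

```python
# Crude ray-budget estimate; samples_per_pixel and bounces are assumptions.
pixels = 3840 * 2160      # 4K frame
fps = 60
samples_per_pixel = 1     # a noisy minimum, usually needing heavy denoising
bounces = 2               # primary hit plus indirect bounces

rays_per_second = pixels * fps * samples_per_pixel * (1 + bounces)
print(f"~{rays_per_second / 1e9:.1f} billion rays per second")  # ~1.5 billion
```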
Physics engines are much more complex.
At some point, a limiting factor just becomes art direction and budget. You can have all the fancy techniques you want, but you still need to make detailed textures, animations, etc.
The number of polygons starts to hit a ceiling too, where the model looks basically continuous to the human eye, so adding more polys might only help very subtly.
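You can see that ceiling with simple arithmetic: once triangles shrink to around a pixel on screen, extra ones stop being visible. A rough illustration, assuming the model fills a quarter of a 4K frame:

```python
# Average on-screen pixels per triangle; the 25% coverage is an assumption.
screen_pixels = 3840 * 2160
covered = screen_pixels * 0.25

for tris in (10_000, 100_000, 1_000_000, 10_000_000):
    print(f"{tris:>10,} tris -> ~{covered / tris:.1f} px per triangle")
# Past ~1M triangles here, each one is already down near a pixel or two.
```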
Color depth is basically a solved problem now too compared to going from the NES to the Xbox.
You can think of it like sampling audio. If I have a bit depth of 1 and I upgrade that to 16, it's going to sound a hell of a lot more like an improvement than if I were to upgrade from 48 to 64.
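In dynamic-range terms (linear PCM gives roughly 6.02 dB per bit, and human hearing spans somewhere around 120 dB):

```python
# Approximate dynamic range of linear PCM: ~6.02 dB per bit of depth.
for bits in (1, 16, 48, 64):
    print(f"{bits:>2}-bit: ~{6.02 * bits:.0f} dB of dynamic range")
# 1-bit is barely more than on/off; 16-bit already brushes the limits of
# hearing; 48- and 64-bit are far beyond anything an ear could resolve.
```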
I think something worth noting about older games, too, is that they didn't try to deal with many of their limitations head on. In fact, many actually took advantage of their limitations to give the feeling of doing more than they actually were. For example, pixel-perfect rendering versus a CRT: many 8-bit and 16-bit games were designed specifically for televisions and monitors that would create the effect of having more complexity than the hardware was actually capable of. There were other tricks too, like clever level layouts that limited draw distance, or that turned the limit into a functional aspect of the game.
Those technical limitations seem largely resolved by current hardware, whereas previously things were made to look and feel better than the hardware allowed through clever planning and engineering.
Oh, absolutely this. I think the YouTube channel GameHut is a great example of the lengths devs went to to get things working. In Ratchet & Clank 3, Insomniac borrowed memory from the PS2's second controller port to use for other things during single-player (PS2 devs did so much crazy shit that within the PCSX2 project, we often joke about how they "huffed glue"). The channel Retro Game Mechanics Explained and the book "Racing the Beam" have great explanations of the lengths Atari devs had to go to just to do anything interesting with the system. Even into the seventh generation of consoles, the Hedgehog Engine baked precomputed light sources into textures to trick your brain.
Heeeyyy buddy, wassup? Didn't expect to find you around here! And yeah, Ratchet also has some ass-backwards stuff with the way it tries to force 60 FPS all the time, which ironically made it run worse in PCSX2 for the longest time, till more accurate timings for the EE were found.
Oh shit, hey Beard. I didn't expect to see you here either. For that matter I didn't think anyone else surrounding the project used Lemmy. Cool to know I'm not alone.
Hell yeah! I think Kam might be around here somewhere but not a hundred percent on that.
Ofc, Ratchet is a good example. But we all know the real insanity is Marvel Nemesis xD
At some point, a limiting factor just becomes art direction and budget. You can have all the fancy techniques you want, but you still need to make detailed textures, animations, etc.
Very possibly generative AI will alleviate this, although it has yet to produce convincing 3D models or animations.
It’s interesting how much technology has slowed down.
We haven't slowed down. We simply aren't noticing the degrees of progress, because they're increasingly below our scale of discernment. Going from 8-bit to 64-bit is more visually arresting than going from 1024-bit to 4096-bit. Moving the rendered horizon back another inch is less noticeable each time you do it, while the processing power needed grows with r² to handle all those extra assets.
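That r² point in numbers: if the work scales with the area of the visible disc (a simplifying assumption that ignores culling, LoD, and everything else a real engine does), doubling the draw distance quadruples the load every time:

```python
import math

# Assumed model: work proportional to the visible area, pi * r^2.
def relative_load(draw_distance):
    return math.pi * draw_distance ** 2

base = relative_load(100)
for r in (100, 200, 400, 800):
    print(f"draw distance {r:>3}: {relative_load(r) / base:>4.0f}x the work")
# Each doubling of the horizon costs 4x, for an ever-smaller visual payoff.
```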
Now we're getting close to some 20-year-old games still looking pretty decent.
The classic games look good because the art is polished and the direction is skilled. Go back and watch the original Star Wars movie and it's going to be more visually striking than the latest Zack Snyder film. Not because movie graphics haven't improved in 40 years, but because Lucas was very good at his job while Snyder isn't.
But then compare Avatar: The Way of Water to Tron. Huge improvements, in large part because Tron was trying to get outside the bounds of what was technically possible long before it was practical, while Avatar is taking computer-generated graphics to their limit at a much later stage in their development.
Yeah, it's like F1 racing: you quickly hit 99% of your minimum lap time, but then it takes millions of dollars of R&D for each further second cut after that.
Last time I was amazed with graphical progress was with Unreal in 1998. And probably just because I hadn't played Quake 2.
From then on until now it's just been a steady and normal increase in expected quality.
Doom 3 might have come close (and damn, that leaked Alpha was impressive) but by the time it was released it looked just slightly better than everything else.
Hmm, I think GTA 3, as an engine / open-world environment, was a whoa moment for me. Then Modern Warfare, of course. Recently, God of War and Assassin's Creed Odyssey's rendition of Ancient Greece are quite spectacular.