"It’s a first-person, single-player game, you don’t necessarily need that 60 frames."
Absolutely bizarre that a first-party title doesn't seem optimized for the console they're developing for. It makes me skeptical that the PC version will be optimized either.
Targeting 30, 60, or whatever fps is (or at least should be) a decision made very early in development. It's only a case of poor optimization if the game doesn't reach the target they've set.
I don't like it either, but an Unreal 5 game running at 30 fps (if that lol) on current gen is the norm.
In the interview they said they show the game as it currently is and focus on that part of development. They said combat hadn't been worked on yet when they first showed the game, and it now looks pretty reactive. They're going to focus on sound next and performance last, and when they said 30 it sounded like "a solid 30 is the bare minimum." Given the feedback, there's a chance they'll try to incorporate 60 fps now.
While it's a design decision, UE is also fairly scalable in general, assuming the game isn't entirely reliant on Lumen, Nanite, and virtual shadow maps.
Either way, they need to learn from previous 30 FPS launches and communicate better. Saying the game doesn't need 60 is dismissive to the large audience of gamers who don't like trading frames for image quality.
No wonder consoles are just not as appealing anymore.
We used to get systems that were purposefully designed to only play games, but to do that phenomenally well. That shit absolutely defined an entire generation of gaming.
Now we get a crippled PC with Doritos ads on the dashboard.
Eh... Consoles used to be horribly crippled compared to a dedicated gaming PC of the same era, but people were more lenient about it because TVs were low-res and the hardware was vastly cheaper. Do you remember Perfect Dark multiplayer on N64, for instance? I do, and it was a slideshow -- that didn't stop the game from being lauded as the apex of console shooters at the time. I remember Xbox 360 flagship titles upscaling from sub-720p resolutions in order to maintain a consistent 30fps.
The console model has always been cheap hardware masked by lenient output resolutions and a less discerning player base. Only in the era of 4K televisions and ubiquitous crossplay with PC has that become a problem.
It's playable and you can enjoy the game, but 30FPS is embarrassing. It makes me feel like I'm a kid playing on a PC assembled out of old leftover components. That was tolerable when I was a cashless kid playing pirated games on hand-me-down frankenPCs, but it feels wrong when playing a purchased game on the hardware it was built for.
I'd say 60+fps is especially necessary for first-person games. I seriously have trouble making out objects and other details when looking around in first person at 30fps.
That's just a roundabout way of saying that the Xbox isn't powerful enough to run it at anything beyond that.
There's no way they can't just lower the resolution and apply upscaling like every other game that has quality and performance modes. They're intentionally locking it to 30 for some bizarre reason.
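For what it's worth, the usual "performance mode" approach is just dynamic resolution scaling plus an upscaler: drop the internal render resolution when the GPU misses its frame budget, then upscale to the output resolution. A minimal sketch of the idea below; the names and thresholds are my own illustrative assumptions, not any engine's actual API.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Pick an internal resolution scale that keeps the GPU inside a 60 fps budget
// (16.6 ms). GPU cost grows roughly with pixel count, i.e. with scale squared,
// hence the square root. A real engine smooths this over many frames.
float choose_resolution_scale(float gpu_ms_last_frame, float current_scale) {
    constexpr float BUDGET_MS = 16.6f;
    float adjusted = current_scale * std::sqrt(BUDGET_MS / std::max(gpu_ms_last_frame, 1.0f));
    return std::clamp(adjusted, 0.5f, 1.0f);  // never render below 50% of output res
}

int main() {
    // A frame that took 25 ms, i.e. it blew the 60 fps budget at full resolution.
    float scale = choose_resolution_scale(25.0f, 1.0f);
    std::printf("render at %.0f%% resolution, upscale the rest\n", scale * 100.0f);
    return 0;
}
```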
"It's 4K in the X. It's 1440 on the S. We do lock it at 30, because we want that fidelity, we want all that stuff. We don't want to sacrifice any of it."
I hope it's not for the same reason Bethesda locked their framerates: their games' entire physics and other systems would break when you unlocked it. I assume it's not, since it's only locked on Xbox, which would just mean the console is weak.
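To illustrate the framerate-tied-physics point: when a simulation advances by a hard-coded per-frame step, it only behaves correctly at the frame rate that step was tuned for. A minimal sketch of my own, not any engine's actual code:

```cpp
// Why hard-tying physics to the frame rate breaks when the cap is lifted.
#include <cstdio>

struct Body { float position = 0.0f; float velocity = 0.0f; };

// Framerate-dependent update: a 1/30 s step is baked in, so rendering at
// 60 fps makes everything fall twice as fast in real time.
void step_locked(Body& b) {
    constexpr float STEP = 1.0f / 30.0f;
    b.velocity += -9.8f * STEP;
    b.position += b.velocity * STEP;
}

// Framerate-independent update: integrate over the real elapsed frame time,
// so the simulation behaves the same at 30, 60, or 120 fps.
void step_unlocked(Body& b, float dt_seconds) {
    b.velocity += -9.8f * dt_seconds;
    b.position += b.velocity * dt_seconds;
}

int main() {
    Body locked, unlocked;
    // Simulate one real second while the game renders at 60 fps.
    for (int i = 0; i < 60; ++i) {
        step_locked(locked);                    // advances two simulated seconds' worth
        step_unlocked(unlocked, 1.0f / 60.0f);  // advances exactly one second
    }
    std::printf("locked: %.2f m  unlocked: %.2f m\n", locked.position, unlocked.position);
    return 0;
}
```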
I mean... 30fps has been the single-player console experience for as long as I can remember. (Except for the PS4/Xbox One-native games -- seemingly this entire generation -- which get 60fps on current gen.)
Yes, PC can do 60fps+ if your rig is beefy enough. Yay.
Console wars bullshit is insufferable. Even when PC is one of the consoles.
Yeah, but on PC you usually get graphics settings you can tune to whatever you like. I'd personally rather have a slightly worse-looking game running at 60+fps than a beautiful one at 30.
Just because 30FPS has been a standard on consoles for so long doesn't mean it should stop there.
There's no reason not to advance when they have the opportunity to do so; the entire gaming industry benefits from it.
The Xbox is just not capable of handling the game at higher framerates. That has nothing to do with console wars or whatever; it's just a limitation of the hardware, which is an underwhelming console in general.
I'm aware that the PS5 is low on "exclusives". A big part of the reason I got it was for simple things like being able to run old PS4 games at higher framerates.
We're past the point of diminishing returns on visuals for games; that's not to say games can't look ugly, but with decent art direction, the capabilities of current consoles are more than enough. That's why Nintendo was still able to sell Tears of the Kingdom for $60.
True - I think I meant it more as a generational issue; many people haven't upgraded to either current-gen console yet because they don't technically need to. PS5 might have few exclusives, but Xbox has basically none. (Many of their heavy hitters like Sea of Thieves still run on Xbox One.)
I think people give too much importance to such things.
I'm not saying 60fps isn't nice, but it ain't the most important thing, and I feel like draw distance and stuff like that is more distracting.
For me it's crazy to still have racing games where shadows or trees pop in too late, or where things just abruptly disappear in the rear-view mirror.
I generally agree, but it should still be trivially easy to ship the game with at least two options, as most PS5 games do: one high-fidelity mode at 30fps and one performance mode at ~60fps.
The screenshots on the IGN page for the game look like something from last gen. What the fuck? It's not even up to Stray/CP2077 fidelity, and those run on my fucking Steam Deck...