129 comments
  • You're not buying a triple A game anymore. You're buying the idea of the game they want to sell you, and hoping they deliver.

  • Increasing complexity, tighter deadlines, demand for higher profit margins, decrease in education quality. There's a lot of reasons, and not all of them are necessarily bad. It's good that we can simulate what we can. I think the profit motive is just starting to show its ruinous powers as shareholders demand more and more.

  • Unfortunately, it's also here again with 2.0 so far. I started playing the game in 1.3, so this is the most buggy I've ever seen it. Vertex explosions, jumpy character animations, skills not working correctly, incorrect sound effects being played.

    This is indeed the new normal, and I shouldn't expect Phantom Liberty to run smoothly next week either. It took months after the recent big Witcher 3 update for it to play okay on mid-spec systems.

    I think I was happier when I was still catching up on games from a couple generations ago. Now that I've done that, I keep running into this stuff. 😕

  • It was a very different experience for me. I had a blast playing this game when it first released and didn't hit any game-breaking moment. This could be due to me playing on PC? So with the latest patch I loaded up my V and found nothing of merit had changed. Seriously, I found it to be the same game with small UI and skill changes. I was shocked, to say the least, given the enormous patch size. I still haven't left training in NC, so I might find more changes, but so far it's still an enjoyable game.

    • You had to be one of the rare lucky ones then.

      I played on PC and it was an absolute buggy, sometimes crashy, mess.

  • It’s weird - when I played at launch, I had precisely one bug that impacted my gameplay. Other than that, the game ran pretty smoothly and was a joy to play.

    Now mind you, I was playing on a PC with a Xeon, 64GB of RAM, and an RTX 2080 Ti. Nothing ran badly on that system three years ago. Nowadays the older CPU, slower RAM, and admittedly older GPU without all the newest bells and whistles (DLSS Frame Gen, I’m looking at you) can’t quite measure up to the latest titles.

    Cyberpunk, at launch, was great. For me. Specifically for me. I loved it and still do. But this article hits a point for me that I’ve been struggling to find reason to write about without feeling like I’m ignoring people who primarily play on consoles or can’t afford a nice PC. Regardless…

    Man it fuckin’ sucks how you can spend a huge amount of money on a new GPU and then four months later a new one comes out that blows it out of the water. New hardware is so much better, and - because all the game devs are using that hardware to design their games both on and for - systems like mine that are still fairly new can’t run the latest games at high settings anymore.

    It used to be that if you ponied up the money for a high-end rig, you could expect decent performance for years to come. But I guess blowing a grand on a GPU these days just means you’ll be doing it again in a year or something, instead of the decade or so before.

    I’m not saying my PC is bad. Most of what I play runs excellently. But when I spend a grand on just a GPU I expect that GPU to run the newest games at high settings for a long time. Jedi Survivor, Starfield, both run like crap on my system. Never mind the 2TB NVMe drive everything’s installed on.

    But I’m just bitching to bitch. Ignore me.

  • That's not a new thing though.

    I first learned the wisdom of waiting until the bulk of the bug-squashing was done before expecting a reasonably stable game with Oblivion, 17 years ago.

    Granted that Cyberpunk 2077 was a particularly egregious example of the problem, but still...
