I definitely can. The 4090 is powerful in theory, but I've had doubts about its efficiency and software utilization, since most of my friends have been getting better results from both 1080s and 3070s in a lot of modern games. I think some shortcuts were taken at some point in development and there are a lot of bugs left to work out.
I've also noticed this. I hear practically no complaints from people with 1080s or 3070s. Could be wrong, but I think I remember nvidia fucking with bus width a lot over the past few generations.
I would almost guarantee it's some sort of software efficiency and load-distribution issue. A couple of close family friends used to develop for Western Digital, and their software was always developed well after the hardware, usually underfunded and shortchanged on time, then finished through patches after the product dropped. Hardware is just an absolute bitch to develop for because you don't really know what it can and can't do until it hits the market and the big problems start cropping up. No small test sample can match a full production run, or all the issues from other people's stuff interacting with your product.
I don't doubt that the 4090 will be a monster in a couple of years, but I genuinely believe they pulled a Cyberpunk with the hardware here, and that's why so many people are complaining. They paid an ungodly sum for a graphics card that looks great in benchmarking software but doesn't perform the way it should in the field, so they complain en masse whenever those problems crop up in a specific game.