A prominent open-source dev publishes their findings as to what's going on with Starfield's performance, and it's pretty darn strange.
According to Hans-Kristian Arntzen, a prominent open-source developer working on Vkd3d, a DirectX 12 to Vulkan translation layer, Starfield is not interacting properly with graphics card drivers.
Sure, but bundling the latest game with a GPU has been common practice for both green and red for a while. When the Egyptian Assassin's Creed was coming out, I remember seeing a card next to the GPUs at Micro Center saying you got it free with a qualifying purchase.
I took a gamble on the Arc A770 when I built my PC a few months ago, because I honestly am not too keen on the current GPU generation. Why would I want to pay through the nose for cards that are incredibly power-inefficient and have a tendency to catch fire to boot?
The Arc series offered decent performance (save for old DX9 games and such, but I already had a GTX 970 I could use for those if need be) and a shocking amount of memory, so I gave it a shot, and I'm really happy with it.
I have some weird graphical glitches in FFXIV from time to time. It's nothing overly annoying: sometimes a box will flicker on the screen for a frame, and sometimes the lighting fades out briefly. Other than that I've had no issues; it's chugging along really well. My biggest (and only) gripe with the card is that the control centre software doesn't let you remap keybinds. That's pretty dumb.
All that said, I'm not a hardcore gamer by any means. I don't buy all the latest AAA games at launch (often not at all, really), and I don't care much about maxing out my graphics and running at 900 FPS.