I set this up today on my work laptop with an (internal) RTX 3060. According to the status indicators on the Adjust Video Image Settings page in the Nvidia control panel, super resolution is working in Chrome v124.x and v125.x but not at all in Firefox v126.0. My eyes tell me the same thing: I was able to play a 480p YT stream in Chrome and it looked surprisingly good on my external 1440p monitor, while in FF it looks like ass. I may set up a secondary profile just to make sure I haven't changed some config setting over the years that would prevent it from working right in FF. Will update if I find anything interesting.
Edit: Just tried this again with YouTube in FF v126.0 with a clean profile. It does work, but only when the video is full screen (which makes sense I guess, but the behavior is different from Chrome), and I had to manually set the quality level in the Nvidia control panel. In Chrome the auto setting used level 4 (the highest level), but in FF the auto setting only used level 1.
Weird. I’m on a desktop with an RTX 3080 and both super resolution and HDR are working just fine for me, full screen or not. Results are actually quite good.
From what I can see, the default auto level depends on the source resolution and the display resolution, so it varies depending on how and what you’re watching.
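Just to make that idea concrete, here’s a toy sketch of what a resolution-based "auto" heuristic could look like. To be clear, Nvidia hasn’t published how auto actually picks a level, so the function name, thresholds, and logic below are all made up for illustration:

```python
# Hypothetical only: Nvidia hasn't documented how "auto" chooses a VSR quality
# level, so these thresholds are invented to show how a scale-factor heuristic
# could end up at level 1 in one setup and level 4 in another.

def guess_auto_vsr_level(source_height: int, display_height: int) -> int:
    """Guess a quality level (0-4) from the upscale ratio (made-up thresholds)."""
    ratio = display_height / source_height
    if ratio <= 1.0:
        return 0   # no upscaling needed
    elif ratio < 1.5:
        return 1   # mild upscale -> cheapest level
    elif ratio < 2.0:
        return 2
    elif ratio < 2.5:
        return 3
    else:
        return 4   # big jumps like 480p -> 1440p get the heaviest level

if __name__ == "__main__":
    print(guess_auto_vsr_level(480, 1440))   # 3x upscale -> 4
    print(guess_auto_vsr_level(1080, 1440))  # 1.33x upscale -> 1
```

Something along those lines would explain Chrome’s auto landing on level 4 for a 480p stream on a 1440p monitor, though it wouldn’t by itself explain Firefox picking level 1 for the same video.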
I mean, how many Firefox users can even use this? It requires a new GPU and a compatible monitor.
Isn't that exactly why it gets the hate?
Mozilla should focus on adding features everyone can use, not gimmicks from Nvidia that require you to buy their GPU and their approved monitors. Plus there's Nvidia's history with Linux, which is a popular OS among Firefox users...
AMD doesn't require that shit for, say, FreeSync or FSR.
Nvidia, Wayland issues aside, is still the superior card 9 times out of 10. This isn't a gimmick to get people to buy Nvidia; most of us will buy it anyway. During my last purchase I pondered whether to get an AMD card and move over to Wayland, or an Nvidia card that would let me locally generate images of whatever I wanted. It was a pretty easy decision: I'll stick with X11 and Nvidia until the end. Stuff like this is just a bonus.
Yeah, and the target isn't regular users, but developers. You need software to support it before it impacts users.
RTX will become more mainstream as time goes on, and software will use it more and more as that happens. The important thing is that developers have the tools they need and can provide feedback before RTX is completely mainstream.
The problem with AI upscaling is that it does add something. It fills in the details with things that could plausibly be there, regardless of whether they actually are. It's especially dangerous if it's used for something like security footage, where it'll do stuff like make up a face from a few pixels.
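A tiny toy example of why that's unavoidable (nothing to do with any real upscaler; everything here is made up for illustration): different high-res patches can downsample to the exact same low-res pixels, so the low-res footage simply doesn't contain enough information to say which one was really there, and the model has to guess.

```python
# Toy demo: two different 2x2 grayscale patches that average down to the exact
# same low-res pixel. An upscaler reversing this has no way to know which
# original existed, so any detail it "restores" is a plausible guess, not a fact.

def downsample_2x2(patch):
    """Average a 2x2 patch of grayscale values into one low-res pixel."""
    return sum(patch) / len(patch)

edge_patch = [10, 250, 250, 10]     # strong detail (an edge or feature)
flat_patch = [130, 130, 130, 130]   # no detail at all

print(downsample_2x2(edge_patch))   # 130.0
print(downsample_2x2(flat_patch))   # 130.0 -> identical evidence in the low-res image
```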