• sgibson5150@slrpnk.net · ↑11 · edited · 7 months ago

    I set this up today on my work laptop with an (internal) RTX 3060. According to the status indicators on the Adjust Video Image Settings page in the Nvidia control panel, super resolution is working in Chrome v124.x and v125.x but not at all in Firefox v126.0. My eyes tell me the same thing. I was able to play a 480p YT stream in Chrome and it looked surprisingly good on my external 1440p monitor. In FF it looks like ass. I may set up a secondary profile in FF just to make sure I haven’t changed some config setting over the years that would prevent it from working right in FF. Will update if I find anything interesting.

    Edit: Just tried this again with YouTube in FF v126.0 with a clean profile. It does work, but only when the video is full screen (which makes sense I guess, but the behavior is different from Chrome) and I had to manually set the quality level in the Nvidia control panel. In Chrome the auto setting used level 4 (the highest level), but in FF the auto setting only used level 1.

    • geekwithsoul@lemm.ee · ↑4 · 7 months ago

      Weird. I’m on desktop with an RTX 3080 and both super resolution and HDR are working just fine for me in both full screen and not. Results are actually quite good for me.

      I think the default setting for auto depends on source resolution and desired display resolution from what I can see, so it’s variable depending on how and what you’re watching.

      You on Windows 10 or 11?

      • sgibson5150@slrpnk.net · ↑3 · 7 months ago

        Sorry. Should have mentioned. OS is Windows 11 Pro 23H2 22631.3593. Also, video driver is Nvidia Game Ready Driver 552.44.

        • geekwithsoul@lemm.ee · ↑1 · 7 months ago

          Interesting - I’m running the same driver version but on latest version of Windows 10 Pro. In FF, under about:config, is gfx.webrender.enabled or gfx.webrender.all set to true? If not, that might be part of it.
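          For reference, those prefs can also be set persistently through a user.js file in the Firefox profile folder (a sketch; only the two pref names mentioned above are assumed to be relevant):

```js
// user.js in the Firefox profile directory; prefs listed here are
// applied on every startup. These force WebRender compositing on,
// per the suggestion above.
user_pref("gfx.webrender.all", true);
user_pref("gfx.webrender.enabled", true);
```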

          • sgibson5150@slrpnk.net · ↑1 · edited · 7 months ago

            On the new clean profile I created in v126.0, I didn’t have a gfx.webrender.enabled and gfx.webrender.all was set to false. Changing gfx.webrender.all to true didn’t really change the behavior. Nvidia control panel only shows super resolution active when full screen. Watching the same test video as yesterday at the same requested resolution. I did notice that if I set the Quality back to auto, with gfx.webrender.all = true, it picked 2 today instead of 1. 🤷‍♂️

            Edit: One DDG search later https://support.mozilla.org/en-US/questions/1445419

      • taladar@sh.itjust.works · ↑19 ↓2 · 7 months ago

        One of the last browsers out of the two that exist (ignoring those that don’t really develop any of those features themselves)?

          • taladar@sh.itjust.works · ↑1 ↓1 · 7 months ago

            Safari is also just one of the forks of the KHTML/WebKit/Blink codebase Chrome is based on. Admittedly, they probably implement much of what they do implement themselves, since the common ancestor version is quite a long time ago now.

            • Turun@feddit.de · ↑1 · 7 months ago

              They don’t incorporate Chromium changes into Safari, so it should be considered separate.

      • frankgrimeszz@lemmy.world · ↑11 ↓2 · 7 months ago

        It’s a vendor specific feature, as opposed to something any graphics chip can use. It’s kinda like… endorsing a closed source driver feature.

        • Carighan Maconar@lemmy.world · ↑2 · 7 months ago

          Meh, with games we want them to work independent of which type of controller we use, but display each driver’s specific button graphics as needed. I see no difference here. Do I want dynamic upscaling and auto-HDR for all graphics cards? Sure! Do I still want it optimized for each type of graphics card unless the hardware makers can - unlikely - present a unified API? Of course I do.

    • sgibson5150@slrpnk.net · ↑4 · 7 months ago

      From my testing today I found that this actually works pretty well (though not in FF haha). See my top-level comment in this thread.

    • underscores@lemmy.dbzer0.com · ↑4 · edited · 7 months ago

      The problem with AI upscaling is that it does add something. It fills in the details with things that could plausibly be there, regardless of whether they are. It’s especially dangerous if it’s used for something like security footage, where it’ll do stuff like make up a face based on a few pixels.

        • underscores@lemmy.dbzer0.com · ↑4 ↓1 · 7 months ago

          Depending on the context it’s probably not that bad, but there’s plenty of details in youtube videos that people pay attention to, like in news, history, tutorials, educational content, and so on. Even for a fictional story, it could add nonsense that people assume is part of the actual show.

          • Ferk@programming.dev · ↑4 · edited · 7 months ago

            If the original footage is so bad that “nonsense that people assume is part of the actual show” “could plausibly be there”, then the problem is not with the AI… it wouldn’t be the first time I’m confused by the artifacts in a low quality video.

          • lud@lemm.ee · ↑3 · 7 months ago

            At worst, upscaling will make the video worse. It won’t add aliens or some shit to your videos.

            • underscores@lemmy.dbzer0.com · ↑2 · edited · 7 months ago

              AI upscaling fills detail in based on training from other images/videos. So it probably won’t be an alien, but small details common in other videos that looked similar will also show up in the upscaled video. If an extra flower shows up in a field of grass it’s usually not a big deal, but for some things like faces or symbols, small details can really change the way people interpret it.

              • lud@lemm.ee · ↑1 · 7 months ago

                I highly doubt the upscaling could do anything close to adding faces or something.

                The best it could do with a blurry face-like thing is probably just to make it look sharper.
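                The distinction being argued here can be sketched: classical (non-AI) interpolation only blends existing pixels, so every output value is bounded by the values already in the source, while a learned upscaler has no such bound and can synthesize detail. A minimal bilinear upscaler (illustration only, not Nvidia’s actual algorithm):

```python
# Minimal bilinear upscaler for a 2D grid of grayscale values.
# Illustration only (not Nvidia's algorithm): classical interpolation
# can only blend neighboring input pixels, so it never "invents" a
# value outside the range already present in the source image.

def bilinear_upscale(img, factor):
    """Upscale a list-of-lists image by an integer factor."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * factor):
        sy = y / factor                  # position in source coordinates
        y0 = min(int(sy), h - 1)
        y1 = min(y0 + 1, h - 1)
        fy = sy - y0
        row = []
        for x in range(w * factor):
            sx = x / factor
            x0 = min(int(sx), w - 1)
            x1 = min(x0 + 1, w - 1)
            fx = sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

src = [[0.0, 1.0], [1.0, 0.0]]
big = bilinear_upscale(src, 4)
# Every output pixel is a convex combination of inputs, so it stays
# within the source's value range; no "alien" values can appear.
assert all(0.0 <= v <= 1.0 for row in big for v in row)
```

                A learned upscaler offers no such convexity guarantee, which is exactly why it can both sharpen plausibly and hallucinate.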

  • Johanno@feddit.de · ↑10 ↓8 · 7 months ago

    I don’t get the hate.

    I mean, how many Firefox users can even use this? It requires a new GPU and a compatible monitor.

    And where is the use case? Videos in the browser? Those are usually chopped down anyway. Upscaling will not help there.

    So it’s a cool feature for the 10 people who can use it.

    • Midnitte@beehaw.org · ↑18 ↓2 · 7 months ago

      I mean how many Firefox users can even use this? Requires new gpu including compatible monitor.

      Isn’t that exactly why the hate?

      Mozilla should focus on adding features everyone can use, not gimmicks from Nvidia that require you to buy their GPU and their approved monitors. Plus consider Nvidia’s history with Linux, which is a popular OS for Firefox…

      AMD doesn’t require that shit for, say, FreeSync or FSR.

      • PolarisFx@lemmy.dbzer0.com · ↑4 · 7 months ago

        Nvidia, Wayland issues aside, is still the superior card 9/10 times. This isn’t a gimmick to get people to buy Nvidia; most of us will buy it anyway. During my last purchase I pondered whether I should get an AMD card and move over to Wayland, or an Nvidia card that would let me locally generate images of whatever I wanted. It was a pretty easy decision: I’ll stick to X11 and Nvidia until the end. Stuff like this is just a bonus.

    • swayevenly@lemm.ee · ↑11 · 7 months ago

      You think only 10 people have an RTX GPU?

      Also super resolution and HDR are separate checkboxes.

    • breakingcups@lemmy.world · ↑3 ↓1 · 7 months ago

      Well, you’ve pointed out one of the issues. Was it really worth dedicating an engineer’s limited time to?