I’m so done with Win11, and currently 12 of my 15 machines are Linux anyway, but AFAIK HDR (on an Nvidia GPU) is still impossible? Are you all on AMD, or just not using HDR for gaming/media? So instead of relying on outdated info, I’m just asking the pros :)

  • WolfLink@sh.itjust.works · edited · 1 hour ago

    I’ve got HDR working with an Nvidia card on Bazzite, but the current workaround means I can’t use HDR and Steam Input at the same time. This is using the GNOME variant; I think the situation may be better with KDE.

    • mmus@lemmy.ml · 1 hour ago

      The AMD part is actually the opposite: AMD drivers on Linux can’t do HDMI 2.1, but Nvidia’s can. That’s not quite the whole story, though. You can do HDR over HDMI 2.0, but you’ll be limited to 8 bits per channel, which exhibits pronounced banding, especially noticeable in sky gradients.

      HDMI 2.0 can also do 4K 120 Hz, but only with chroma subsampling. That’s fine at typical TV viewing distances with 2x HiDPI scaling, but it’s bad for desktop use, especially without HiDPI scaling.

      You can also get a DP 1.4 to HDMI 2.1 adapter and get full 10-bit color and 4:4:4 chroma at 4K@120 Hz, no problem. The trouble there is usually VRR, which tends to be finicky or not work at all… :(
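
      To put rough numbers on it (a back-of-envelope sketch that ignores blanking intervals, so real limits are a bit tighter): HDMI 2.0 carries roughly 14.4 Gbit/s of payload after 8b/10b encoding, and you can compare that against what a given mode needs:

```python
# Approximate uncompressed video data rate vs. the HDMI 2.0 payload budget.
# Blanking intervals are ignored, so real-world limits are slightly tighter.

HDMI_2_0_GBPS = 14.4  # ~18 Gbit/s raw minus 8b/10b encoding overhead

def required_gbps(width, height, refresh_hz, bits_per_channel, chroma="4:4:4"):
    """Rough data rate in Gbit/s for one video mode."""
    # Effective channel count per pixel for each subsampling mode:
    # 4:4:4 keeps all three channels, 4:2:2 averages to 2, 4:2:0 to 1.5.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(required_gbps(3840, 2160, 60, 10, "4:4:4"))  # ~14.9 — over budget
print(required_gbps(3840, 2160, 60, 8, "4:4:4"))   # ~11.9 — fits
print(required_gbps(3840, 2160, 60, 10, "4:2:2"))  # ~10.0 — fits
```

      By this math, 4K@60 with 10-bit 4:4:4 just misses the HDMI 2.0 budget, which is exactly the drop-to-8-bit-or-subsample trade-off described above.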

    • Dyskolos@lemmy.zip (OP) · 2 hours ago

      Would’ve preferred a Debian-based distro, but that surely is a point for Cachy. Plus, it’s German/EU. Does it do multiple monitors stress-free too? Thanks!

      • pivot_root@lemmy.world · 2 hours ago

        > Does it do multiple monitors stress-free too? Thanks!

        I’m not the same guy you were talking to, but if you use Wayland, multi-monitor should work without any issues.

  • brucethemoose@lemmy.world · 4 hours ago

    I’m on a Sony OLED with a 3090. I game some, and color grade photos/videos in HDR.

    …And I can’t get HDR to look right in KDE, even with the display running off my AMD IGP. It has basically zero options for me to tweak.

    So I use Windows for that.


    Honestly, it’s hard enough on Windows. It’s a coin flip whether apps work or not. Many games, specifically, need mods to look right.

    • Dyskolos@lemmy.zip (OP) · 4 hours ago

      Damn. Nah, as long as I’d still need Windows, I see no benefit at all in dual-booting. I could live with a VM for the banking stuff or so, but dual-booting? Meh :( And yes, it’s already sucky enough on Win, though Win11 made it better.

      Thanks for your reply!

      • brucethemoose@lemmy.world · 4 hours ago

        Dual booting is not bad!

        What I do is share an NTFS partition between Windows and Linux. If they’re DRM free, you can literally run the same games off the same drive.
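
        A sketch of the Linux side of that shared-drive setup; the UUID and mount point here are placeholders, and it assumes the in-kernel ntfs3 driver (kernel 5.15+):

```
# /etc/fstab — example entry for an NTFS games partition shared with Windows.
# The UUID is a placeholder; find the real one with `blkid`.
UUID=0123456789ABCDEF  /mnt/games  ntfs3  defaults,uid=1000,gid=1000,windows_names  0 0
```

        One caveat: disable Windows Fast Startup first, or Windows leaves the NTFS volume marked dirty and Linux will refuse to mount it read-write.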

        I also use two EFI partitions (the default Windows one and a new one for Linux), so there is zero possibility of the OSes interacting.

        To be blunt, I would never do banking on Windows if you can do it on Linux. It’s just too much of a risk.

        • Dyskolos@lemmy.zip (OP) · 3 hours ago

          I use Macrium Reflect (which I would dearly miss on Linux), so any mistake is just seconds away from being undone, and a complete restore takes mere minutes. But as long as I HAVE to use Win for HDR in games/media, I have to use Win. So dual-booting saves none of the risk unless Win goes into a VM. Wouldn’t even need another one; my domain controllers (and DNS and such) are already Win VMs.

  • artyom@piefed.social · 6 hours ago

    We’re definitely not all on AMD but most of us are.

    Personally I don’t understand what all the hubbub is about HDR anyway. It always makes the picture look all washed out and desaturated so I keep it off. I’m obviously missing something.

    • WolfLink@sh.itjust.works · edited · 1 hour ago

      > It always makes the picture look all washed out and desaturated

      This is a typical symptom when part of the HDR pipeline is working but not all of it. The HDR image is getting (poorly) converted to SDR before it’s being displayed.

      Actual HDR has richer colors than SDR. Note that you basically need an OLED monitor to display it properly; on most LCD monitors that advertise “HDR” support, it won’t look very different.

        • WolfLink@sh.itjust.works · 1 hour ago

          Then you definitely have some settings wrong.

          Make sure the monitor is set to HDR mode, and the OS is set to treat the monitor as HDR. Depending on the OS, there may be other things to play with. E.g., I was getting the washed-out issue after the latest Bazzite update until I manually installed VK_HDR_LAYER.

          Here is a site I usually use to test that HDR is working correctly: https://www.wide-gamut.com/test
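
          On the KDE side (where the situation may be better), Plasma 6 on Wayland also exposes this from the command line. A sketch, assuming a Plasma 6 session; the output name HDMI-A-1 is an example, and ENABLE_HDR_WSI comes from the vk_hdr_layer project:

```sh
# List outputs and their current HDR state (KDE Plasma 6 on Wayland):
kscreen-doctor --outputs

# Enable HDR on a specific output (the name varies per machine):
kscreen-doctor output.HDMI-A-1.hdr.enable

# For Vulkan games going through vk_hdr_layer, the layer is opted into
# via an environment variable, e.g. in Steam launch options:
#   ENABLE_HDR_WSI=1 gamescope --hdr-enabled -- %command%
```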

    • brucethemoose@lemmy.world · edited · 3 hours ago

      Completely disagree.

      Even setting gaming aside, I’ve started taking family/fun photos in HDR instead of JPEG, and on things that can render them (like smartphones and TVs), they are gorgeous. I can’t imagine going back now.

      I took this on a walk this week, completely unedited:

      HDR HEIF

      If your browser doesn’t render that, here’s my best attempt at an AVIF conversion:

      HDR AVIF

      And JPEG-XL:

      HDR JXL

      On my iPhone or a TV, the sun is so bright it makes you squint, and as yellow-orange as real life. The bridge in shadow is dark but clear. It looks just like I’m standing there, with my eyes adjusting to different parts of the picture.

      I love this! It feels like a cold, cozy memory.

      Now if I crush it down to an SDR JPEG:

      It just doesn’t look right. The sun is a washed-out, paper-white blob. And while this is technically not the fault of encoding it as SDR, simply being an 8-bit JPEG crushed all the shadows into blocky grey blobs.


      …This is the kicker with HDR. It’s not that it doesn’t look incredible. But the software/display support is just not there.

      I bet most browsers viewing this post aren’t rendering those images right, if at all.

      Lemmy, a brand new platform, doesn’t even support JXL, HEIF, or AVIF! It doesn’t support any HDR format at all; I had to embed them from catbox.

      • brucethemoose@lemmy.world · 2 hours ago

        Here’s an interesting test:

        Say what you will about Safari and iOS, but it rocks with image format support. HDR JPEG XL and AVIF render correctly, and look like the original HEIF file from the camera.

        Helium (a Chrome fork) is on the left, Firefox on the right, running CachyOS Linux with KDE on a TV, HDR enabled from AMD output.

        Firefox fails miserably :(

        Chrome sorta gets the AVIF right, though it seems to lose some dynamic range with the sun.

    • Dyskolos@lemmy.zip (OP) · 6 hours ago

      Yes, you seem to be missing actual HDR :-) It looks washed out and desaturated if you view SDR content while HDR is enabled. Or the monitor can’t do it. Or whatever else. I even have problems getting Jellyfin right on Windows; that thing needs a separate app to actually work. So HDR’s the only good thing about Win11, as it mostly works there.

      I really wanna finally ditch that horrorshow, but going back to SDR feels like going back from 4K to 480p.

      • brucethemoose@lemmy.world · edited · 4 hours ago

        I suggest you strip Windows to the bone (including Defender), dual boot Linux, and relegate Windows to an “HDR media OS”.

        This is what I do, and it works well. Sufficiently neutered, Windows is really quick and out of the way.

        • Dyskolos@lemmy.zip (OP) · 3 hours ago

          It’s already neutered, but dual-booting really isn’t an option. As long as Win remains a bootable option, why even add another one? I see no benefit in running both and wasting time switching regularly. Soon I wouldn’t even switch, and I’d just ditch HDR :)

          • brucethemoose@lemmy.world · edited · 3 hours ago

            I don’t really see the logic to that, as switching is near effortless. It takes a couple of seconds to reboot and select the other OS. Ditching HDR, on the other hand, is painful.

            Each to their own though.

            • Dyskolos@lemmy.zip (OP) · 2 hours ago

              I want to leave Win. If I’d still have to keep it (and not as a very specific VM), why go through the hassle of booting anything else, when everything is already working now? Win is still there and still has to be maintained. I would not gain anything but double the work.

              • brucethemoose@lemmy.world · 2 hours ago

                You’d gain HDR!

                Windows is nearly effortless to maintain if you only use it for entertainment.

                Maybe we just have different priorities, but right now, I’d be miserable if I was stuck on Linux only, even though I use Linux like 90% of the time. Most media and some games just don’t look right.

                • Dyskolos@lemmy.zip (OP) · 2 hours ago

                  Err, I already main Windows (on my rig, that is, not the servers). I have HDR and everything else; I would not gain anything I don’t have right now by dual-booting Linux. Except the illusion of having ditched Windows while being on Linux :)

                  Everything else I could somehow cope with losing, replace, or code from scratch for Linux. Except HDR :(

    • Overwrite7445@lemmy.ca · 6 hours ago

      Washed out and desaturated is the opposite of what it should do. Sounds like you may be looking at SDR (non-HDR) content with HDR enabled?

      • artyom@piefed.social · edited · 6 hours ago

        Is Linux not able to switch HDR on and off as necessary?

        I usually do this in-game, so why would a non-HDR game have an HDR toggle?

        • moody@lemmings.world · 2 hours ago

          You may have to enable HDR for Linux on each monitor individually from the display settings, and then enable HDR for the game itself from within its own settings.

          • artyom@piefed.social · edited · 2 hours ago

            I only have 1 (large) monitor. I toggle it on and everything looks like butt. Then toggle it off and it looks normal again. It’s an HDR OLED display.

  • MentalEdge@sopuli.xyz · 5 hours ago

    My screen only does HDR600, but it does work.

    It looks a little nicer than with it off, so I do keep it on. SDR content does not suffer.

    I’m on KDE Wayland with an AMD GPU.

    • Dyskolos@lemmy.zip (OP) · 5 hours ago

      Sounds great, besides the AMD part. So it might work, but not with my nvidia :( Thanks!

        • Dyskolos@lemmy.zip (OP) · 4 hours ago

          Nah, I don’t wanna spend more time fiddling than actually doing something, so no Arch for me :) Although CachyOS is based on Arch, IIRC. Either way, good to know!

          Thanks!

          • MentalEdge@sopuli.xyz · edited · 4 hours ago

            It is.

            If you’re on Cachy, you should be able to apply any relevant changes the same as on Arch.

            But if I’m reading the wiki right, you should already be good to go, provided you’re on a DE that supports HDR.

  • Zikeji@programming.dev · edited · 2 hours ago

    I have the same issue with HDR on Linux that I had on Windows: support. But on KDE Plasma it works well enough that I usually forget it’s working. Before I switched to KDE Plasma, I was on GNOME, and it was a tad more difficult.

  • HubertManne@piefed.social · 4 hours ago

    15 machines? Do you use a rack? Seriously though, much like the other responder, HDR and most video things aren’t important to me, and I tend to adopt them only once they’re on the most affordable options. Granted, there are certain things important to me that I will go for earlier: energy efficiency, or more environmentally friendly disposal (I haven’t yet encountered anything that’s significantly better, but it would sway me). I’ve also wanted to get OLEDs, because having very black blacks is a big deal for me. So basically I guess I’m saying I mostly don’t care about that stuff.

    • Dyskolos@lemmy.zip (OP) · 3 hours ago

      5 physical, the rest VMs. I’m not crazy :-) It wasn’t important to me either, until I switched just because the new monitor could do it. Now I wouldn’t wanna go back; it would feel like going from 4K to 480p. Couldn’t care less about energy efficiency though; according to my provider, I already use an 8-person average for my hobbies alone :)

      • HubertManne@piefed.social · 1 hour ago

        I’m so bad I get annoyed when I can’t stream things at standard def and have to do at least 720p, since I want to limit the bandwidth I’m using.

  • Björn@swg-empire.de · 6 hours ago

    I do know that Valve has been working closely with KDE to get HDR working there. So check that you’re on the latest Plasma desktop, and you’ll probably need Wayland.

    Beyond that I have no idea. I don’t have any HDR capable device.

    • Dyskolos@lemmy.zip (OP) · 5 hours ago

      With no device it sure won’t faze you at all :) I do really regret having made the HDR switch back then.