• ipkpjersi@lemmy.ml · 5 points · 1 day ago

    HDR on a 300 nit monitor is like trying to watch a movie with your eyes mostly closed lol

    I dunno why 300 nit monitors advertise HDR. Except I do: it’s an extra “selling” feature, despite being useless at that brightness.

    • heythatsprettygood@feddit.ukOP · 2 points · 1 day ago

      VESA honestly should ban manufacturers from certifying anything below DisplayHDR 500 as HDR, because the specification is so watered down at HDR 400 and below (sRGB only? Really? Why not WCG like every tier from 500 up?). HDR on a 300 nit display is terrible, and manufacturers should be embarrassed to sell those as HDR.

    • frezik@midwest.social · 8 points · 1 day ago

      Have you seen videos or pictures that have dark sections, and there’s “banding” where there’s a noticeable difference between something black and something very black? Like a sharp border where it’s obvious the conversion process from the camera to your screen didn’t fully capture a gradient of darkness?

      That’s because the signal doesn’t have enough distinct brightness levels to describe a smooth gradient, and it shows up worst in dark areas sitting next to very bright ones. It’s not enough to have an HDR display; the whole chain from camera to screen has to support it as well. When it’s done right, not only does the banding go away, but finer elements in darker areas can pop out and join the rest of the scene.
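      A quick way to see why (just an illustration of the bit-depth idea, not how any particular pipeline is implemented): take the same dark gradient, quantize it at 8 bits and at 10 bits, and count how many distinct steps are left.

      ```python
      import numpy as np

      # A smooth dark gradient: linear light from 0% to 5% of peak, 1920 samples.
      gradient = np.linspace(0.0, 0.05, 1920)

      def quantize(signal, bits):
          """Round each value to the nearest code at the given bit depth."""
          levels = 2 ** bits - 1
          return np.round(signal * levels) / levels

      for bits in (8, 10):
          steps = len(np.unique(quantize(gradient, bits)))
          print(f"{bits}-bit: {steps} distinct levels across the dark gradient")

      # 8-bit leaves only ~14 levels for this whole range, so each step covers a
      # wide strip of the image and shows up as a visible band; 10-bit leaves ~52.
      # Real video also applies a transfer curve (gamma or PQ) before quantizing,
      # but the bit-depth point is the same.
      ```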

    • kieron115@startrek.website · 3 points · 1 day ago

      If you have a theater nearby that offers Dolby Vision films, you can try out a version of HDR. They use laser projectors, so the blacks can really be pure black. When the screen goes dark just before the movie, the entire theater will be pitch black except for emergency lighting. It’s glorious.

    • heythatsprettygood@feddit.ukOP · 5 points · 1 day ago

      In a nutshell, it increases the range of brightness values (luminance, to be specific) that can be sent to a display. This allows content both to be brighter and to show colours more accurately, since there are far more brightness levels that can be depicted. That means content can look more lifelike, or have more “pop” from certain elements being brighter than others. There’s more too, and it’s up to the game/movie/device what it does with all this extra information it can send to the display. It’s especially noticeable on an OLED or QD-OLED display, since those can dim or brighten every pixel individually. Nits in this context refer to the brightness of the display - 1000 nits is far brighter than most conventional displays (which are usually in the 300-500 nit range).
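      For anyone curious how those nit figures relate to the actual signal, here’s a rough sketch (purely illustrative) of the PQ transfer function HDR10 uses (SMPTE ST 2084), which maps a 10-bit code value to an absolute luminance in nits:

      ```python
      # Minimal sketch of the PQ (SMPTE ST 2084) EOTF used by HDR10: it turns a
      # normalised signal value (a 10-bit code divided by 1023) into an absolute
      # luminance in nits, topping out at 10,000. Constants are from the spec.
      M1 = 2610 / 16384
      M2 = 2523 / 4096 * 128
      C1 = 3424 / 4096
      C2 = 2413 / 4096 * 32
      C3 = 2392 / 4096 * 32

      def pq_to_nits(signal: float) -> float:
          """Convert a normalised PQ signal (0.0-1.0) to luminance in nits."""
          p = signal ** (1 / M2)
          return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

      for code in (256, 512, 768, 1023):  # a few 10-bit code values
          print(f"code {code:4d} -> {pq_to_nits(code / 1023):7.1f} nits")

      # Half signal is only ~93 nits and three-quarters is ~1000 nits; everything
      # above that is highlight range, which is exactly what a 300-500 nit panel
      # has to clip or compress.
      ```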

    • Blackmist@feddit.uk · 1 point · 1 day ago

      I think that’s mostly because it’s (a) 60 fps, and (b) they use a lot of pure black to really bring out the contrast on OLED screens.

    • heythatsprettygood@feddit.ukOP · 14 points · 2 days ago

      If you ever get the opportunity, try out HDR ITM (inverse tone mapping - essentially an SDR-to-HDR upconversion you can do with Gamescope on Linux) playing Persona 3 Reload on a QD-OLED monitor (for that extra brightness) in a dark room. Even though it’s not a native HDR game, with ITM it looks so good, especially because it has a lot of dark graphics mixed in with super bright ones. The text pops, and combat is next-level.

        • heythatsprettygood@feddit.ukOP · 1 point · 1 day ago

          I haven’t experienced issues with oranges on my setup (AW3423DWF, 7900 XTX). Perhaps it is to do with your hardware?

          • Semperverus@lemmy.world · 1 point · 1 day ago

            It happens on several monitors and my TV, and it happens with both my desktop and my Steam Deck, even with the HDR saturation set to “SDR.” It’s like the red channel gets crushed upwards.

            Maybe it’s a configuration issue on my part? Or maybe it’s the panel brand? I do have a lot of LG screens, but then you’d think it wouldn’t be an issue elsewhere either…

            Any ideas are welcome though, hoping to fix it so the family and I can start enjoying HDR more.

            • heythatsprettygood@feddit.ukOP · 2 points · edited · 1 day ago

              Are you using tone mapping through the Steam UI (I think the Deck has its own controls for HDR inverse tone mapping) or through command line options on the games? If you are using the UI, it might be worth trying the command line toggles instead, in case the UI is applying some wrong settings. If it helps, the full set of launch options I use on my system is at the end of this comment.

              It’s also worth looking through the display’s own settings to see if it’s in any sort of colour-boosting HDR mode - my Alienware had to be set to “HDR Peak 1000” for colours to look as they should, as by default it messes around with things a bit. And if you can, try some other devices that can output HDR (like a game console or Blu-ray player) to see if they look a bit red too - if so, it’s to do with the display, and if not, it’s a configuration issue.
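              These are the launch options I use - the notes are just my reading of what each flag does, and the 165 Hz, 3440x1440, and nit values are for my display, so swap in your own:

              ```sh
              # HDR inverse tone mapping through Gamescope, set as a Steam launch option.
              #
              #   DXVK_HDR=1 ENABLE_HDR_WSI=1       ask DXVK and the Vulkan HDR layer for HDR
              #   -f -r 165 -W 3440 -H 1440         fullscreen, refresh rate, resolution
              #   --adaptive-sync                   pass through VRR
              #   --hdr-enabled                     send an HDR signal to the display
              #   --hdr-itm-enable                  inverse tone map SDR content up to HDR
              #   --hdr-itm-sdr-nits 350            brightness the SDR source is treated as
              #   --hdr-sdr-content-nits 800        brightness used for SDR content in HDR
              #   --hdr-itm-target-nits 1000        peak brightness the ITM curve aims for
              #   gamemoderun -- %command%          the game itself, run through gamemode
              #
              # It all goes into Steam's "Launch Options" box as a single line:
              DXVK_HDR=1 ENABLE_HDR_WSI=1 gamescope -f -r 165 -W 3440 -H 1440 --adaptive-sync --hdr-enabled --hdr-itm-enable --hdr-itm-sdr-nits 350 --hdr-sdr-content-nits 800 --hdr-itm-target-nits 1000 gamemoderun -- %command%
              ```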

              • Semperverus@lemmy.world · 1 point · 1 day ago

                I have not started using the launch flags yet; I’ll have to give those a try. I wonder if it’s possible to set those nits values globally per display in a config file somewhere?

                • heythatsprettygood@feddit.ukOP · 1 point · 1 day ago

                  As far as I know, no. I guess it’s not exactly a good idea globally anyway, as some games need different settings. For example, there are one or two that don’t like ITM and will have display corruption (at least last time I tested, possibly fixed now), and I have to use some extra flags to get TF2’s mouse controls working in Gamescope.

      • WolfLink@sh.itjust.works · 4 points · 2 days ago

        I don’t know if I want my text to be that bright TBH.

        I have a 1000 nit monitor and IMO it looks best with things like stars in a night sky or lights in a dark cave.

        When too much of the screen is bright at once, it hurts my eyes. That sometimes happens when playing Destiny 2, which generally has a pretty good HDR implementation, but when too many explosions happen at once it gets overwhelming.

    • dustyData@lemmy.world · 6 points · edited · 2 days ago

      Really good 4K+ scans of film made with HDR in mind can look even better than that. That video demo has some amateurish lighting here and there. Hollywood-level professional lighting and production looks extraordinary when the tone mapping was done properly during the digital scan. Right now there’s a lot of bad quality HDR and sketchy OLED panels around, though.
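      To give a rough idea of what tone mapping means here (a toy operator, nothing like what a real scanning or grading pipeline uses): the job is to squeeze the huge luminance range of the scanned scene into the range the target can actually show, rolling the highlights off instead of clipping them.

      ```python
      # Toy tone mapping sketch: an extended Reinhard curve that maps a chosen
      # scene "white" to the target's peak brightness while leaving shadows and
      # midtones almost untouched. Purely illustrative numbers.

      def reinhard_extended(lum: float, white: float) -> float:
          """Map normalised scene luminance to [0, 1]; `white` maps to 1.0."""
          return lum * (1 + lum / white ** 2) / (1 + lum)

      peak_nits = 1000                         # peak of the target display/master
      scene_nits = [0.5, 10, 100, 1000, 4000]  # luminance values in the scan

      for nits in scene_nits:
          mapped = reinhard_extended(nits / peak_nits, white=4000 / peak_nits)
          print(f"{nits:7.1f} scene nits -> {mapped * peak_nits:7.1f} display nits")

      # Shadows pass through nearly 1:1, while 4000 nit highlights land at the
      # 1000 nit peak instead of blowing out - a badly chosen curve (or none at
      # all) is a big part of why low quality HDR masters look wrong.
      ```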

  • _cryptagion [he/him]@lemmy.dbzer0.com · 12 points, 1 downvote · 2 days ago

    I play games at night; there’s no way I’m using a monitor at 1000 nits. My screen only goes up to 500 nits, and I still usually have it set to 50 nits if the game is really dark.

  • ShinkanTrain@lemmy.ml · 16 points · 2 days ago

    1000 nits is actually not that much; a full moon is about 2500 nits, and a lot of movies are mastered at 4000 or even 10,000 nits.

    • heythatsprettygood@feddit.ukOP · 26 points · 2 days ago

      In a small room where it’s the only light source, it’s still a crazy amount of light. My eyes genuinely had to get used to the brightness for a couple of minutes after I set it up for the first time, and the walls sometimes looked like the ceiling light was on.

  • afk_strats@lemmy.world · 8 points · edited · 2 days ago

    I couldn’t get Nvidia to work reliably with HDR. It would run, but I’d always get a crash after a bit.

    I’m on Bazzite with a 3080 Ti and GE-Proton.

    • heythatsprettygood@feddit.ukOP · 1 point · 2 days ago

      What sort of system are you on, and what have you been trying? The best setup is an AMD GPU and a more up-to-date distro (Fedora, Arch, and so on). I can give some help if you need it.

    • Synapse@lemmy.world · 1 point · 2 days ago

      For me it only works well when watching an HDR video in mpv; everything else looks washed out and pale, including the entirety of Firefox and its content.

  • Oniononon@sopuli.xyz · 4 points, 6 downvotes · 2 days ago

    Never had HDR, don’t know how to miss it. Must be like Dolby Atmos or 4K, where it’s a scam from hardware manufacturers to sell you shit that does nothing but make you buy more.