I was recently lucky enough to buy an OLED monitor and it’s great. What is not so great is the amount of flickering I get in Gnome now when I have the experimental VRR setting enabled.

Now, all OLED monitors have a certain amount of VRR flicker, but I am comparing it to my Windows dual boot and it’s absolutely terrible under Gnome: a noticeable increase in the amount of flicker in both games and the desktop versus Windows. The only way I get Windows to flicker as much on the desktop is if I turn on “dynamic refresh rate”, which kind of appears to be what Gnome is doing all the time. I can turn on the refresh rate panel on my monitor and watch Gnome fluctuate all over the place, even on the desktop, whereas Windows is steady at max refresh (again, once I turn off dynamic refresh rate, which is a separate setting from VRR).

For games, the flicker is way worse using Proton under Wayland (which GE supports). Hunt: Showdown, which I play a lot, looks incredibly flickery when vsync and Wayland are turned on; it basically has a strobing effect.

Anyone else seen this in action? Any suggestions for a fix? Should I swap over to KDE for a bit until Gnome gets this straightened out or will Plasma have the same problems?

  • ErableEreinte@lemmy.ca · 3 days ago

    Setting a high refresh rate is somewhat of a given, but it won’t negate the thing VRR helps with: screen tearing. If you’re always playing with VSync on and getting constant frame rates, that’s not an issue, but that’s also far from the usual experience.

    • tal@lemmy.today · 3 days ago

      Setting a high refresh rate is somewhat of a given, but it won’t negate the thing VRR helps with: screen tearing.

      I mean, I’d just turn on vsync; that’s what it’s for. VRR is to let you push out a frame at the instant that it finishes rendering. The benefit of that declines as the monitor refresh rate rises, since there’s less delay until the next frame goes to the monitor.

      If you’re always playing with VSync on and getting constant frame rates, that’s not an issue

      looks blank

      Constant framerates? You’re saying that you get tearing with vsync on if whatever program you’re using can’t handle rendering at whatever the monitor’s refresh rate is? I mean, it shouldn’t.

      Running a static refresh rate with vsync will add a tiny bit of latency until the image shows up on the screen relative to VRR, but that’s a function of the refresh rate; that falls off as the refresh rate rises.

      • million@lemmy.world (OP) · 3 days ago

        https://www.reddit.com/r/XboxSeriesX/comments/t3fn6l/can_someone_explain_vrr_like_im_5_what_it_does/

        Ok, so let’s say your TV is a typical 60 Hz TV; that means it updates 60 times a second, regardless of the game’s frame rate. A 60 fps game will be in perfect sync with your TV, as will 30 fps, because each frame will just be displayed twice. When your game is running at a frame rate in between, it’s not in sync with the display any more and you end up with screen tearing, as the image being sent to the TV changes part way through the image being displayed.

        VRR stands for Variable Refresh Rate. It basically means the display’s refresh rate can vary to match the source of the image, so that it always stays in sync.

        This is a pretty good explanation of what VRR is doing. Basically it makes it so you can drop frames and it still feels smooth.
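
        A toy Python model of the “in sync” part, since it’s easy to check the arithmetic (this is my own illustration, not from the linked post): with vsync off, you see a tear whenever a buffer flip lands mid-scanout on a fixed 60 Hz display.

        ```python
        from fractions import Fraction

        REFRESH_HZ = 60

        def mid_scanout_flips(fps):
            """Count buffer flips in one second that land mid-scanout (= tear lines)."""
            scanout = Fraction(1, REFRESH_HZ)
            tears = 0
            for i in range(fps):
                t = Fraction(i, fps)      # when frame i finishes rendering
                if t % scanout != 0:      # flip doesn't line up with a refresh
                    tears += 1
            return tears

        for fps in (30, 60, 45, 50):
            print(f"{fps:2d} fps on a 60 Hz display -> {mid_scanout_flips(fps):2d} mid-scanout flips/sec")
        ```

        30 and 60 fps line up perfectly (zero mid-scanout flips); anything in between doesn’t, which is exactly the tearing case the quote describes.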

        • squaresinger@lemmy.world · 3 days ago

          Beware, what you are comparing there is vsync off with VRR.

          You have four options when it comes to screen refreshes (there’s a small sketch after this list putting numbers on the latency difference):

          • Vsync off, VRR off: you get frames as fast as possible, no latency, but also tearing
          • Vsync on: the frame rate gets synchronised with the screen refresh rate. That means sometimes the game will wait for the screen, leading to a lower frame rate (limited to the refresh rate of the screen) and slight latency, but no tearing
          • VRR: the game can lower (not raise) the refresh rate. Compared to Vsync at the maximum refresh rate, it will lower power consumption and do nothing else
          • Triple buffering. Needs to be implemented by the game, not by the OS. Provides maximum frame rate and no tearing with minimal latency.
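
          Here’s the sketch mentioned above: a minimal Python comparison of when a finished frame actually reaches the screen, assuming a 60 Hz panel (my own toy numbers, not anything from a spec):

          ```python
          from math import ceil

          REFRESH_HZ = 60
          SCANOUT_MS = 1000 / REFRESH_HZ      # ~16.7 ms between fixed scanouts

          def shown_at(ready_ms, vsync):
              """When a frame that finished rendering at ready_ms reaches the screen."""
              if not vsync:
                  # VRR (or vsync off, at the cost of tearing): flip right away
                  return ready_ms
              # fixed-rate vsync: wait for the next scanout boundary
              return ceil(ready_ms / SCANOUT_MS) * SCANOUT_MS

          for ready in (1.0, 9.0, 16.0):
              wait = shown_at(ready, vsync=True) - ready
              print(f"frame ready at {ready:4.1f} ms -> vsync holds it for {wait:5.2f} ms")
          ```

          The held time is at most one scanout (16.7 ms at 60 Hz), which is the “slight latency” in the Vsync bullet.
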
          • vividspecter@lemm.ee · 2 days ago

            Triple buffering. Needs to be implemented by the game, not by the OS. Provides maximum frame rate and no tearing with minimal latency.

            Vulkan mailbox mode is pretty much this and doesn’t require game support (it can be forced on with environment variables if it’s not already being used). And since almost everything is Vulkan on Linux these days, one way or another, that covers most games (there might be compatibility issues in rare cases).
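
            For what it’s worth, here’s what forcing it from the launcher side could look like. I’m assuming the Mesa override variable is MESA_VK_WSI_PRESENT_MODE (my recollection for RADV/ANV; check your Mesa version’s docs if it doesn’t take effect), and the game path is just a placeholder:

            ```python
            import os
            import subprocess

            # Assumption: on Mesa Vulkan drivers, MESA_VK_WSI_PRESENT_MODE overrides
            # the swapchain present mode (immediate / mailbox / relaxed / fifo);
            # "mailbox" is the triple-buffer-like mode discussed above.
            env = dict(os.environ, MESA_VK_WSI_PRESENT_MODE="mailbox")

            # "/path/to/game" is a placeholder binary, purely for illustration.
            subprocess.run(["/path/to/game"], env=env)
            ```

            In Steam the same idea should work as MESA_VK_WSI_PRESENT_MODE=mailbox %command% in the game’s launch options.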

        • tal@lemmy.today · 3 days ago

          Right. What I’m saying is that the benefit that VRR provides falls off as monitor refresh rate increases. From your link:

          If a game on console doesn’t deliver a new frame on time, two things can happen.

          The console can wait for a new TV frame, delaying display time by about 16.7 ms (VSYNC), which leads to an effect called stuttering and uneven frame pacing…

          If you have a 60 Hz display, the maximum amount of time that software can wait until a rendered frame goes to a static refresh rate screen is 1/60th of a second.

          But if you have a 240 Hz display, the maximum amount of time that software can wait until a rendered frame is sent to a static refresh rate screen is 1/240th of a second.
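
          Putting numbers on that (it’s just 1000 / Hz):

          ```python
          # Worst-case wait until the next scanout on a fixed-refresh display
          for hz in (60, 144, 240, 480):
              print(f"{hz:3d} Hz: up to {1000 / hz:5.2f} ms before the frame goes out")
          ```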

          OLED monitors have no meaningful physical constraint on refresh rate from the LED elements; that constraint traditionally comes from LCD elements (well, I mean, you could have higher rates, but the LCD elements can only respond so quickly). If the controller and the display protocol can handle it, an OLED monitor can basically display at whatever rate you want. So OLED monitors out there tend to support pretty good refresh rates.

          Looking at Amazon, everything on my first page of OLED monitor results is capable of 240 Hz or 480 Hz, except for one at 140 Hz.

          That doesn’t mean that there is zero latency, but it’s getting pretty small.

          Doesn’t mean that there isn’t value to VRR, just that it declines as the refresh rate rises.

          The reason I bring it up is that I’d been looking at OLED monitors recently myself, and the VRR brightness issues with current OLED display controllers were one of my main concerns (well, that and burn-in potential), and I’d decided that if I were going to get an OLED monitor before the display controller situation changes WRT VRR, I’d just run at a high static refresh rate.

          • vividspecter@lemm.ee · 2 days ago

            Just having VSync on can introduce frame pacing issues. It’s just not an issue if you can maintain the monitor refresh rate consistently, of course. And you can turn it off altogether if you can tolerate tearing.

            But that’s the main benefit of VRR for me: frame pacing at sub-monitor refresh rates, rather than latency reduction compared to the various types of VSync.
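
            To make the frame pacing point concrete, a toy Python model (my own numbers): a game rendering a steady 50 fps on a 60 Hz panel. Fixed-rate vsync snaps every frame to a 16.7 ms scanout boundary, so on-screen times come out uneven, while VRR would just show each frame for 20 ms.

            ```python
            from fractions import Fraction

            HZ, FPS, N = 60, 50, 8
            scanout = Fraction(1000, HZ)                    # ms between fixed scanouts

            # When each frame actually appears under fixed-rate vsync:
            shown = []
            for i in range(N + 1):
                ready = Fraction(1000 * i, FPS)             # steady 50 fps render times
                boundary = -(-ready // scanout) * scanout   # next scanout at/after ready
                shown.append(boundary)

            for a, b in zip(shown, shown[1:]):
                print(f"vsync: frame held on screen for {float(b - a):5.2f} ms (VRR: 20.00 ms)")
            ```

            Same average frame rate, but under vsync the on-screen times are an uneven mix of 16.7 ms and 33.3 ms, which is the stutter you feel below the monitor’s refresh rate.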