I was recently lucky enough to buy an OLED monitor and it’s great. What is not so great is the amount of flickering I get in Gnome now when I have the experimental VRR setting enabled.

Now all OLED monitors have a certain amount of VRR flicker, but I am comparing it to my Windows dual boot and it’s absolutely terrible under Gnome - a noticeable increase in the amount of flicker in both games and on the desktop versus Windows. The only way I get Windows to flicker as much on the desktop is if I turn on “dynamic refresh rate”, which kind of appears to be what Gnome is doing all the time. If I turn on the refresh rate panel on my monitor, Gnome fluctuates all over the place, even on the desktop, whereas Windows is steady at max refresh (again, once I turn off dynamic refresh rate, which is a separate setting from VRR).

For games the flicker is way worse running Proton under Wayland (which Proton GE supports). Hunt: Showdown - which I play a lot - looks incredibly flickery with vsync and Wayland turned on; it basically has a strobing effect.

Anyone else seen this in action? Any suggestions for a fix? Should I swap over to KDE for a bit until Gnome gets this straightened out or will Plasma have the same problems?

  • SnortsGarlicPowder@lemmy.zip · 1 point · 1 day ago

    I have an IPS panel on KDE. I find I sometimes get flicker if HDR is on. Possibly it’s that? I also had this issue in Windows if I remember right.

    VRR on its own causes me no issues.

  • vividspecter@lemm.ee · 3 points · edited · 2 days ago

    Plasma Wayland has an automatic mode which should at least turn off VRR during desktop usage, as long as no application window is fullscreen. A hacky workaround when VRR is active is to raise the minimum refresh frequency so the monitor kicks into LFC more readily, by creating a custom EDID (historically you could also do a sysfs edit, but I think that’s AMD only).

    That’s only really viable if you have a high refresh rate monitor with a large range. I’ve found that 120 Hz may not be enough (you can end up with gaps where VRR doesn’t work if the range is too narrow), and that’s of course the most common OLED TV refresh rate. In my experience a minimum of >=54 Hz minimises the flicker, but that may vary with the display.
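
    Rough sketch of the EDID route, in case it’s useful. The connector name, card number, and file names below are just examples to adapt to your setup, and the modified blob may also need to be included in your initramfs:

        # dump the current EDID
        cat /sys/class/drm/card0-DP-1/edid > edid.bin

        # raise the minimum vertical rate in the range-limits block with an
        # EDID editor (wxedid or similar), then save it as edid-mod.bin

        # make the blob visible to the kernel's firmware loader
        sudo mkdir -p /usr/lib/firmware/edid
        sudo cp edid-mod.bin /usr/lib/firmware/edid/

        # finally, add this to the kernel command line and reboot
        drm.edid_firmware=DP-1:edid/edid-mod.bin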

    There’s also an issue with cursors where moving the mouse can make the refresh jump to the maximum. It only affects desktop usage and some games (RTS and the like, not usually FPS camera usage). There are fixes coming for this with Plasma I believe, but I’m not sure about Gnome. Forcing a software cursor may help, as others have indicated.

    • million@lemmy.world (OP) · 1 point · 2 days ago

      Yeah, I was exploring KDE on a Fedora live image and guessed that’s what automatic VRR was doing. Setting it to always introduced more flicker, but it still seemed less than Gnome.
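
      For anyone else testing from a live image: I think you can also flip the policy from a terminal with kscreen-doctor (exact syntax and output name may differ, check the output list first):

          # list outputs and their current settings
          kscreen-doctor -o

          # per-output VRR policy: never / automatic / always
          kscreen-doctor output.DP-1.vrrpolicy.automatic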

    • million@lemmy.world (OP) · 1 point · 2 days ago

      It’s way worse if I run games under the experimental Wayland mode that you enable with GE.

      What distro are you using? I am on Bazzite

      • N.E.P.T.R@lemmy.blahaj.zone · 1 point · 2 days ago

        I am using OpenSUSE Tumbleweed. I’ll try enabling Wayland using the environment variable for a game when I get on later.
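
        If I have the variable right, for Proton GE it goes in the game’s Steam launch options, something like:

            PROTON_ENABLE_WAYLAND=1 %command%

        (Assuming a recent GE build; older ones may not support it.)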

  • ErableEreinte@lemmy.ca · 6 points · 3 days ago

    Yes, I’ve been bothered by VRR flicker on my OLED monitor (LG 27GR95QE) since I started actively gaming on it with my Linux build a couple of months ago; it was never an issue with consoles for me.
    I’m on KDE FWIW, and the flicker is more pronounced during games with the mouse cursor on screen, afaict. I can’t compare to Windows.
    I think VRR flicker is less of an issue when running games within a gamescope session, but it’s not ideal either.
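
    For reference, this is roughly how I launch things through gamescope; the resolution and refresh flags are just examples for a 1440p/240 Hz panel, adjust for your monitor:

        # in the game's Steam launch options
        gamescope -W 2560 -H 1440 -r 240 --adaptive-sync -f -- %command%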

    • Willem@kutsuya.dev · 4 points · 2 days ago

      If the issue is more prominent when the cursor is showing, it could be the hardware cursor (the default on KDE) causing it. With hardware cursors, the cursor is rendered on a separate ‘plane’ on top of everything else, which can possibly cause desync. You could try disabling it with an environment variable (I think it was KWIN_FORCE_SW_CURSOR=1), forcing the cursor to be software rendered.
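
      A low-effort way to test it session-wide (assuming I’ve remembered the variable name correctly) is to export it from a Plasma environment script and log out and back in:

          # ~/.config/plasma-workspace/env/sw-cursor.sh
          export KWIN_FORCE_SW_CURSOR=1

      Delete the file to go back to hardware cursors.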

    • million@lemmy.world (OP) · 3 points · edited · 2 days ago

      Kind of a bummer to hear - I was hoping KDE’s VRR implementation might avoid the issue. If it’s a Wayland problem, it would be unavoidable either way.

      Edit: did some testing with a live image tonight - at least on my machine, KDE seems much better when it comes to flicker.

  • juipeltje@lemmy.world · 2 points · 2 days ago

    The fact that it’s that noticeable, even in games, is unfortunate, and I don’t know what would fix it. In my case I have a VA panel that also flickers with VRR enabled, but I only notice it on the desktop. I only use window managers, so I have keybinds to turn VRR on and off, which solves it for me. On Windows it doesn’t flicker on the desktop though, so I’m assuming it does some stuff in the background where VRR only gets turned on when I boot up a game.
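
    In sway, for example, the toggle is just an output command, so a pair of keybinds like this does it; the output name is an example, check swaymsg -t get_outputs for yours:

        # in ~/.config/sway/config
        bindsym $mod+F11 output DP-1 adaptive_sync on
        bindsym $mod+F12 output DP-1 adaptive_sync off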

  • Mwa@thelemmy.club · 1 point · edited · 2 days ago

    VRR works fine for me (VA panel), except for one extension it conflicts with (and that’s easy to fix).

  • tal@lemmy.today · +3/-2 · edited · 3 days ago

    What is not so great is the amount of flickering I get in Gnome now when I have the experimental VRR setting enabled.

    The only way I get Windows to flicker as much on the desktop is if I turn on adaptive refresh rate, which kind of appears to be what Gnome is doing all the time.

    I don’t totally get what you’re trying to accomplish. If you don’t want VRR in the desktop environment, are you wanting VRR only to be active when a fullscreen game or movie player is running or something?

    EDIT: I’d also add that my understanding is that brightness fluctuation is kind of part and parcel with VRR on current OLED display controllers. I don’t think it’s a fundamental limitation (you could probably make a display controller that did a better job), but I’ve read articles comparing OLED monitors, and all of the ones I’ve read about suffer from this. If I got an OLED monitor today myself, I’d probably just set a high static refresh rate (which, fortunately, is something OLED does do well).

    • ErableEreinte@lemmy.ca · 4 points · 3 days ago

      Setting a high refresh rate is somewhat of a given, but it won’t negate the thing VRR helps with: screen tearing. If you’re always playing with VSync on and getting constant frame rates, that’s not an issue, but that’s also far from the usual experience.

      • tal@lemmy.today · 6 points · edited · 3 days ago

        Setting a high refresh rate is somewhat of a given, but won’t negate anything which VRR helps with - screen tearing.

        I mean, I’d just turn on vsync; that’s what it’s for. VRR is to let you push out a frame at the instant that it finishes rendering. The benefit of that declines as the monitor refresh rate rises, since there’s less delay until the next frame goes to the monitor.

        If you’re always playing with VSync on and getting constant frame rates, that’s not an issue

        looks blank

        Constant framerates? You’re saying that you get tearing with vsync on if whatever program you’re using can’t handle rendering at whatever the monitor’s refresh rate is? I mean, it shouldn’t.

        Running a static refresh rate with vsync will add a tiny bit of latency until the image shows up on the screen relative to VRR, but that’s a function of the refresh rate; that falls off as the refresh rate rises.

        • million@lemmy.world (OP) · +4/-1 · edited · 3 days ago

          https://www.reddit.com/r/XboxSeriesX/comments/t3fn6l/can_someone_explain_vrr_like_im_5_what_it_does/

          Ok, so let’s say your TV is a typical 60hz TV; that means it updates 60 times a second, regardless of the game’s frame rate. A 60fps game will be in perfect sync with your TV, as will 30fps, because each frame will just be displayed twice. When your game is running at a frame rate in between, it’s not in sync with the display any more and you end up with screen tearing, as the image being sent to the TV changes partway through the image being displayed.

          VRR stands for Variable Refresh Rate. It basically means the display’s refresh rate can vary to match the source of the image, so that it always stays in sync.

          This is a pretty good explanation of what VRR is doing. It basically makes it so you can drop frames and it still feels smooth.

          • squaresinger@lemmy.world · 2 points · 2 days ago

            Beware: what you are comparing is vsync off with VRR.

            You have four options when it comes to screen refreshes:

            • Vsync off, VRR off: you get frames as fast as possible, no latency, but also tearing
            • Vsync on: the frame rate gets synchronised with the screen refresh rate. That means sometimes the game will wait for the screen, leading to a lower frame rate (limited to the refresh rate of the screen) and slight latency, but no tearing
            • VRR: the refresh rate can be lowered (not raised) to match the game. Compared to vsync at the maximum refresh rate it lowers power consumption and does nothing else
            • Triple buffering. Needs to be implemented by the game, not by the OS. Provides maximum frame rate and no tearing with minimal latency.
            • vividspecter@lemm.ee · 2 points · 2 days ago

              Triple buffering. Needs to be implemented by the game, not by the OS. Provides maximum frame rate and no tearing with minimal latency.

              Vulkan mailbox mode is pretty much this and doesn’t require game support (it can be forced on with environment variables if it’s not already being used). And since almost everything is Vulkan on Linux these days, one way or another, that covers most games (there might be compatibility issues in rare cases).
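
              If I remember the Mesa variable right, it’s something like this in the launch options (that’s for Mesa drivers; others may differ):

                  MESA_VK_WSI_PRESENT_MODE=mailbox %command%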

          • tal@lemmy.today · 1 point · edited · 3 days ago

            Right. What I’m saying is that the benefit that VRR provides falls off as monitor refresh rate increases. From your link:

            If a game on console doesn’t deliver new frame on time, two things can happen.

            Console can wait for a new TV frame, delaying display time about 16.7 ms (VSYNC). Which leads to an effect called stuttering and uneven frame pacing…

            If you have a 60 Hz display, the maximum amount of time that software can wait until a rendered frame goes to a static refresh rate screen is 1/60th of a second.

            But if you have a 240 Hz display, the maximum amount of time that software can wait until a rendered frame is sent to a static refresh rate screen is 1/240th of a second.
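
            Concretely, the worst-case wait for the next scanout works out to:

                1/60 s  ≈ 16.7 ms
                1/240 s ≈  4.2 ms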

            OLED monitors have no meaningful physical constraint on refresh rate from the LED elements themselves; that constraint traditionally comes from LCD elements, which can only respond so quickly. If the controller and the display protocol can handle it, an OLED monitor can basically display at whatever rate you want. So OLED monitors out there tend to support pretty good refresh rates.

            Looking at Amazon, my first page of OLED monitor results shows all of them capable of 240 Hz or 480 Hz, except for one at 140 Hz.

            That doesn’t mean that there is zero latency, but it’s getting pretty small.

            Doesn’t mean that there isn’t value to VRR, just that it declines as the refresh rate rises.

            The reason I bring it up is that I’d been looking at OLED monitors recently myself, and the VRR brightness issues with current OLED display controllers were one of my main concerns (well, that and burn-in potential). I’d decided that if I were going to get an OLED monitor before the display controller situation changes WRT VRR, I’d just run at a high static refresh rate.

            • vividspecter@lemm.ee · 1 point · edited · 2 days ago

              Just having VSync on can introduce frame pacing issues. It’s just not an issue if you can maintain the monitor refresh rate consistently, of course. And you can turn it off altogether if you can tolerate tearing.

              But the main benefit of VRR for me is frame pacing at sub-monitor refresh rates, rather than latency reduction compared to the various types of VSync.

    • tal@lemmy.today · 1 point · 3 days ago

      It does sound like there’s a way to ask GNOME to use VRR only when fullscreen stuff is running:

      https://www.reddit.com/r/linux_gaming/comments/1brlzqj/mouse_cursor_stuttering_on_gnome_with_vrr_enabled/

      I prefer Gnome+Dash2Dock- there I have set it to do VRR only in fullscreen apps (aka games).

      The user doesn’t specify what they did to enable that setting, though, and I’m not familiar with GNOME’s (mutter’s?) Wayland settings. But if you are okay with VRR only in fullscreen apps, looking into that might address the issue.
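
      As a side note, if you want to A/B test quickly and I have the key right, the experimental toggle itself is a mutter gsettings key, so it can be flipped from a terminal (note the second command clears any other experimental features you had in that list):

          # enable VRR support
          gsettings set org.gnome.mutter experimental-features "['variable-refresh-rate']"

          # turn it back off
          gsettings set org.gnome.mutter experimental-features "[]"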

    • million@lemmy.world (OP) · 1 point · 3 days ago

      I had the name wrong initially - I just edited it to correct it - but under Windows “dynamic refresh rate” is distinct from VRR. The Settings description reads “To help save power, Windows adjusts the refresh rate up to the selected rate above”. See https://www.theverge.com/2021/6/29/22555295/microsoft-windows-11-dynamic-refresh-rate-laptops.

      I can turn it off and still have VRR enabled.

      Trust me when I say the amount of OLED flicker is much, much higher in Gnome than under Windows for the exact same games. Like, it gives you eye strain and a headache super fast. I still see a little flicker under Windows, but it’s not comparable.

  • KOhBaby@lemmy.world · +2/-3 · 2 days ago

    VRR implementations in Linux are all terrible. I’ve just turned VRR off permanently and consider it the cost of using Linux.

      • KOhBaby@lemmy.world · 2 points · 2 days ago

        You may have a monitor that’s good at handling massive fluctuations in refresh rate without flickering. I do not, and it seems like OP doesn’t either. Zero flickering under Windows; non-stop flickering in i3, Hyprland, and sway. I spent countless hours trying to debug it and eventually just gave up. Maybe devs have fixed it since then (it doesn’t seem like it from this post), but a year or two ago the flickering was terrible.

        • Mwa@thelemmy.club · 1 point · edited · 1 day ago

          I’m using the DisplayPort implementation of VRR (48-180 Hz range) with Response Time set to Faster and DAS Mode on, under GNOME. So maybe that’s why?