From improvements in the efficiency of OLED materials to software developments and new testing techniques, OLED burn-in risk has been lowered. OLED monitors are generally a more sound investment than ever—at least for the right person.

  • Wahots@pawb.social · edited · 10 months ago

    “If you’re a consumer planning to use an OLED monitor for gaming for two to three years, it’s a good choice. Beyond that, we don’t yet have enough real-world data to make a definitive judgment,” Karatsevidis said.

    I didn’t like the article that much, since it largely rests on the assumption that people replace their monitors every three years, which most won’t do.

    Most people won’t turn on any non-default settings to mitigate wear. They’ll run light mode, won’t turn down the brightness, won’t enable screen savers, and will leave Spotify open while the taskbar stays visible. After 5-8 years of that, the panel will probably have uneven wear, making it more likely to go to a landfill than to be sold secondhand for a new lease on life.

  • vzq@lemmy.blahaj.zone · 10 months ago

    I’m still using a monitor from 2010 on a daily basis. This consumerist throwaway bullshit can go crawl back to the 20th century and die.

    • mild_deviation@programming.dev · 10 months ago

      CCFL-backlit LCDs are so inefficient compared to modern LED-backlit LCDs that you’ve probably already spent enough extra on electricity to have paid for a more efficient monitor.

      I can’t speak to the environmental impact, though. Producing the new monitor emitted some amount of CO2, and powering each monitor emits some amount of CO2 per hour of use. Past some amount of use, the newer monitor ends up with lower lifetime CO2 than keeping the old one running.
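
      For the shape of that break-even, here’s a rough back-of-the-envelope; every number in it (embodied CO2, wall power of each panel, grid carbon intensity) is an assumption for illustration, not a measurement:

      # Every figure here is an illustrative assumption, not a measurement.
      EMBODIED_CO2_NEW_KG = 350.0    # manufacturing footprint of the new monitor (kg CO2)
      POWER_OLD_W = 60.0             # CCFL-backlit LCD at the wall (assumed)
      POWER_NEW_W = 25.0             # LED-backlit replacement at the wall (assumed)
      GRID_KG_CO2_PER_KWH = 0.4      # grid carbon intensity (assumed)

      # CO2 avoided per hour of use by running the more efficient panel.
      saved_per_hour_kg = (POWER_OLD_W - POWER_NEW_W) / 1000.0 * GRID_KG_CO2_PER_KWH

      # Hours of use before the new monitor's manufacturing CO2 is "paid back".
      breakeven_hours = EMBODIED_CO2_NEW_KG / saved_per_hour_kg
      print(f"break-even after ~{breakeven_hours:,.0f} hours "
            f"(~{breakeven_hours / (8 * 365):.1f} years at 8 h/day)")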

      • turmacar@lemmy.world · 10 months ago

        Not OP, but my electricity is <$0.10/kWh because of where I live. It seems like it would take much more than 13 years to hit the break-even point on upgrading the monitor just from energy efficiency.
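
        Rough numbers, if anyone wants to sanity-check that: only the ~$0.10/kWh rate above is real; the wattages, hours, and monitor price are assumptions.

        # Only the ~$0.10/kWh rate comes from the comment above; everything else is assumed.
        PRICE_PER_KWH = 0.10     # $/kWh
        POWER_OLD_W = 60.0       # old CCFL-backlit LCD (assumed)
        POWER_NEW_W = 25.0       # newer LED-backlit LCD (assumed)
        MONITOR_COST = 150.0     # price of the replacement monitor in $ (assumed)
        HOURS_PER_DAY = 8

        kwh_saved_per_year = (POWER_OLD_W - POWER_NEW_W) / 1000.0 * HOURS_PER_DAY * 365
        dollars_saved_per_year = kwh_saved_per_year * PRICE_PER_KWH
        payback_years = MONITOR_COST / dollars_saved_per_year
        print(f"~${dollars_saved_per_year:.2f}/year saved -> ~{payback_years:.0f} years to pay for itself")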

        Even if the newer monitor has a lower lifetime environmental impact, throwing out the old, still-working one is wasteful. It’s already made and working, and using it longer lessens your environmental impact. If you repair the old one when it eventually breaks, that’s still less of an impact than an extra ~20% electricity usage, especially since electricity generation is getting greener all the time.

  • Send_me_nude_girls@feddit.de · edited · 10 months ago

    I’m not going to change my habits for a monitor. Hiding the taskbar is annoying, since Windows randomly has a habit of not showing it again.

    There will also be static elements on it for 16+ hours a day on weekends, and 8 to 13 hours on weekdays. Some buttons are bright, some orange.

    Brightness can’t be lowered much, since I don’t have many options to mitigate the sun unless I fully cover the window (bright reflections off neighboring houses at different times of day, plus direct sun, mirrors on the walls, etc.).

    What if I do a 48h gaming session? Can I throw it in the trash afterwards?

    • deur@feddit.nl · 10 months ago

      You could try adapting your gaming sessions to include short breaks to help prevent injury, and maybe grab a snack. 10-minute breaks every hour (or every few hours :) ) where you turn the monitor off may help?

    • milkjug@lemmy.wildfyre.dev · 10 months ago

      Same, it’s the biggest annoyance putting me off an OLED at the moment. I don’t like the idea of having to baby my things and fret over small, meaningless details, treating them with kid gloves.

      That, and because DP 2.1 still isn’t a thing in 2023, and only God knows why.

  • LoafyLemon@kbin.social · 10 months ago

    My 2009 LCD panel still works perfectly and has been repurposed as a dining room TV. While it may not excel in reproducing black levels, it continues to function just as it did when I first purchased it. I am not going to bother with OLED if it means having to replace the screen every 2-3 years.

  • cmnybo@discuss.tchncs.de · 10 months ago

    Burn-in will always be a problem; you can’t get rid of it. Sure, there are ways to minimize it, and monitors can try to hide it, but eventually you will have a taskbar, window borders, and desktop icons burned into the screen.

    • Encrypt-Keeper@lemmy.world · 10 months ago

      That’s true, but at the same time LED-backlit TVs have a huge problem with blooming, which is essentially a lottery because most manufacturers don’t consider it an actual defect and won’t replace the panel.

    • narc0tic_bird@lemm.ee · 10 months ago

      It’s in the “nature” of OLED that it eventually wears down. My understanding is that technically it’s not burning in but burning out, and what’s perceived as burn-in is uneven wear across the different color channels, or differing brightness of individual pixels (especially with HDR content).
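
      As a toy illustration of that (wear numbers made up): once the channels have aged unevenly, a nominally white area comes out tinted next to fresher pixels, which is what reads as burn-in.

      # Made-up per-channel luminance remaining after uneven wear (blue typically ages fastest).
      wear_remaining = {"red": 0.97, "green": 0.95, "blue": 0.90}

      requested_white = {"red": 1.0, "green": 1.0, "blue": 1.0}
      displayed = {ch: requested_white[ch] * wear_remaining[ch] for ch in requested_white}
      print(displayed)   # channels no longer match -> the worn patch looks tinted / ghosted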

  • revoopy@programming.dev · 10 months ago

    I only read the headline, but isn’t part of it WOLED? Using dedicated white subpixels reduces the workload of the other subpixels.
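
    Roughly, yes; as a toy illustration with invented numbers, for bright, low-saturation content the white subpixel can carry most of the luminance, so the colored subpixels are driven far less hard:

    # Toy WRGB luminance split (invented numbers, not a real panel algorithm).
    target_nits = 300.0       # what the pixel should output for near-white content
    white_share = 0.7         # assumed fraction carried by the dedicated white subpixel

    white_nits = target_nits * white_share
    rgb_nits_total = target_nits * (1 - white_share)   # left for the R/G/B subpixels together
    print(f"white subpixel: {white_nits:.0f} nits, R/G/B combined: {rgb_nits_total:.0f} nits")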

  • AutoTL;DR@lemmings.world (bot) · 10 months ago

    This is the best summary I could come up with:


    People tend to display static images on computer monitors more frequently than on TVs—things like icons, taskbars, and browser address bars—making burn-in risk a concern.

    “Industry chatter,” Dough co-founder Konstantinos Karatsevidis told me, showed that burn-in affected “around 5 percent of users” after two years.

    The latest models have improved materials and firmware that make them significantly more resistant to burn-in than they were years ago.

    Roland Wooster, chair of VESA’s Display Performance Metrics Task Group, told me that physical design changes have also helped.

    By counting how long each subpixel is lit and at what brightness, a “wear level” can be determined for each pixel; an algorithm then estimates the luminance degradation so that it can be compensated for.
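
    A minimal sketch of what that kind of wear tracking and compensation could look like; the linear decay model and constants below are invented for illustration, and actual monitor firmware is proprietary:

    # Sketch of per-subpixel wear tracking and compensation (invented constants).
    WEAR_PER_NIT_SECOND = 1e-11   # assumed: fractional luminance loss per nit-second of drive

    class SubpixelWear:
        def __init__(self):
            self.nit_seconds = 0.0   # accumulated brightness-weighted on-time

        def accumulate(self, brightness_nits, seconds):
            # Count how long the subpixel was lit and at what brightness.
            self.nit_seconds += brightness_nits * seconds

        def estimated_loss(self):
            # Estimated fraction of luminance lost (toy linear model, capped).
            return min(self.nit_seconds * WEAR_PER_NIT_SECOND, 0.3)

        def compensated_drive(self, requested, max_drive=1.0):
            # Drive a worn subpixel a bit harder so its output still matches the request.
            return min(requested / (1.0 - self.estimated_loss()), max_drive)

    # A subpixel that sat under a bright, static taskbar icon for roughly a year:
    px = SubpixelWear()
    px.accumulate(brightness_nits=200, seconds=8 * 3600 * 365)
    print(px.estimated_loss(), px.compensated_drive(0.5))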

    The companies that make monitors can implement a range of firmware, software, and hardware techniques to help fight burn-in.


    The original article contains 656 words, the summary contains 138 words. Saved 79%. I’m a bot and I’m open source!