After using a Samsung QLED TV for a couple of years, I started really noticing the downgrade whenever I used my computer monitor. I haven't been able to convince myself to pay for any of the current OLED monitor offerings, but I'm eagerly awaiting a reasonable desktop-sized offering.
> I feel like the person talking about how much they like their PlayStation on an Xbox article, but I'm much more looking forward to Micro-QLED displays than I am OLED displays. I've never seen an OLED display that matched (high end) QLED's vibrancy, and the only downside to QLEDs is that they still use large sector back-lighting.

Depends which kind of OLED. LG WOLED has somewhat dirty whites and struggles to reach full-screen brightness. The JDI RGB OLEDs and SDI QD-OLED are quite good and very vibrant. The main issues with high-density mini-LED displays are cost and complexity.
> OLED is nice, but we still don't even have much more basic things:
>
> We're more than a decade into the era of high-DPI PC displays and there are still almost no high-DPI options in the whole market with any panel technology.
>
> Concretely, useful high-DPI at desktop display viewing distances generally means ~220 PPI or greater. This has long been common in laptop displays and high-quality all-in-ones like iMacs, but the number of standalone displays available in 2021 was 2 and has recently grown to 3 (all Mac-oriented, single Thunderbolt 3 input only, painfully priced from $1300 to $6000).

It is rather unfortunate that Apple is the only company that pushes the envelope regarding displays. We should have tons of 5K 27-inch PC monitors out there, but there just aren't.
> Would pixel-shifting schemes not be a lot more effective if a manufacturer built, say, a 2600x1480 "raw" panel that presented itself to the host as 2560x1440? Having 40 pixels of margin to work with to gradually shift the image around would do a lot to soften any burn-in from static elements, which, combined with other techniques (dark themes, dimming, short power-off timers), would do a lot to alleviate burn-in.
>
> Especially with OLED, you wouldn't even get any distracting "glow" from the extra margins; if you paid close attention the image might look a bit off-center in the housing as it shifts around, but that's a price I'd gladly pay to reduce burn-in.

It helps, but it's no panacea. If you look at the LG B7/C7 and earlier-generation TVs, they gradually develop a huge green blob in the middle of the screen as the red emitter degrades, because the center of the screen shows skin tones more often. Even spread out, the gradual differential aging becomes visible.
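The proposed scheme can be sketched in a few lines. This is purely hypothetical (the margin size and drift periods are my own illustrative numbers, not from any shipping product): the panel keeps a hidden 40-pixel margin per axis and slowly drifts the image origin so static elements age the emitters more evenly.

```python
import math

# Hypothetical sketch: a 2600x1480 panel drives a 2560x1440 image and
# slowly drifts the image origin within the hidden 40-pixel margin.
MARGIN = 40  # spare pixels per axis

def shift_offset(hours_on):
    """Image origin inside the hidden margin after `hours_on` hours of use,
    drifting on a slow Lissajous-style path; the two incommensurate
    periods (24h and 17h) make the origin sweep the whole margin."""
    x = (MARGIN / 2) * (1 + math.sin(2 * math.pi * hours_on / 24))
    y = (MARGIN / 2) * (1 + math.sin(2 * math.pi * hours_on / 17))
    return round(x), round(y)

# Over time, any static pixel is smeared across a ~40x40 block of
# physical emitters instead of burning in on a single one.
for h in (0, 6, 12, 18):
    print(h, shift_offset(h))
```

The drift is far too slow to notice frame to frame, which is what makes the slight off-center look the only visible cost.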
> We're more than a decade into the era of high-DPI PC displays and there are still almost no high-DPI options in the whole market with any panel technology.

I imagine that Windows still not supporting high-DPI displays well is part of the issue. I regret getting a 4K laptop screen every time a menu comes up illegibly small or inexplicably blurry.
We really need an order of magnitude improvement in the emitter aging before OLED monitors can go mainstream.
> At this point it looks like emitter aging has only improved a small increment in the last decade, so I think order-of-magnitude improvements are a pipe dream. Most likely burn-in will always be a fact of life with OLED.

There are a number of blue emitters on the horizon within the next 2-5 years that will more than triple efficiency, so I wouldn't count out one order of magnitude.
I don't care about winning over the mainstream from LCD buyers. Get the price of that new $999 LG down to about half that, and IMO there will be plenty of buyers, myself included, willing to live with the burn-in risk (and mitigate it) for the superlative image-quality improvement.
> Would pixel-shifting schemes not be a lot more effective if a manufacturer built say, a 2600x1480 "raw" panel that presented itself to the host as 2560x1440?

One problem with this is that +/- 20px from the center of an icon is likely to be the same color; not that there is any single magic-bullet solution. (I'm mad now imagining a bunch of pixels I'm not allowed to use.)
> I imagine that Windows still not supporting high DPI displays well is part of the issue. I regret getting a 4k laptop screen every time a menu comes up illegibly small or inexplicably blurry.

Windows supports high-DPI screens just fine; it's crummy developers that don't play nice with Windows' scaling. It's painful that so many games don't take scaling into account, given how rare it has become for titles to have an actual exclusive fullscreen mode. Fullscreen these days typically means borderless windowed mode, since all the features that used to require exclusive fullscreen now work in a borderless window too.
> OLED will end up being replaced with microLEDs fairly quickly in the desktop monitor segment, since people tend to keep their monitors for a long time and the risk of burn-in is still a downside that a lot of people will not overlook. Brightness is the other, but at least that can be controlled somewhat.

There are no MicroLED monitors, and there won't be for a VERY long time. Check back in 10 years.
...or push refresh rates that require serious GPUs on one hand.
> People tend to view bigger screens from a further distance, so less PPI is needed.
>
> 220 PPI is "retina" at about a 16" viewing distance, which may be reasonable when using a small laptop screen. But a large desktop screen is more likely to be viewed at a 24"+ viewing distance, giving a 32" 4K screen at 138 PPI the same effective density.
>
> Apple aims for 5K screens because of a quirk in their scaling. Their high-DPI mode doubles everything in size, making that 4K 32" screen display only as much screen real estate as a 1080p screen. It's not a PPI issue, it's an awkward Apple scaling issue.

Yes and no. macOS is perfectly capable of rendering the desktop at a higher 2x resolution and then downscaling to whatever actual resolution your screen has. But yes, it does look marginally better without the downscaling.
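The arithmetic behind these density figures can be checked with the usual rule of thumb that "retina" means roughly one pixel per arcminute (60 pixels per degree of visual angle); that assumption, not any vendor spec, is what the numbers below rest on.

```python
import math

def retina_ppi(viewing_distance_in):
    """PPI at which one pixel subtends 1/60 of a degree (one arcminute)
    at the given viewing distance in inches."""
    one_arcmin = math.radians(1 / 60)
    return 1 / (viewing_distance_in * math.tan(one_arcmin))

print(round(retina_ppi(16)))  # 215 -- close to the ~220 PPI laptop figure
print(round(retina_ppi(24)))  # 143 -- close to a 32" 4K panel's 138 PPI
```

So both camps are internally consistent: the disagreement is really about viewing distance, not about the math.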
QLED isn't OLED. That's Samsung's deliberate marketing trick to confuse consumers.
> I feel like the person talking about how much they like their PlayStation on an Xbox article, but I'm much more looking forward to Micro-QLED displays than I am OLED displays.

I'm surprised this wasn't mentioned in the article (I guess because it's a "gaming" monitor), but Samsung just launched a monitor and a couple of TVs with OLED backlighting and quantum-dot filters for the colors. This lets the OLED panel use larger, brighter pixels as backlighting, and lets the dots handle the colors.

This means the displays can be brighter than normal OLEDs while also having better color gamuts. It even costs less to produce.
I have a 48" LG CX that I've used as my monitor since March 2021 — a little more than 18 months. If/When it dies, I'll replace it with whatever the current-at-that-time model is.
I was concerned about burn-in, and was very careful for about the first six months. Since then, I haven't really given it a thought. At this moment, it's got 7,037 hours of usage... and not a hint of burn-in. I work from home as a video editor and UX designer... and portions of the screen are static for extremely long periods of time. I also play World of Warcraft a lot (yes, I'm a nerd) which also has a lot of static UI elements. So, I'm averaging more than 11 hours of use per day... and it's been absolutely perfect.
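Those usage figures hold up to a quick sanity check (assuming roughly 18.5 months between March 2021 and the time of writing; the day count is my approximation):

```python
# Sanity-checking the usage claim above: 7,037 hours since March 2021.
hours = 7037
months = 18.5
days = months * 30.44          # average days per calendar month
print(round(hours / days, 1))  # ~12.5 hours of use per day
```

That works out to about 12.5 hours per day, comfortably above the claimed 11.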
> Samsung just launched a monitor and a couple of TVs with OLED backlighting and quantum dot filters for the colors.

Isn't that the QD-OLED that's been mentioned in the article?
> I imagine that Windows still not supporting high DPI displays well is part of the issue.

It's better nowadays; most games will scale their UIs, or at least offer an in-game scaling option. I'm running at 4K @ 150% DPI and haven't had too many problems, though some older stuff I do have to drop to 1080p because it doesn't scale at all.
> It's better nowadays; most games will scale their UIs, or at least offer an in-game scaling option.

Unfortunately, instead of games I have the pleasure of working with Enterprise Grade Software.
> We're more than a decade into the era of high-DPI PC displays and there are still almost no high-DPI options in the whole market with any panel technology.

A 3:2 high-PPI display is even more of a unicorn. I'd settle for a slightly lower PPI that makes text look seamless. I can't stand seeing pixels on text any more.
As someone who just today worked on an iMac with the dock burned into its LCD display, I can say I have zero interest in an OLED computer monitor, regardless of how good it looks.
Not sure if it would ever have scaled down to computer-monitor size, but I still long for the alternate reality where SEDs actually made it to market. It seems like computer-monitor-sized micro-LED displays are almost as much of a pipe dream at this point.
> QLED isn't OLED. That's Samsung's deliberate marketing trick to confuse consumers.

What? Next you'll tell me my 0LED isn't the real thing!
> It's probably because I run them fairly dim, being very sensitive to brightness, but no, I've never had burn-in. My first OLED years ago had over 60k hours on it when I broke the screen: zero burn-in, no green blobs, or any other problems.

Yes, it's very much related to brightness. Not everyone has it severely or notices it, either.
> I imagine that Windows still not supporting high DPI displays well is part of the issue.

I use Windows on a 5K screen daily and would never want to go back to chunky pixels. Yes, there are some glitches, but most apps work fine and text is razor sharp. (And I'm still on Win10.)
> People tend to view bigger screens from a further distance, so less PPI is needed.

I always thought 5K was targeted because Apple wants you to be able to edit 4K content and have room for tools around the video. At least, that's the reason I want a 5K display.