Both Windows and Linux (Wayland) support scaling the UI itself, and their support for sub-pixel anti-aliasing (which macOS lacks) makes text look a lot more crisp.
Meanwhile, on Linux, scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application with a tiny mouse cursor.
And then Windows has serious problems with old apps: they're blurry as hell on a high-DPI display.
Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work on OLED anyway, because the subpixels are arranged differently than on a conventional LCD.
[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.
Fonts on Linux (KDE Plasma on Wayland) look noticeably sharper than the Mac. I don't use subpixel rendering either. I hate that I have to use the Mac for work.
This would have been easily solved with non-integer scaling, had Apple implemented it.
(I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
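As a sanity check of that angular-resolution claim, the arithmetic is easy to sketch (1.75 m is my assumed midpoint of the stated 1.5-2 m range; the exact 115 ppd figure depends on the precise distance):

```python
import math

def pixels_per_degree(diag_in, h_res, aspect_w, aspect_h, dist_m):
    """Pixels subtended by one degree of visual angle at the screen centre."""
    width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)
    ppi = h_res / width_in
    dist_in = dist_m * 100 / 2.54
    return ppi * dist_in * math.tan(math.radians(1))

tv = pixels_per_degree(48, 3840, 16, 9, 1.75)   # 48" 4K TV at ~1.75 m
mon = pixels_per_degree(27, 3840, 16, 9, 1.0)   # 27" 4K monitor at 1 m
print(f"TV: {tv:.0f} ppd, monitor: {mon:.0f} ppd")
```

Both work out to roughly 110 pixels per degree, so the "similar angular resolution" observation holds up.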
Some indie Mac developers even started implementing support for it in anticipation of it being officially enabled. The code was present in 10.4 through 10.6 and possibly later, although not enabled by default. Sadly, Apple gave up on the idea, and integer scaling is where we ended up.
Here’s a developer blog from 2006 playing with it:
> https://redsweater.com/blog/223/resolution-independent-fever
There was even documentation for getting ready to support resolution independence on Apple’s developer portal at one stage, but I sadly can’t find it today.
Here’s a news post from all the way back in 2004 discussing the then-in-development feature in Mac OS X Tiger:
> https://forums.appleinsider.com/discussion/45544/mac-os-x-ti...
Lots of folks (myself included!) in the Mac software world were really excited for it back then. It would have let you scale the UI to totally arbitrary sizes while maintaining sharpness.
> (I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
The TV is likely a healthier distance to keep your eyes focused on all day regardless, but were glasses not an option?
Once you get used to flicking in and out of zoom instead of leaning into the monitor it’s great.
As an aside, Windows and Linux share this property too nowadays. Using the screen magnifiers is equally pleasant on any of these OSes. I game on Linux these days and the magnifier there even works within games.
I use a Mac with a monitor with these specs (a Dell of some kind, I don't know the model number off the top of my head), at 150% scaling, and it's not blurry at all.
4K is not enough pixel density at 27" for Retina scaling.
Apple uses 5K panels in their 27" displays for this reason.
There are several very good 27" 5K monitors on the market now around $700 to $800. Not as cheap as the 4K monitors but you have to pay for the pixel density.
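The density arithmetic behind the 4K-vs-5K distinction, as a quick sketch (27" 16:9 assumed):

```python
import math

DIAG, AW, AH = 27, 16, 9
width_in = DIAG * AW / math.hypot(AW, AH)   # ~23.5" wide panel

for name, (w, h) in {"4K": (3840, 2160), "5K": (5120, 2880)}.items():
    ppi = w / width_in
    # macOS "Retina" rendering draws everything at 2x, so each UI point maps
    # to a 2x2 pixel block and the effective workspace is half the native
    # resolution in each dimension.
    print(f"{name}: {ppi:.0f} ppi, looks like {w // 2}x{h // 2}")
```

4K at 27" is about 163 ppi and yields only a 1920x1080 workspace at clean 2x; 5K gives the Retina-class ~218 ppi and the 2560x1440 workspace. Fractional scaling on a 4K panel works, but macOS renders at 5K internally and downsamples, hence the slight blur people report.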
There are also driver boards that let you convert 27" 5K iMacs into external monitors. I don't recommend this lightly because it's not an easy mod but it's within reason for the motivated Hacker News audience.
I removed all the computing hardware but kept the Apple power supply, instead of using the cheapo one that came with the LCD driver board I bought. I was able to find the PWM specs for the panel, and installed a cheap PWM module with its own frequency & duty-cycle display to drive it and control brightness.
The result is my daily desktop monitor. Spent way too much time on it, but it works great!
Sadly, it basically never happened. There was the LG display that came out a couple of years later; it didn't have great reviews, and it cost about two thirds as much as an entire 5K iMac.
It took Apple over 7 years to release their standalone 5k display, and there are a few other true 5k displays (1440p screen real estate with quadruple-resolution, not the ultrawide 2160p displays branded as "5k") on the market now with prices just starting to drop below 1,000 USD.
Unfortunately, in that time I've gotten used to the screen real estate of the ultrawide 1440p monitors (which are now ubiquitous, and hitting ridiculous sub-$300 prices). As of now, my perfect display for office work (gaming, video/photo work, or heavy media playback are different topics) would be 21:9 with 1440p-equivalent screen real estate at quadruple resolution: essentially just a wider version of that original 5K iMac display.
Only thing that holds back that thought lately is, I'm suddenly spending more and more time in multi-pane terminals, and my screen real estate needs have dropped. The only two things I greatly miss now on my laptop are keyboard quality and general comfort (monitor height, etc.).
A decade later, it boggles my mind that it's so hard to find a retina-class desktop monitor. The successor to the Cinema Display is basically an iMac, and priced like it. There have very recently been releases from ASUS and BenQ, but it still feels like an underserved niche, rather than standard expectation.
All that is to say: hard cosign.
Why can't it be something simple?
Because monitors aren't simple. There are dozens of axes along which they can be scaled.
They vary in:

- resolution (1080p FHD, 1440p QHD, 4K, 5K, 6K, 8K)
- aspect ratio (16:9, 8:5, 4:3, 3:2, 21:9, 32:9)
- refresh rate (60 Hz, 75 Hz, 120 Hz, 144 Hz, 165 Hz, 240 Hz, 360 Hz, 480 Hz, 1 kHz, plus adaptive refresh tech such as G-Sync)
- colour quality (depth and accuracy)
- contrast ratio and HDR capability
- panel technology (LCD-TN, LCD-IPS, LCD-VA, OLED, QD-OLED, WOLED, and now RGB-stripe OLED)
- backlight technology (CCFL, edge-lit LED, miniLED, microLED)
- connectivity (HDMI/DP, USB-B, USB-C, DP alt mode, Thunderbolt, 3.5 mm, KVMs)
It's very hard to stuff all this information in one neat model number.
On the consumer's part it makes sense to understand these features and what is necessary for one's use case, filter monitors by said features, and note down the model numbers that satisfy the requirements.
There are only a handful of flat-panel manufacturers worldwide: AU Optronics, Innolux, LG Display, Samsung Display, Sharp, and more recently BOE. Apple has to use one of these, even for its bespoke, notched, curved iPhone/iPad displays.
This new 5K 2304-zone panel was developed by LG Display, and is not 'generic white-labelled slop' by any means. It is an extremely good panel in its own right, probably the bleeding edge of LCD technology today, achieving top-notch responsiveness, contrast, and colour depth and accuracy.
That MSI monitor will probably retail for ~£800 as will the Asus and LG equivalents, which is not a trivial amount for a monitor. Apple just marked it up 3×, as they are prone to do for anything.
Back in the day (~15 years ago), when 4K monitors were unheard of and even Apple's high-end displays were still 1440p, you could get a bottom-dollar monitor using one of their panels (e.g. Yamakasi Catleap Q270) for about a third of the price. However, it came with no amenities, a single connector (dual-link DVI only), a questionably legal power cable, and no built-in scaling. The vendors, presumably to prevent refunds, even asked for your graphics card model before selling it to you, because it wouldn't work with low-end cards. Oh, and there were very few in the U.S., so you were typically getting them shipped straight from abroad, customs duties and all.
We've definitely come a long way.
After a few years, the "cheap ones" have usually caught up, if you're willing to do the research.
Edge use case, I know.
For me, though, I'm frequently working in different rooms with arbitrary lighting situations. The net effect of the gloss is unquestionably negative for me.
I used to daily drive an apple thunderbolt display (the last non-retina one, 2560x1440). That thing was atrocious. I could often see the reflections of my glasses, or a white glare if I was wearing a white shirt. Even at night, in a dark office (lights off, just whatever light came in from the street).
I'm typing this on a matte "ips black" dell ultrasharp something-or-other at 10% brightness, wearing glasses, a white t-shirt, with an overhead light, and see no reflection or glare on my screen. There's no way in hell I'd go back to a shiny screen.
I understand "anti-glare" technology has improved. The most recent apple screen I've tested is an m1 mbp. It seems somewhat better than my 2013 mbp, but still a worse experience than my 2015 (or thereabouts) 24"@4k dell, which is pretty old technology. My 2025 lenovo has a screen that's much more comfortable to use inside.
Paradoxically, I'd say the one environment where I prefer my macs to my matte screens is in bright sunlight. Sure, there are more reflections than you can shake a stick at, but there's always an angle where you can see the part of the screen you want. You have to move around, which is obviously annoying, but you can see. The matte screens just turn to mush. Luckily for me, I hate being out in the sun, so I never encounter this situation in practice.
I think the "frost" you're talking about depends a lot on the screen implementation. I once tested an HP model, 27"@4k, and it did have such an effect. Anecdotally, it didn't handle reflections all that well either. So maybe it's just a question of a lower-quality product?
OLED smartphones have much higher ppi to deal with this.
https://www.tomshardware.com/monitors/lg-display-reveals-wor...
Not anymore, as long as you make sure that any RGB antialiasing is turned off. Linux defaults to disabling this and doing only grayscale antialiasing, so it looks great on an OLED out of the box. Windows can be configured to do the same.
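On most Linux desktops this is controlled via fontconfig. A minimal sketch of forcing grayscale antialiasing in `~/.config/fontconfig/fonts.conf` (assuming your distro honours per-user fontconfig, which most do) looks like:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Disable subpixel (RGB) rendering; keep grayscale antialiasing -->
  <match target="font">
    <edit name="rgba" mode="assign"><const>none</const></edit>
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
  </match>
</fontconfig>
```

Desktop environments like KDE and GNOME expose the same `rgba` setting in their font settings UIs, so editing the file by hand is usually only needed on minimal setups.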
I got a deal on a used one last year and I love it. It's the only monitor I've used plugged into a MacBook that didn't look notably off (worse) compared to the MacBook's display sitting next to it. Only thing a bit jarring is it's 60Hz but I can live with it.
Asus has picked up the 5K 27" panel from LG; it's the $730 PA27JCV.
I presume you also mean "when used for text heavy work" here, yes? Or do you mean that these displays tire out your eyes even when used "for what they're for", i.e. gaming? (Because that's a very interesting assertion if so, and I'd like to go into depth about it.)
I have an ASUS ProArt Display 27” 5K. And I somewhat regret it.
I love the pixel density. But I don’t love the matte finish. Which is apparently a controversial take. But I really don’t. I like the crisp pop of typography you get with a glossy display. And, for UI design, the matte finish just doesn’t “feel” like the average end-user experience. I am constantly pushing Figma between my laptop display and my monitor to better simulate what a design will look like on an average glossy LCD or OLED display.
I constantly see people saying Apple displays are a terrible value. The last Apple display I had was the Thunderbolt 27, but from now on I'm sticking with Apple.
I've had nothing but issues with non-Apple monitors as well. Customer service ime is non-existent if you need a repair. For something I rely on to get work done, I'm starting to think the premium is worth it.
And somehow they completely forgot how to seamlessly work with displays in general. Connect multiple displays via Thunderbolt? Nope. Keep layouts when switching displays? No. Running any display at more than 60Hz? No. Remember monitor positions? No.
There are even 240Hz displays.
IIRC Apple couldn't get above 60Hz even on third-party displays they explicitly advertised.
Make sure your dock, dongle, and/or cables aren’t bottlenecks.
I've switched docks, dongles, cables, to no avail.
Support also varies a lot between M chips, and Thunderbolt often doesn't support high refresh rates https://support.apple.com/en-us/101571
I can't remember now the actual setup I had, sadly
Both of my LG ultrawides work at 144Hz?
At the time, people were "marveling" at the magic of Apple, and wondering how they did the math to make that display work within bandwidth constraints.
The simple answer was "by completely fucking with DP 1.4 DSC".
I had at the time a 2019 (cheesegrater) Mac Pro. I had two Asus 27" 4K HDR 144Hz monitors, that the Mac had no problems driving under Catalina.
Install Big Sur? Nope. With the monitors advertising DP 1.4, my options were SDR@95Hz or HDR@60Hz. And I wasn't the only one: hundreds of people were complaining, with different monitors, cards, and cables.
I could downgrade to Catalina: HDR@144Hz sprung back to life.
Hell, I could even tell the monitors to advertise DP 1.2 support, which actually improved performance: I think I got SDR@120Hz and HDR@95Hz (IIRC).
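For context, the back-of-the-envelope DP 1.4 bandwidth math lines up with those fallback modes (this ignores blanking overhead, which adds a few percent; 30 bpp assumed for HDR10, 24 bpp for SDR):

```python
# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, 8b/10b encoding -> 80% usable payload
dp14_payload = 4 * 8.1e9 * 0.8            # 25.92 Gbit/s

def data_rate(w, h, hz, bpp):
    """Raw video data rate in bit/s (no blanking, no DSC)."""
    return w * h * hz * bpp

hdr_144 = data_rate(3840, 2160, 144, 30)  # ~35.8 Gbit/s -> needs DSC
hdr_60  = data_rate(3840, 2160, 60, 30)   # ~14.9 Gbit/s -> fits
sdr_95  = data_rate(3840, 2160, 95, 24)   # ~18.9 Gbit/s -> fits

for name, rate in [("HDR@144", hdr_144), ("HDR@60", hdr_60), ("SDR@95", sdr_95)]:
    fits = rate <= dp14_payload
    print(f"{name}: {rate / 1e9:.1f} Gbit/s, fits without DSC: {fits}")
```

Without working DSC, 4K HDR at 144 Hz simply doesn't fit in a DP 1.4 link, while HDR@60 and SDR@95 do, which is exactly the degradation described above.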
So you don't deserve downvotes on this. Apple absolutely ignored standards and broke functionality for third party screens just to get the Pro Display XDR (which, ironically, I own, although now it's being driven by an M2 Studio, versus the space heater that was the Xeon cheesegrater).