Does anyone here run any of these OSes on any of the offered screen densities and resolutions? I have questions.
- Surely not every old program available from the repositories will work with display scaling, will they? I've never had to use it, maybe Xorg has some hack to scale certain windows at 2x so you don't need support from individual packages. I'm also thinking of things like Burp Suite, which have window-like objects that interact horribly with things like i3 (from what I see with colleagues), so those might have similar issues when you have to tell Xorg to scale individual windows up. Is this something you run into?
- How much of an impact does that resolution have on battery life and GPU performance? I do not need more than 1920x1080 pixels on a 16" diagonal, my eyes can hardly read small fonts on that DPI as it is (I'm ~30) and they're not going to get better with age. This laptop ships with either double or quadruple that, making me wonder what the trade-off is like of having this (for me) gimmick. Surely it doesn't double/quadruple the battery drain or halve/quarter the performance compared to a normal screen?
- I also don't see flickering at 60 Hz (heck, 24 Hz TV looks smooth to me), so 165 Hz seems again like a battery drainer and performance reducer. How much of an impact does it have to try and render 2.75x as many frames per second on a GPU? Does it simply use 2.75x more power for the GPU or reduce the number of drawing operations you can do on the GPU by 2.75 times, or does this not work that way?
Edit: this is currently at the top, but I don't want the top comment to be criticism. This product is awesome in virtually every other regard besides shipping time. Good physical size, decent number of USB ports, customization of the keyboard, c-c-coreboot?! Officially supported? I am definitely impressed. Heck, even the payment methods impress me, being able to select iDeal at a small foreign shop.
Regarding scaling: If you use 2x scaling it should be easy with any distribution. Fractional scaling is a bit trickier to get right.
I am using a Framework with 1.5x scaling using Fedora KDE and it’s amazing. Didn’t find any app yet that doesn’t conform. One difference to years ago is Wayland vs X. With X it was a constant struggle for me while with Wayland and more years invested, scaling became a non-issue on Linux (for me).
Regarding Burp Suite: IIRC it's a JVM-based app. I'm running JetBrains products without any issues and with no configuration needed. Assuming Burp Suite uses Swing, I'd expect no issues there either. Generally, you can quickly check with a VM. Fedora KDE is a great "Just Works" experience.
Xorg has a lot of accumulated hacks that make scaling work ok-ish. It falls apart when you have multiple monitors with different scaling, but for a laptop, just close the lid when you dock it and it should be good enough.
Wayland, after much foot-dragging (why this wasn't a day-1 feature for a supposed Xorg replacement is beyond me), finally managed to cobble together basic support for non-integer scaling [0], so it should finally Just Werk (tm), whether you scale at 2x, 1.75x, etc., without looking like a blurry mess.
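For a sense of what that protocol actually does: the fractional-scale extension (wp-fractional-scale-v1) has the compositor announce the preferred scale as an integer numerator over a fixed denominator of 120, and the client renders its buffer at the scaled size directly, so no blurry resampling is needed. A minimal sketch of that arithmetic (the denominator is from the protocol; window sizes are just examples):

```python
# wp-fractional-scale-v1 expresses scale as numerator / 120,
# so 1.5x arrives as 180 and 1.75x as 210.
SCALE_DENOMINATOR = 120

def buffer_size(logical_w: int, logical_h: int, scale_numerator: int):
    """Pixel buffer size a client should render for a given logical window size."""
    return (
        logical_w * scale_numerator // SCALE_DENOMINATOR,
        logical_h * scale_numerator // SCALE_DENOMINATOR,
    )

# 1.5x: a 1280x800 logical window renders into a 1920x1200 buffer.
print(buffer_size(1280, 800, 180))   # -> (1920, 1200)
# 1.75x: a 1024x576 logical window renders into a 1792x1008 buffer.
print(buffer_size(1024, 576, 210))   # -> (1792, 1008)
```

Because the client renders at the exact target size, text stays sharp at 1.5x or 1.75x, unlike the old "render at 2x and downscale" trick.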
I don't have experience with 4k displays in laptops, but I will say this: considering AMD's ongoing problems with idle power draw on >120 Hz displays [1], I'd recommend not getting the 165 Hz display if you're getting an AMD CPU.
> It falls apart when you have multiple monitors with different scaling
Yeah that's my colleagues! This is why half the monitors in the office are not being used :D. Someone thought it was a great idea to get three or four 4k screens but only one person actually wants them; everyone wants their laptop as a second or third screen. (Personally I'm a single-screen type of person anyway, but what made me commandeer a 1080p screen is the very noticeable lag that my 2018 i5 Lenovo had when trying to drive a 4k screen with or without display scaling. Got a new work laptop now that ought to not have that problem, but I haven't bothered trying yet.)
Anyhow, thanks for the pointers! Especially that 120 Hz AMD thing sounds like a big caveat.
Running an xrandr script with '--scale' when you plug in the externals works okay for me. I have a 4K laptop monitor and two 1080p externals. It's perfectly fine for work use cases.
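A sketch of what such a script computes, assuming the common case of upscaling a 1080p external so it offers the same desktop area as the 4K panel. The output names (eDP-1, HDMI-1) are assumptions; check yours with `xrandr --query`:

```python
# Build the xrandr invocation that upscales a low-res external to match
# the pixel real estate of a HiDPI laptop panel.
def scale_factor(hidpi_width: int, external_width: int) -> float:
    """Upscale factor giving the external the same logical width as the HiDPI panel."""
    return hidpi_width / external_width

s = scale_factor(3840, 1920)  # 4K panel next to a 1080p external -> 2.0
cmd = f"xrandr --output HDMI-1 --scale {s}x{s} --right-of eDP-1"
print(cmd)
```

Note that `--scale` renders at the higher resolution and lets the display downsample, so text on the external is a bit softer than native, which is the usual Xorg mixed-DPI compromise.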
Couldn't you just feed the 4K screen a 1080p signal? It's a bit annoying if it defaults to 4K every time you plug it in, I guess. If you can use HDMI, you could get an inline EDID adapter.
Fractional scaling in Wayland still needs some work in my experience. Primarily, apps running through XWayland look blurry, which sucks because there's a decent number that don't natively support Wayland yet (or have it behind an experimental flag with caveats, like how Anki loses its native titlebar and window shadow when Wayland support is turned on).
- AFAIK, it's usually the toolkits that do the scaling. So, it's indeed possible that very old apps, if they're still using old versions of the toolkits, don't support scaling. They'll appear non-scaled, so "at 100%".
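Because the scaling knobs live in the toolkits, you can often force a scale per app through environment variables. A sketch of forcing 2x on a single launch (the variable names are the real GTK and Qt ones; the app name is hypothetical):

```python
# Force 2x toolkit scaling for one app via its environment.
import os

env = dict(os.environ)
env.update({
    "GDK_SCALE": "2",        # GTK 3/4: integer UI scale
    "GDK_DPI_SCALE": "0.5",  # GTK: counteracts font doubling when needed
    "QT_SCALE_FACTOR": "2",  # Qt 5/6: global scale, fractional values allowed
})

# import subprocess
# subprocess.run(["some-old-app"], env=env)  # hypothetical app name
print(env["GDK_SCALE"], env["QT_SCALE_FACTOR"])
```

Apps on toolkits too old to read these variables will simply ignore them and render at 100%, which matches the behaviour described above.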
- I don't have a 4k display on a laptop so can't comment on battery life. But for "desktop use" (read: non gaming) GPU performance has been fine for a long time. I have an old desktop at work with a 4th gen i5 and whatever the integrated GPU was at the time. It can drive a 4k panel at 60 Hz just fine. A somewhat newer laptop, 8th gen i5 with a uhd620 integrated GPU could drive its internal FHD panel and an external UHD display without any issue.
- For your eyes comment: the small fonts may be illegible because they're blurry. On FHD screens, I've found that bitmap fonts are much more legible at small sizes. I've seen some Dell with a 4k display at work, probably 15", and the small text was much more legible than on my 14" FHD laptop (compared using Windows 11 - the guy was a Windows dev).
- TV has a blurriness to its movement, so it looks smooth enough because it's never actually sharp. The point of higher refresh rates is not "flickering", but a smooth movement. Try reading a scrolling page on a 30 Hz, 60 Hz, 120 Hz screen. I mostly look at static text on my screens, so 60 Hz works well enough for me, and I prefer higher resolution / better colors to higher refresh rates. Don't know how this affects GPU usage, though I don't expect it to be "free".
I only have experience with this on Linux, where at least it supports scaling, so that if you use a constant one (my case), then it's OK.
Older ones will simply ignore the scaling settings and draw the interface 1:1. One such application that comes to mind is VMware's remote console (for ESXi). I haven't used it in a few years, but I remember at the time it was painful to run on a 24" UHD screen.
On the Windows side, I think things are somewhat better than on Linux, but there is still confusion, including in Windows 11 22H2's start menu. If you start the computer in 100% mode, then plug in an external screen scaled at 200%, the app list (what it shows on first click) works OK, but if you start typing, everything becomes a blurry mess.
---
Edit: I actually think QT is one of the better toolkits, at least on Linux, in the case of scaling. IIRC it's able to adapt the scaling based on the screen DPI reported by X, so a full qt desktop should be able to handle situations like a high-DPI laptop connected to a low-dpi monitor.
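Qt also accepts per-screen factors, which is how a mixed-DPI setup (HiDPI laptop plus a regular external) can be handled for Qt apps specifically. The variable name is real (Qt 5.6+); the output names are assumptions:

```python
# Build Qt's per-screen scale list: "name=factor" pairs joined by ';'.
scales = {"eDP-1": 2, "HDMI-1": 1}
value = ";".join(f"{output}={factor}" for output, factor in scales.items())
print(f"QT_SCREEN_SCALE_FACTORS={value}")
```

With this exported, a Qt window should pick up the right scale for whichever screen it lands on, which is about as close as X gets to per-monitor DPI.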
It's not that it ignores it, it's that it tried to handle it and gets confused. I move fairly frequently between my 3x laptop screen and a 1x external monitor, and at this point I've got used to either the app logo randomly being a third the size it should be in the start bar or the text rendering three times as big as it should in the app.
I have a bad left eye. I haven’t noticed increased readability from higher dpi screens. I went from a higher dpi to a 14” 1080 screen and notice no difference at equivalent font size. Of course that’s still pretty high pixel density.
Higher pixel density can help a bit if your vision is bad enough that you use a lot of screen magnification, as it means that the magnified text will be better-rounded and fuller, rather than almost unrecognizably pixelated.
For some visual impairments, though, the improved sharpness from higher DPI may be basically undetectable, which sounds like what you're reporting.
> This laptop ships with either double or quadruple that, making me wonder what the trade-off is like of having this (for me) gimmick. Surely it doesn't double/quadruple the battery drain or halve/quarter the performance compared to a normal screen?
From their configuration site:
"The 4K display consumes more power, averaging 8W but provides incredible detail and excellent scaling support that allows you to change the UI (User Interface) to a size that's comfortable for almost everyone.
The QHD display supports a refresh rate of 165Hz, which offers a silky smooth experience. It consumes less than half the power of the 4K display at 3.2W. Limited scaling support on Linux means that the UI on this display is relatively small compared to other display resolutions"
It seems like the QHD display would be the way to go for lower power. I'd guess the power would be lower if you didn't run it at the full 165Hz refresh (there's probably a 60Hz mode)...
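A rough back-of-envelope using the quoted panel figures (8 W vs 3.2 W), with an assumed 55 Wh battery and an assumed 7 W rest-of-system idle draw (both of those numbers are my assumptions, not from the listing):

```python
# Estimate light-use runtime for each panel option.
BATTERY_WH = 55.0   # assumed battery capacity
BASE_W = 7.0        # assumed rest-of-system draw at idle/light use

def runtime_h(panel_w: float) -> float:
    """Hours of runtime given the panel's average power draw."""
    return BATTERY_WH / (BASE_W + panel_w)

print(f"QHD: {runtime_h(3.2):.1f} h, 4K: {runtime_h(8.0):.1f} h")
```

Under those assumptions the 4K panel costs on the order of a third of your runtime, so the panel choice is far from a rounding error even before the GPU enters the picture.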
It's not the screen that I was afraid of so much as the processing power. From my understanding, the main power draw of a screen comes from its light output and panel area (they advertise roughly double or triple the nits mine has) rather than from how many pixels it has. Regardless, it's a good point that I should not ignore the screen while considering the processing power needed to drive said screen!
Well, there is the backlight, but also the power usage of the TCON (timing controller board) for each display, which will vary greatly. Usually the 4K ones end up being less efficient. If you get the 4K display, you can of course set the output to 1080p, which would solve the "processing power" end. I think the difference between 1080p and 1440p (or their 16:10 equivalents) at 60 Hz would be negligible from a GPU perspective (especially with PSR, Panel Self Refresh, enabled), but ultimately you'd have to test the two different models with different resolution settings to really be able to tell.
With imperfect eye sight, I find it much easier to read text with higher DPI. For me 11-12” is the limit for 1080p. At 16” I’d want at least 1440p. Even 4K starts getting blocky above 24” or so, 27” is barely ok.
Display refresh rate isn't necessarily just about one aspect like flicker or smoothness.
For one thing, it affects the latency from human input to graphics output. Given how the graphics stack is implemented, especially with modern desktop compositors, there are typically at least 3 frames of latency. 3/60 is 50ms; 3/165 is 18ms. Whether or not you consciously notice it, the 165Hz display is going to feel more instant when you push a button.
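The latency numbers above are just pipeline depth divided by refresh rate:

```python
# Input-to-photon latency for an N-frame-deep graphics pipeline.
def pipeline_latency_ms(frames: int, refresh_hz: float) -> float:
    return 1000.0 * frames / refresh_hz

print(round(pipeline_latency_ms(3, 60), 1))    # 50.0 ms at 60 Hz
print(round(pipeline_latency_ms(3, 165), 1))   # 18.2 ms at 165 Hz
```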
There's also what's referred to as "judder". When you're watching 24Hz video content on a 60Hz display, the frames of video get repeated 3 times, then 2 times, then 3 times etc. This results in a 16ms "judder" from frame to frame. It's subtle, and has been the norm for decades, but it is quantifiably less than ideal. A 165Hz display drops judder down to 6ms.
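The pulldown arithmetic works out simply: whenever the display rate isn't an exact multiple of the content rate, repeat counts alternate between the floor and ceiling of the ratio, so adjacent frames differ in on-screen time by exactly one refresh period:

```python
# Judder amplitude: one refresh period, unless the display rate is an
# exact multiple of the content rate (then every frame repeats equally).
def judder_ms(display_hz: float, content_fps: float = 24.0) -> float:
    return 0.0 if display_hz % content_fps == 0 else 1000.0 / display_hz

print(round(judder_ms(60), 1))    # 16.7 ms (the classic 3:2 pulldown)
print(round(judder_ms(165), 1))   # 6.1 ms
print(judder_ms(120))             # 0.0 -- 120 is an exact multiple of 24
```

This is also why 120 Hz (and 144 Hz for 24 fps content, since 144 = 6 x 24) displays can show film content with no judder at all.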
Another aspect of the refresh rate has to do with the frequency response of the display technology itself, and what many might call "smoothness".

Looking back at CRT technology, the image is instantaneous wherever the electron beam is currently pointed; the overall image looks stable only due to persistence within the human eye. If you film a CRT, it can look pretty wonky. On a CRT, 30Hz is too slow because pretty much everyone can see the flicker, and 60Hz is borderline; I personally can see the flicker in my peripheral vision. Motion looks smooth regardless, though, because all the persistence is in your eyes.

With traditional LCDs, the pixels are always on and relatively slow to change; the persistence is in the display itself. So 30Hz doesn't flicker, but all motion looks blurry no matter what. It just sucks, other than being a conveniently flat screen.

With modern LCDs and OLED displays, the pixels are still always on, but they're back to being very fast to change. So 30Hz still doesn't flicker and motion is no longer blurry, but it now looks jerky rather than smooth. At 60Hz things look pretty smooth, but you're still at the limit of some folks' peripheral vision.
A GPU doesn't have to render frames at the refresh rate. An old frame will be repeated if there isn't a new frame ready yet. If the GPU can't keep up, a 165Hz display effectively becomes an 82.5Hz display, or a 41.25Hz display. There certainly is going to be a power penalty in the GPU circuitry driving the display at a higher rate, but it's marginal vs. the cost of rendering the frames themselves. 82Hz is still luxury compared to 60Hz, in that it's better than good enough for 99% of people.
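The "effectively becomes an 82.5Hz display" behaviour is plain vsync arithmetic: a frame that misses a refresh waits for the next one, so the effective rate is the refresh rate divided by the number of refresh intervals each frame takes to render:

```python
# Effective frame rate under plain vsync, given per-frame render time.
import math

def effective_hz(refresh_hz: float, render_ms: float) -> float:
    intervals = max(1, math.ceil(render_ms * refresh_hz / 1000.0))
    return refresh_hz / intervals

print(effective_hz(165, 5))    # 165.0  -- frame fits in one refresh interval
print(effective_hz(165, 10))   # 82.5   -- needs two intervals
print(effective_hz(165, 20))   # 41.25  -- needs four intervals
```

Adaptive sync (FreeSync/G-Sync) sidesteps this quantization by letting the display wait for the frame instead, which is another reason high-refresh panels pair well with it.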
What it boils down to is that pushing the refresh rate higher gives the GPU/software more fine grained control over the display than otherwise. That control allows the software to optimize latency, judder, and smoothness better than the display itself can given a lower refresh rate.