Ancient X11 scaling technology

(flak.tedunangst.com)
283 points by todsacerdoti | 17 comments
1. jchw ◴[] No.44371382[source]
Sigh. And now that it's been long enough, everyone will conveniently forget all of the reasons why this wound up being insufficient, and think that all of the desktop environment and toolkit developers are simply stupid. (Importantly, applications actually did do this by default at one point. I remember a wonky-looking nvidia-xsettings because of this.)

The thing X11 really is missing (at least most importantly) is DPI virtualization. UI scaling isn't a feature most display servers implement because most display servers don't implement the actual UI bits. The lack of DPI virtualization is a problem though, because it leaves windows on their own to figure out how to logically scale input and output coordinates. Worse, they have to do it per monitor, and can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling. If anything doesn't do this or does it slightly differently, it will look wrong, and the user has little recourse beyond searching for environment variables or X properties that might make it work.
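To make that concrete, here's a minimal sketch of the bookkeeping every client is left doing on its own (my own toy Python, with made-up types; this is no real toolkit's API):

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x: int
        y: int
        w: int
        h: int

    @dataclass
    class Monitor:
        rect: Rect
        scale: float  # whatever scale the user/DE wants for this output

    def overlap_area(a: Rect, b: Rect) -> int:
        ow = max(0, min(a.x + a.w, b.x + b.w) - max(a.x, b.x))
        oh = max(0, min(a.y + a.h, b.y + b.h) - max(a.y, b.y))
        return ow * oh

    def scale_for(window: Rect, monitors: list[Monitor]) -> float:
        # Pick the monitor the window overlaps most; if it straddles two
        # monitors with different scales, it looks wrong on one of them.
        return max(monitors, key=lambda m: overlap_area(window, m.rect)).scale

    # Every draw call and every input coordinate then has to be multiplied
    # or divided by this factor by the application itself; the server won't.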

Explaining all of that is harder than saying that X11 has poor display scaling support. Saying it "doesn't support UI/display scaling" is kind of a misnomer though; that's not exactly the problem.

replies(3): >>44371469 #>>44372117 #>>44378873 #
2. zozbot234 ◴[] No.44371469[source]
> can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling

It's silly that people keep complaining about this. It's a very minor effect, and one that can be solved in principle only by moving to pure vector rendering for everything. Generally speaking, a window will only ever span a single screen. It's convenient to be able to drag a window to a separate monitor, but having that kind of overlap as a permanent feature of one's workflow is just crazy.

> The thing X11 really is missing (at least most importantly) is DPI virtualization.

Shouldn't that kind of DPI virtualization be a concern for toolkits rather than the X server or protocol? As long as X is getting accurate DPI information from the hardware and reporting that to clients, what else is needed?

replies(1): >>44371597 #
3. jchw ◴[] No.44371597[source]
> It's silly that people keep complaining about this. It's a very minor effect, and one that can be solved in principle only by moving to pure vector rendering for everything.

If you have DPI virtualization, a sufficient solution already exists: pick a reasonable scale factor for the underlying buffer, render at it, then resample for any outputs that don't match. This is what happens in most Wayland compositors. Exactly what you pick isn't too important: you could pick whichever output overlaps the window the most, or the output with the highest scale factor, or some other criterion. It will not result in perfect pixels everywhere, but it is enough to clean up the visual artifacts.
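Roughly this, as a sketch (my own toy Python, not any compositor's actual code):

    def buffer_scale(overlapping_scales: list[float]) -> float:
        # One reasonable policy: render for the sharpest overlapping output,
        # so no output has to upsample from too little detail.
        return max(overlapping_scales)

    def resample_factor(output_scale: float, chosen: float) -> float:
        # 1.0 means a direct blit; anything else is a filtered resample.
        return output_scale / chosen

    # A window spanning a 1x and a 2x monitor: render the buffer once at 2x,
    # blit it 1:1 on the 2x monitor, downsample by 0.5 on the 1x monitor.
    assert buffer_scale([1.0, 2.0]) == 2.0
    assert resample_factor(2.0, 2.0) == 1.0
    assert resample_factor(1.0, 2.0) == 0.5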

Another solution would be to simply present the surface only on whatever output it primarily overlaps with. macOS does this and it's seemingly sufficient. Unfortunately, as far as I understand, this isn't trivial to do in X11 for the same reasons DPI virtualization isn't trivial: whether you render it or not, the window is still in that region and will still receive input there.

> Generally speaking, a window will only ever span a single screen. It's convenient to be able to drag a window to a separate monitor, but having that kind of overlap as a permanent feature of one's workflow is just crazy.

The issue with the overlap isn't that people routinely need this; if they did, macOS or Windows would also need a more complete solution. In reality though, it's just a very janky visual glitch that isn't really too consequential for your actual workflow. Still, it really can make moving windows across outputs super janky, especially since in practice different applications do sometimes choose different behaviors. (e.g. will your toolkit choose to resize the window so it has the same logical size? will this impact the window dragging operation?)

So really, the main benefit of solving this particular edge case is just to make the UX of window management better.

While UX and visual jank concerns rank below functionality concerns, I still think they have non-zero (and sometimes non-trivial) importance. Laptop users expect to be able to dock and manage windows effectively regardless of whether their external monitors have the same ideal scale factor as the laptop's internal panel; the behavior should be clean and effective, and legacy apps should ideally at least appear the correct size, even if blurry. Being able to do DPI virtualization solves this whole set of problems very cleanly. macOS does this right, Windows finally does this right, Wayland does this right; X11 still can't. (It's not physically impossible, but it would take quite a lot of work, since I believe it means modifying everything that handles coordinate spaces.)

> Shouldn't that kind of DPI virtualization be a concern for toolkits rather than the X server or protocol? As long as X is getting accurate DPI information from the hardware and reporting that to clients, what else is needed?

Accurate DPI information is insufficient on its own: users may want to scale differently anyway, whether out of preference, because of a higher viewing distance, or due to a disability.

That said, the other issue is that there already exist applications that don't do per-monitor scaling correctly, and there is no single standard way to propagate per-monitor scaling preferences in X11. It isn't even necessarily a solved problem among the latest versions of all the toolkits, since at minimum it requires support for desktop environment settings daemons and the like.
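In practice, an X11 app that wants to behave ends up probing several ad-hoc sources. A sketch of the guesswork (the environment variables and the Xft.dpi resource are real; the priority order here is made up and varies between toolkits):

    import os
    import subprocess

    def xft_dpi() -> float | None:
        # One common source: the Xft.dpi entry in the X resource database.
        out = subprocess.run(["xrdb", "-query"],
                             capture_output=True, text=True).stdout
        for line in out.splitlines():
            if line.startswith("Xft.dpi:"):
                return float(line.split(":", 1)[1])
        return None

    def guess_scale() -> float:
        # Toolkit-specific overrides first; the order is essentially
        # arbitrary, which is why two apps on one desktop can disagree.
        for var in ("GDK_SCALE", "QT_SCALE_FACTOR"):
            if var in os.environ:
                return float(os.environ[var])
        dpi = xft_dpi()
        return dpi / 96.0 if dpi is not None else 1.0  # 96 dpi = 1x baseline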

replies(4): >>44371986 #>>44373401 #>>44373499 #>>44378943 #
4. BearOso ◴[] No.44371986{3}[source]
I think having any kind of "scaling" preferences focuses too much on the technical aspect. It could be narrowed down to one setting like "zoom level" or just "size." This would mean that all UI elements change size exactly proportionately to one another. Ideally, rendering should happen at the exact resolution of the display, and scaling, as in resizing a bitmap using bilinear interpolation or whatever, doesn't need to be part of the pipeline except for outdated legacy programs.

In the past, the problem with UI toolkits doing proportional sizing was that they used bitmaps for UI elements. Since newer versions of Qt and GTK 4 render programmatically, they can do it the right way. Windows mostly does this too, even with Win32, as long as you're using the newer themes. macOS is the only one that has assets prerendered at integer factors everywhere and needs to perform framebuffer scaling to change sizes. But Apple doesn't care, because they don't want you using third-party monitors anyway.

Edit: I'm not sure about Apple's new theme. Maybe this is their transition point away from fixed asset sizes.

replies(2): >>44372310 #>>44372769 #
5. jeffbee ◴[] No.44372117[source]
I sort of wanted Fresco (previously Berlin, inspired by InterViews) to succeed, because in their model the UI toolkits really were server-side and they could be changed out while the application was running. Because they were targeting an abstract device (could be a 1200 dpi printer and a 72 dpi display at the same time) they got the property you mentioned, for free.
6. zozbot234 ◴[] No.44372310{4}[source]
> Windows mostly does this, too, even with win32 as long as you're using the newer themes.

Win32 controls have always been DPI independent, as far back as Windows 95. There is DPI choice UX as part of the "advanced" display settings.

replies(1): >>44378721 #
7. jchw ◴[] No.44372769{4}[source]
Using vector pipelines isn't new, of course: Windows has been doing DPI-independent rendering since almost the beginning with GDI. The actual issue with GDI's scaling is all about text: for something to be "scalable" it has to maintain its proportions when the scale factor changes, but this was not the case for text in Win32/GDI, due to pixel grid fitting. Because of this, it was common in the Windows XP era to see ill-sized text when changing the DPI to anything other than 96, resulting in things being cut off and generally broken. Also, although the rendering itself was DPI-independent and scalable, that doesn't mean that applications would properly handle scalable rendering themselves, when they do things like deal with pixels directly or what have you. If you did this again today, you could almost certainly account for this and make an API much harder to misuse. HTML applications really have to try to not be resolution-independent, for example.
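A toy illustration of that grid-fitting mismatch, with invented numbers (simplifying by assuming the rasterizer snaps each scaled glyph advance up to the pixel grid):

    import math

    ratio = 120 / 96                        # user sets 120 dpi, i.e. 125%
    advances_96 = [5, 7, 6, 5, 7] * 8       # per-glyph advances at 96 dpi, px
    slot = round(sum(advances_96) * ratio)  # fixed layout scales exactly: 300 px

    advances_120 = [math.ceil(a * ratio) for a in advances_96]
    assert sum(advances_120) == 320         # the text grew faster than its slot
    assert sum(advances_120) > slot         # -> the cut-off labels of that era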

In practice Windows and macOS both do bitmap scaling when necessary. macOS scales the whole frame buffer, Windows scales windows individually.

Can you do an entire windowing pipeline where it's vectors all the way until the actual compositing? Well, sure! We were kind of close in the pre-compositing era sometimes. Is it worth it to do so? I don't think so for now. Most desktop displays are made up of standard-ish pixels so buffers full of pixels makes a very good primitive. So making the surfaces themselves out of pixels seems like a fine approach, and the scaling problem is relatively easy to solve if you start with a clean slate. The fact that it can handle the "window splitting across outputs" case slightly better is not a particularly strong draw; I don't believe most users actually want to use windows split across outputs, it's just better UX if things at least appear correct. Same thing for legacy apps, really: if you run an old app that doesn't support scaling it's still better for it to work and appear blurry than to be tiny and unusable.

What to make of this. Well, the desktop platform hasn't moved so fast; ten years of progress has become little more than superficial at this point. So I think we can expect with relatively minor concessions that barring an unforeseen change, desktops we use 10 to 20 years from now probably won't be that different from what we have today; what we have today isn't even that different from what we already had 20 years ago as it is. And you can see that in people's attitudes; why fix what isn't broken? That's the sentiment of people who believe in an X11 future. Of course in practice, there's nothing particularly wrong with trying to keep bashing X11 into modernity; with much pain they definitely managed to take X.org and make it shockingly good. Ironically, if some of the same people working on Wayland today had put less work into keeping X.org working well, the case for Wayland would be much stronger by now. Still, I really feel like roughly nobody actually wants to sit there and try to wedge HDR or DPI virtualization into X11, and retooling X11 without regard for backwards compatibility is somewhat silly since if you're going to break old apps you may as well just start fresh. Wayland has always had tons of problems yet I always bet on it as the most likely option simply because it just makes the most sense to me and I don't see any showstoppers that seem like they would be insurmountable. Lo and behold, it sure seems to me that the issues remaining for Wayland adoption have started to become more and more minor. KDE maintains a nice list of more serious drawbacks. It used to be a whole hell of a lot larger!

https://community.kde.org/Plasma/Wayland_Known_Significant_I...

replies(1): >>44374152 #
8. archy_ ◴[] No.44373401{3}[source]
>users may want to scale differently anyways

Users think they want a lot of things they don't really need. Do we really want to hand users that loaded gun so that they can choose incorrectly where to fire?

replies(1): >>44376340 #
9. kelnos ◴[] No.44373499{3}[source]
> Accurate DPI information is insufficient as users may want to scale differently anyways, either due to preference, higher viewing distance, or disability.

Which is fine. There's already a standardized property in XSETTINGS on X11 to advertise the user's scaling preference. For Wayland they decided to include this in the protocol, so it can be per-output and/or per-window (though the per-window fractional scaling stuff is an optional extension, sigh).
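If I remember the encoding right, the XSETTINGS value for Xft/DPI is a fixed-point integer, dpi * 1024, so turning it into a scale factor is just arithmetic (96 dpi being the 1x baseline):

    def scale_from_xsettings(xft_dpi_1024: int) -> float:
        return (xft_dpi_1024 / 1024) / 96.0

    assert scale_from_xsettings(98304) == 1.0    # 96 dpi
    assert scale_from_xsettings(147456) == 1.5   # 144 dpi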

There's no reason why we couldn't do something similar on X11, via xrandr output properties and X window properties. But it's more fun to abandon things and invent new ones than to fix the things you have, so here we are.

replies(1): >>44376283 #
10. zozbot234 ◴[] No.44374152{5}[source]
> Because of this, it was common in the Windows XP era to see ill-sized text when changing the DPI to anything other than 96, resulting in things being cut off and generally broken.

The underlying issue with this is the use of fixed-layout interfaces in Win32. If you tweak the layout dynamically to be "responsive" to how the text wraps, this becomes an absolute non-issue. It could also be done with reasonable efficiency at the time; early versions of KDE/Qt already did this out of the box on the same hardware as Win9x.

replies(1): >>44376269 #
11. jchw ◴[] No.44376269{6}[source]
While that is a solution for this specific issue, it doesn't solve everything. It's better to have both proper layout and proper proportional fonts at the same time.
12. jchw ◴[] No.44376283{4}[source]
The issue is that if you want DPI virtualization, and all of the desktops do want that, you need to be able to translate all of the coordinate spaces everywhere. If you try to integrate this into X11, you run into a myriad of different problems that are just not that easy to solve. It was direct inspiration for some of the design elements in Wayland.
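As a sketch of the scope of the problem (a hypothetical shim, my own toy code): a virtualizing layer owes a legacy client a consistent lie in both directions, across every API that carries coordinates.

    SCALE = 2.0  # the lie: the client believes 1x, the hardware is 2x

    def to_client(device_px: int) -> int:
        # server -> client: events, geometry replies, monitor layouts
        return round(device_px / SCALE)

    def to_server(logical_px: int) -> int:
        # client -> server: drawing, window moves/resizes, pointer warps
        return round(logical_px * SCALE)

    # This has to cover pointer events, ConfigureNotify, XGetGeometry,
    # XRandR layouts, XWarpPointer, image sizes, and more. Miss one
    # consumer and the app sees two inconsistent coordinate systems.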

The same folks who are working on Wayland today did a lot of work to get X.org to where it is now. They could do more, but the writing was on the wall.

13. jchw ◴[] No.44376340{4}[source]
No. This is an actual real issue.

For example, if I'm using KDE on a TV, which by the way I am (with Bazzite to be exact, works great) then I want to set the scale factor in KDE higher because I'm going to be standing further away. This is not optional; the UI is completely unreadable if you just let it use the physical dimensions to scale. There's nothing you can do. A preference is necessary to handle this case.
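Back-of-envelope numbers (mine, not measured): equal legibility means equal angular size, i.e. physical height over viewing distance, which DPI alone can't capture.

    # ~24" 1080p desktop monitor at arm's length vs ~55" 1080p TV across
    # the room; ppi values and distances are rough guesses for illustration.
    desk_ppi, desk_dist = 92, 0.6   # pixels/inch, metres
    tv_ppi, tv_dist = 40, 3.0

    # Pixels the TV needs to match the apparent size of 1 desktop pixel:
    factor = (tv_dist / desk_dist) * (tv_ppi / desk_ppi)
    print(round(factor, 1))  # ~2.2x -- while DPI alone would say scale *down*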

You could argue that this is PEBKAC, ignoring the fact that desktop environments care about this use case, but what you can't argue with is this: it's an accessibility issue. Having a magnifier tool is very important for people who have vision issues, but it is not enough. Users with vision problems need to be able to scale the UI. And yes, the UI, not just the text size. Changing the text size helps for text, but not for things like icons.

If you want to be able to sell Linux on devices in the EU, then having sufficient accessibility features is not optional.

14. BearOso ◴[] No.44378721{5}[source]
Yeah, I'm aware it always used font units for sizing. I'm referring to the actual drawing code. Classic used single pixel lines. XP through 7 used 8-way sliced bitmaps. Windows 8 and above's simple flat theme renders everything to scale correctly.
15. account42 ◴[] No.44378873[source]
Applications (or rather UI toolkits) need to handle scaling no matter what if you want a crisp result without giant intermediate renders. Figuring out the scale is the easy part as the article shows.
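For reference, the "ancient technology" the article describes boils down to arithmetic on the pixel and millimetre sizes X has always reported for a monitor (example numbers below are made up):

    width_px, width_mm = 3840, 600        # as reported for the output

    dpi = width_px / (width_mm / 25.4)    # 25.4 mm per inch -> ~163 dpi
    scale = dpi / 96.0                    # ~1.7x against the 96 dpi baseline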

> Worse, they have to do it per monitor, and can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling.

That is not a real issue. It's certainly not worth breaking backwards compatibility over, and even if you care about cosmetic issues like this, you can fix them with extensions.

replies(1): >>44380973 #
16. account42 ◴[] No.44378943{3}[source]
> If you have DPI virtualization, a very sufficient solution already exists: pick a reasonable scale factor for the underlying buffer and use it, then resample for any outputs that don't match.

That's a shitty "solution" that doesn't even solve the issue - the result will still look bad on at least one monitor and you're wasting energy pushing more pixels than needed on the other one.

17. jchw ◴[] No.44380973[source]
That's fine. It doesn't need to be absolutely perfect, it just needs to be good enough. What isn't good enough is if you can't even use legacy apps because they're too small. What is good enough is if DPI aware apps can have crisp rendering and DPI-unaware apps can render at the correct size on screen but blurry. Totally fine, it's exactly what Windows and macOS (and not X11) do.

You can fix this with extensions... kind of, anyway. It's really not that trivial: if you do DPI virtualization, it has to take effect across literally everything. For example, some X11 applications read xrandr information for window placement, so to handle DPI-unaware applications properly you need to be able to present virtualized coordinates to them. And that's actually one of the easier problems to solve; it goes downhill from there.