
Ancient X11 scaling technology

(flak.tedunangst.com)
283 points by todsacerdoti | 2 comments
jchw:
Sigh. And now that it's been long enough, everyone will conveniently forget all of the reasons why this wound up being insufficient, and think that all of the desktop environment and toolkit developers are simply stupid. (Importantly, applications actually did do this by default at one point. I remember a wonky-looking nvidia-xsettings because of this.)

The thing X11 really is missing (at least most importantly) is DPI virtualization. UI scaling isn't a feature most display servers implement, because most display servers don't implement the actual UI bits in the first place. The lack of DPI virtualization is a problem, though, because it leaves windows on their own to figure out how to logically scale input and output coordinates. Worse, they have to do it per monitor, and they can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling. If anything doesn't do this, or does it slightly differently, it will look wrong, and the user has little recourse beyond searching for environment variables or X properties that might make it work.
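To make that concrete, here's a minimal sketch (in C, with entirely hypothetical types; a real display server tracks far more state) of the translation a DPI-virtualizing server would perform on a client's behalf, so the client only ever sees one logical coordinate space:

    #include <stdio.h>

    /* Hypothetical per-monitor state; real servers track much more. */
    struct output { double scale; int x, y; };   /* scale factor + position */
    struct window { int log_x, log_y; };         /* origin in logical space */

    /* Translate a pointer event from physical output pixels into the
     * window's logical coordinate space; the client never has to know
     * the per-monitor scale factor. */
    static void physical_to_logical(const struct output *out,
                                    const struct window *win,
                                    int phys_x, int phys_y,
                                    double *log_x, double *log_y) {
        *log_x = (phys_x - out->x) / out->scale - win->log_x;
        *log_y = (phys_y - out->y) / out->scale - win->log_y;
    }

    int main(void) {
        struct output hidpi = { 2.0, 0, 0 };     /* 2x monitor at origin */
        struct window win = { 100, 50 };
        double lx, ly;
        physical_to_logical(&hidpi, &win, 400, 300, &lx, &ly);
        printf("pointer at logical (%.0f, %.0f)\n", lx, ly); /* (100, 100) */
        return 0;
    }

X11 has no layer that does this translation for you; every client has to reimplement it (and its inverse, for rendering) on its own.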

Explaining all of that is harder than saying that X11 has poor display scaling support. Saying it "doesn't support UI/display scaling" is kind of a misnomer though; that's not exactly the problem.

zozbot234:
> can't do anything about the fact that part of the window will look wrong if it overlaps two displays with different scaling

It's silly that people keep complaining about this. It's a very minor effect, and one that can be solved in principle only by moving to pure vector rendering for everything. Generally speaking, a window will only ever span a single screen. It's convenient to be able to drag a window to a separate monitor, but having that kind of overlap as a permanent feature of one's workflow is just crazy.

> The thing X11 really is missing (at least most importantly) is DPI virtualization.

Shouldn't that kind of DPI virtualization be a concern for toolkits rather than the X server or protocol? As long as X is getting accurate DPI information from the hardware and reporting that to clients, what else is needed?

jchw:
> It's silly that people keep complaining about this. It's a very minor effect, and one that can be solved in principle only by moving to pure vector rendering for everything.

If you have DPI virtualization, a sufficient solution already exists: pick a reasonable scale factor for the underlying buffer, render at that, then resample for any outputs that don't match. This is what happens in most Wayland compositors. Exactly what you pick isn't too important: it could be whichever output overlaps the window the most, the output with the highest scale factor, or some other criterion. It will not result in perfect pixels everywhere, but it is perfectly sufficient to clean up the visual artifacts.
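A sketch of that heuristic, with hypothetical types (this is not any particular compositor's code): the window renders once at the winning scale, and the compositor resamples that buffer on every other output it touches.

    #include <stddef.h>

    struct rect { int x, y, w, h; };
    struct output { struct rect geom; double scale; };

    static long overlap_area(struct rect a, struct rect b) {
        long x0 = a.x > b.x ? a.x : b.x;
        long y0 = a.y > b.y ? a.y : b.y;
        long x1 = a.x + a.w < b.x + b.w ? a.x + a.w : b.x + b.w;
        long y1 = a.y + a.h < b.y + b.h ? a.y + a.h : b.y + b.h;
        return (x1 > x0 && y1 > y0) ? (x1 - x0) * (y1 - y0) : 0;
    }

    /* Render the buffer at the scale of the output the window overlaps
     * most; outputs it merely grazes get a resampled (slightly blurry)
     * copy, which is enough to avoid the worst artifacts. */
    static double pick_buffer_scale(struct rect win,
                                    const struct output *outs, size_t n) {
        double best_scale = 1.0;
        long best_area = -1;
        for (size_t i = 0; i < n; i++) {
            long area = overlap_area(win, outs[i].geom);
            if (area > best_area) {
                best_area = area;
                best_scale = outs[i].scale;
            }
        }
        return best_scale;
    }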

Another solution would be to present the surface only on whatever output it primarily overlaps. macOS does this and it's seemingly sufficient. Unfortunately, as far as I understand, this isn't trivial to do in X11 for the same reason DPI virtualization isn't trivial: whether you render it or not, the window still occupies that region and will still receive input there.

> Generally speaking, a window will only ever span a single screen. It's convenient to be able to drag a window to a separate monitor, but having that kind of overlap as a permanent feature of one's workflow is just crazy.

The issue with the overlap isn't that people routinely need this; if they did, macOS and Windows would also need a more complete solution. In reality, it's just a janky visual glitch that isn't very consequential for your actual workflow. Still, it can make moving windows across outputs feel broken, especially since in practice different applications sometimes choose different behaviors. (E.g., will your toolkit resize the window so it keeps the same logical size? Will that impact the window-dragging operation?)

So really, the main benefit of solving this particular edge case is just to make the UX of window management better.

While UX and visual-jank concerns rank below concerns about functionality, I still think they have non-zero (and sometimes non-low) importance. Laptop users expect to be able to dock and manage windows effectively regardless of whether the monitors they are using have the same ideal scale factor as the laptop's internal panel; the behavior should be clean and effective, and legacy apps should ideally at least appear correct, even if blurry. Being able to do DPI virtualization solves the whole set of problems very cleanly. macOS is doing this right, Windows is finally doing this right, Wayland is doing this right, and X11 still can't. (It's not physically impossible, but I believe it would take quite a lot of work, since it would mean modifying everything that handles coordinate spaces.)

> Shouldn't that kind of DPI virtualization be a concern for toolkits rather than the X server or protocol? As long as X is getting accurate DPI information from the hardware and reporting that to clients, what else is needed?

Accurate DPI information is insufficient on its own: users may want to scale differently anyway, whether due to preference, a longer viewing distance, or disability.
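To put numbers on it (illustrative values only): the hardware can report the physical pixel density exactly, but the effective scale still has to multiply in a preference the hardware cannot know.

    #include <stdio.h>

    /* Physical DPI from the monitor's reported dimensions (e.g. EDID). */
    static double physical_dpi(int pixels, double millimeters) {
        return pixels / (millimeters / 25.4);
    }

    int main(void) {
        double dpi = physical_dpi(3840, 600.0);   /* a ~163 DPI panel */
        double user_factor = 1.25;                /* preference/accessibility */
        printf("hardware suggests %.2fx, user wants %.2fx\n",
               dpi / 96.0, (dpi / 96.0) * user_factor);
        return 0;
    }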

That said, the other issue is that there already exist applications that don't do per-monitor scaling correctly, and there is no single standard way to propagate per-monitor scaling preferences in X11. It's not even necessarily a solved problem among the latest versions of all the toolkits, since at minimum it requires support for desktop environment settings daemons and the like.

BearOso:
I think having any kind of "scaling" preference focuses too much on the technical aspect. It could be narrowed down to one setting like "zoom level" or just "size". This would mean that all UI elements change size exactly proportionately to one another. Ideally, rendering should happen at the exact resolution of the display, and scaling, as in resizing a bitmap with bilinear interpolation or whatever, shouldn't need to be part of the pipeline except for outdated legacy programs.
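As a sketch of the idea (illustrative types, not any toolkit's API): one zoom factor multiplies every logical dimension at layout time, rounded to whole device pixels, so rendering happens at native resolution with no bitmap-resampling stage.

    #include <math.h>

    struct widget { double log_w, log_h; int dev_w, dev_h; };

    /* One zoom factor for everything: fonts, line widths, paddings and
     * widget sizes all scale by the same amount, at any factor (not
     * just integers), and nothing is ever resized as a bitmap. */
    static void layout(struct widget *w, double zoom) {
        w->dev_w = (int)lround(w->log_w * zoom);
        w->dev_h = (int)lround(w->log_h * zoom);
    }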

In the past, the problem with UI toolkits doing proportional sizing was that they used bitmaps for UI elements. Since newer versions of Qt and GTK 4 render programmatically, they can do it the right way. Windows mostly does this too, even with Win32, as long as you're using the newer themes. macOS is the only one that has assets prerendered at integer factors everywhere and needs to perform framebuffer scaling to change sizes. But Apple doesn't care, because they don't want you using third-party monitors anyway.

Edit: I'm not sure about Apple's new theme. Maybe this is their transition point away from fixed asset sizes.

zozbot234:
> Windows mostly does this, too, even with win32 as long as you're using the newer themes.

Win32 controls have always been DPI independent, as far back as Windows 95. There is a DPI selection UI as part of the "advanced" display settings.

BearOso:
Yeah, I'm aware it has always used font units for sizing. I'm referring to the actual drawing code. Classic used single-pixel lines. XP through 7 used 8-way sliced bitmaps. The simple flat theme in Windows 8 and later renders everything to scale correctly.
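For reference, the font-unit mechanism in question: dialog layouts are specified in dialog base units derived from the system font and converted to pixels with the documented 4/8 divisors, which is why control layout tracked the DPI setting even back on Windows 95. A minimal sketch:

    #include <windows.h>

    /* Convert dialog units to pixels; the base units come from the
     * system font, so layout follows the user's DPI choice. */
    static POINT dialog_units_to_pixels(int du_x, int du_y) {
        LONG base = GetDialogBaseUnits();  /* low word: X, high word: Y */
        POINT px;
        px.x = MulDiv(du_x, LOWORD(base), 4);
        px.y = MulDiv(du_y, HIWORD(base), 8);
        return px;
    }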