
Ancient X11 scaling technology

(flak.tedunangst.com)
284 points by todsacerdoti | 1 comment
pedrocr No.44369891
That's probably better than most scaling done on Wayland today, because it's doing the rendering directly at the target resolution instead of doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux. If you do it that way you both lose performance and get blurry output. The only corner case a compositor needs to cover is when a client is straddling two outputs. And even in that case you can render at the larger size and get perfect output on one output and the same blurriness downside on the other, so it's still strictly better.
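
To make the straddling case concrete, here's a rough sketch of the arithmetic (illustrative numbers of my own, not anything a particular compositor does):

    /* Illustrative only: a window straddling a 2.0x output and a 1.25x
     * output. Render once at the larger scale; the high-DPI output gets
     * the buffer 1:1 and the other output downscales it. */
    #include <stdio.h>

    int main(void)
    {
        const double scale_a = 2.0, scale_b = 1.25; /* per-output scale factors */
        const int logical_w = 800, logical_h = 600; /* window size in logical units */

        double render_scale = scale_a > scale_b ? scale_a : scale_b;
        int buf_w = (int)(logical_w * render_scale + 0.5);
        int buf_h = (int)(logical_h * render_scale + 0.5);

        printf("render the buffer at %dx%d\n", buf_w, buf_h);       /* 1600x1200 */
        printf("output A maps it 1:1, output B scales it by %.3f\n",
               scale_b / render_scale);                              /* 0.625 */
        return 0;
    }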

It's strange that Wayland didn't do it this way from the start, given its philosophy of delegating most things to the clients. All you really need for arbitrary scaling is to tell apps "you're rendering to an MxN pixel buffer, and as a hint, the scaling factor of the output you'll be composited to is X.Y". After that the client can handle events in real coordinates and scale in the best way possible for its particular context. For a browser, PDF viewer, or image-processing app that can render at arbitrary resolutions, not being able to do that is very frustrating if you want good quality and performance. Hopefully we'll finally be getting that in Wayland now.
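
Sketching what that looks like from the client's side (a hand-rolled illustration, not real protocol code; I'm assuming the hint arrives as scale × 120, which is how the newer fractional-scale Wayland protocol encodes it, if I remember right):

    /* Hypothetical client-side sizing: given a logical size and a
     * fractional scale hint, allocate the buffer in real device pixels
     * and render straight into it, no 2x-then-downscale pass. */
    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>

    struct geometry {
        int logical_w, logical_h; /* size in compositor logical units */
        uint32_t scale_120;       /* compositor hint: scale * 120, e.g. 180 = 1.5x */
    };

    static void pixel_size(const struct geometry *g, int *pw, int *ph)
    {
        double scale = g->scale_120 / 120.0;
        /* Round to whole device pixels; events stay in logical coordinates
         * and the client maps them back with the same factor. */
        *pw = (int)round(g->logical_w * scale);
        *ph = (int)round(g->logical_h * scale);
    }

    int main(void)
    {
        struct geometry g = { 800, 600, 180 }; /* a 1.5x output */
        int w, h;
        pixel_size(&g, &w, &h);
        printf("render into a %dx%d pixel buffer\n", w, h); /* 1200x900 */
        return 0;
    }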

hedora No.44378050
I’ll just add that it is much better than fractional scaling.

I switched to high-DPI displays under Linux back in the late 1990s. It worked great, even with old toolkits like Xaw and Motif, and certainly with GTK/GNOME/KDE.

This makes perfect sense, since old Unix workstations tended to have giant (for the time) framebuffers, and CRTs that were custom-built to match the video card's capabilities.

Fractional scaling is strictly worse than the way X11 used to work. It was a dirty hack when Apple shipped it (they had to, because their third-party software ecosystem didn't understand DPI), but cloning the approach is just dumb.