Ancient X11 scaling technology (flak.tedunangst.com)
283 points by todsacerdoti | 22 comments
pedrocr ◴[] No.44369891[source]
That's probably better than most scaling done on Wayland today because it's doing the rendering directly at the target resolution instead of doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux. If you do it that way you both lose performance and get blurry output. The only corner case a compositor needs to cover is when a client is straddling two outputs. And even in that case you can render at the higher size and get perfect output on one output and the same blurriness downside on the other, so it's still strictly better.

It's strange that Wayland didn't do it this way from the start given its philosophy of delegating most things to the clients. All you really need to do arbitrary scaling is tell apps "you're rendering to an MxN pixel buffer and, as a hint, the scaling factor of the output you'll be composited to is X.Y". After that the client can handle events in real coordinates and scale in the best way possible for its particular context. For a browser, PDF viewer, or image-processing app that can render at arbitrary resolutions, not being able to do that is very frustrating if you want good quality and performance. Hopefully we'll be finally getting that in Wayland now.
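As a rough sketch of that model (the names here are made up for illustration, not any real Wayland API): the compositor hands the client a physical buffer size plus a scale hint, and the client renders and receives input directly in device pixels:

    # Sketch of the "direct scaling" model described above; names are
    # hypothetical, only the metadata flow matches the idea in the comment.
    class DirectScalingClient:
        def configure(self, buf_w, buf_h, scale_hint):
            # Render straight into the full-resolution buffer, no post-scaling.
            self.scale = scale_hint
            self.buffer = self.render(buf_w, buf_h, scale_hint)

        def render(self, w, h, scale):
            # A text or vector renderer would size things from `scale`
            # (e.g. a 16 "logical px" font becomes round(16 * scale) device
            # pixels) but rasterize once, at the real output resolution.
            return [[0] * w for _ in range(h)]

        def on_pointer(self, x_px, y_px):
            # Events arrive in the same device-pixel coordinates the client
            # rendered in, so no coordinate translation is needed.
            print(f"pointer at device pixel ({x_px}, {y_px})")

    c = DirectScalingClient()
    c.configure(2752, 1856, 1.785)  # e.g. a 1542x1040 logical window at 1.785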

replies(12): >>44370069 #>>44370123 #>>44370577 #>>44370717 #>>44370769 #>>44371423 #>>44371694 #>>44372948 #>>44373092 #>>44376209 #>>44378050 #>>44381061 #
kccqzy ◴[] No.44370123[source]
> doing the "draw at 2x scale and then scale down" dance that was popularized by OSX

Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale. The earliest retina MacBook Pro in 2012 for example was 2x in both width and height of the earlier non-retina MacBook Pro.

Eventually I guess the cost of the hardware made this too hard. I mean for example how many different SKUs are there for 27-inch 5K LCD panels versus 27-inch 4K ones?

But before Apple committed to integer scaling factors and then scaling down, it experimented with more traditional approaches. You can see this in earlier OS X releases such as Tiger or Leopard. The thing is, it probably took too much effort for even Apple itself to implement in its first-party apps so Apple knew there would be low adoption among third party apps. Take a look at this HiDPI rendering example in Leopard: https://cdn.arstechnica.net/wp-content/uploads/archive/revie... It was Apple's own TextEdit app and it was buggy. They did have a nice UI to change the scaling factor to be non-integral: https://superuser.com/a/13675

replies(4): >>44370977 #>>44371108 #>>44374789 #>>44375798 #
pedrocr ◴[] No.44370977[source]
> Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale.

That's an interesting related discussion. The idea that there is a physically correct 2x scale and that fractional scaling is a tradeoff is not necessarily correct. First because different users will want to place the same monitor at different distances from their eyes, or have different eyesight, or a myriad other differences. So the ideal scaling factor for the same physical device depends on the user and the setup. But more importantly because having integer scaling be sharp and snapped to pixels and fractional scaling a tradeoff is mostly a software limitation. GUI toolkits can still place all their UI at pixel boundaries even if you give them a target scaling of 1.785. They do need extra logic to do that and most can't. But in a weird twist of destiny the most used app these days is the browser, and the rendering engines are designed to output at arbitrary factors natively but in most cases can't because the windowing system forces these extra transforms on them. 3D engines are another example, where they can output whatever arbitrary resolution is needed but aren't allowed to. Most games can probably get around that in some kind of fullscreen mode that bypasses the scaling.
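A minimal sketch of that pixel-boundary logic, assuming a toolkit that lays out in logical units and gets handed an arbitrary factor like 1.785: each edge is rounded to the device grid and sizes are derived from the rounded edges, so neighbouring elements still tile exactly:

    # Sketch: snapping a logical-unit layout to the device pixel grid at a
    # fractional scale so element edges stay sharp.
    SCALE = 1.785

    def snap_rect(x, y, w, h, scale=SCALE):
        # Round each edge independently, then derive width/height from the
        # rounded edges so adjacent rectangles share exact pixel boundaries.
        left, top = round(x * scale), round(y * scale)
        right, bottom = round((x + w) * scale), round((y + h) * scale)
        return left, top, right - left, bottom - top

    # Two stacked 64x24 logical rows stay flush after snapping:
    print(snap_rect(0, 0, 64, 24))   # (0, 0, 114, 43)
    print(snap_rect(0, 24, 64, 24))  # (0, 43, 114, 43)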

I think we've mostly ignored these issues because computers are so fast and monitors have gotten so high resolution that the significant performance penalty (easily 2x) and the introduced blurriness mostly go unnoticed.

> Take a look at this HiDPI rendering example in Leopard

That's a really cool example, thanks. At one point Ubuntu's Unity had a fake fractional scaling slider that just used integer scaling plus font size changes for the intermediate levels. That mostly works very well from the point of view of the user. Because of the current limitations in Wayland I still mostly do that manually. It works great for a single monitor and can work for multiple monitors if the scaling factors work out, because the font scaling is universal and not per output.

replies(2): >>44371039 #>>44371226 #
sho_hn ◴[] No.44371039[source]
What you want is exactly how fractional scaling works (on Wayland) in KDE Plasma and other well-behaved Wayland software: The scale factor can be something quirky like your 1.785, and the GUI code will generally make sure that things nevertheless snap to the pixel grid to avoid blurry results, as close to the requested scaling as possible. No "extra window system transforms".
replies(5): >>44371141 #>>44371886 #>>44371928 #>>44373804 #>>44380686 #
pedrocr ◴[] No.44371141[source]
That's what I referred to with "we'll be finally getting that in Wayland now". For many years the Wayland protocol could only communicate integer scale factors to clients. If you asked for 1.5, what the compositors did was ask all the clients to render at 2x at a suitably fake size and then scale that to the final output resolution. That's still mostly the case in what's shipping right now I believe. And even with integer scaling, things like events are sent to clients in virtual coordinates instead of just going "here's your NxM buffer, all events are in those physical coordinates, all scaling is just metadata I give you to do whatever you want with". There were practical reasons to do that in the beginning for backwards compatibility, but actual direct scaling is now having to be retrofitted. I'll be really happy when I can just set 1.3 scaling in sway and have that mean that sway tells Firefox 1.3 is the scale factor and gets back a final buffer that doesn't need any transformations. I haven't checked very recently but it wasn't possible not too long ago. If it is now I'll be a happy camper and need to upgrade some software versions.
replies(2): >>44371184 #>>44371338 #
1. zokier ◴[] No.44371184[source]
> That's still mostly the case in what's shipping right now I believe

All major compositors support the fractional scaling extension these days, which allows pixel-perfect rendering afaik, and I believe Qt6 and GTK4 also support it.

https://wayland.app/protocols/fractional-scale-v1#compositor...
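For the curious: the linked extension reports the preferred scale as an integer in 120ths (so 1.5 arrives as 180), and the client pairs it with a viewport so the compositor can present the buffer without any transform. A rough sketch of the client-side arithmetic (not real binding code):

    # Sketch of applying wp_fractional_scale_v1's preferred_scale event;
    # the value is the numerator of a fraction with denominator 120.
    def on_preferred_scale(numerator, logical_w, logical_h):
        scale = numerator / 120
        # Allocate the buffer at the real output resolution...
        buf_w, buf_h = round(logical_w * scale), round(logical_h * scale)
        # ...and advertise the surface's logical size (wp_viewport
        # set_destination in the real protocol) so no resampling happens.
        return (buf_w, buf_h), (logical_w, logical_h)

    print(on_preferred_scale(180, 800, 600))  # ((1200, 900), (800, 600))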

replies(3): >>44371283 #>>44371530 #>>44384785 #
2. pedrocr ◴[] No.44371283[source]
Seems like the support is getting there. I just checked Firefox and it has landed the code but still has it disabled by default. Most users that set 1.5x on their session are probably still getting needless scaling but hopefully that won't last too long.
replies(1): >>44373915 #
3. cycomanic ◴[] No.44371530[source]
That's great, however why do we use a "scale factor" in the first place? We had a perfectly fitting metric in DPI, why can't I set the desired DPI for every monitor, but instead need to calculate some arbitrary scale factor?

I'm generally a strong wayland proponent and believe it's a big step forward over X in many ways, but some decisions just make me scratch my head.

replies(4): >>44371804 #>>44371885 #>>44372290 #>>44373450 #
4. sho_hn ◴[] No.44371804[source]
The end-user UIs don't ask you to calculate anything. Typically they have a slider from 100% to, say, 400% and let you set this to something like 145%.

This may take some getting used to if you're familiar with DPI and already know the value you like, but for non-technical users it's more approachable. Not everyone knows DPI or how many dots they want to their inches.

That the 145% is 1.45 under the hood is really an implementation detail.

replies(2): >>44372515 #>>44372836 #
5. zokier ◴[] No.44371885[source]
DPI (or PPI) is an absolute measurement. Scale factor is intentionally relative. Different circumstances will want different scale-factor-to-DPI ratios; most software does not care whether a certain UI element is exactly x mm in size, but instead just cares that its UI element scale matches the rest of the system.

Basically scale factor neatly encapsulates things like viewing distance, user eyesight, dexterity, and preference, different input device accuracy, and many others. It is easier to have human say how big/small they want things to be than have gazillion flags for individual attributes and then some complicated heuristics to deduce the scale.

replies(1): >>44372794 #
6. MadnessASAP ◴[] No.44372290[source]
I'm not privy to what discussions happened during the protocol development. However, using scale within the protocol seems more practical to me.

Not all displays accurately report their DPI (or even can, such as projectors). Not all users, such as myself, know their monitor's DPI. Finally, the scaling algorithm will ultimately use a scale factor, so at the protocol level that might as well be what is passed.

There is of course nothing stopping a display management widget/settings page/application from asking for a DPI and then converting it to a scale factor; I just don't know of any that exist.

replies(1): >>44373536 #
7. atq2119 ◴[] No.44372515{3}[source]
Not to mention that only a small fraction of the world uses inches...
replies(2): >>44377852 #>>44388215 #
8. cycomanic ◴[] No.44372794{3}[source]
I disagree, I don't want a relative metric. You're saying scale factor neatly encapsulates viewing distance, eyesight, preference, but compared to what? Scale is meaningless if I don't have a reference point. If I have two different-sized monitors, you have now created a metric where a scale of 2x means something completely different on each. So to get things to look the same I either have to manually calculate DPI or use trial and error until it looks right. Same thing if I change monitors: I have to experiment until I get the desired scale, while if I had DPI I would not have to change a thing.

> It is easier to have human say how big/small they want things to be than have gazillion flags for individual attributes and then some complicated heuristics to deduce the scale.

I don't understand why I need a gazillion flags; I just set the desired DPI (instead of a scale). But an absolute metric is almost always better than a relative metric, especially if the reference point is device-dependent.

replies(1): >>44375273 #
9. cycomanic ◴[] No.44372836{3}[source]
I don't care about what we call the metric; I argue that a relative metric where the reference point is device-dependent is simply bad design.

I challenge you: tell a non-technical user to set two monitors (e.g. laptop and external) to display text/windows at the same size. I guarantee it will take them a significant amount of time moving those relative sliders around. If we had an absolute metric it would be trivial. Similarly, people who regularly plug into different monitors would simply set a desired DPI, and everywhere they plug in things would look the same, instead of having to open the scale menu every time.

replies(2): >>44372864 #>>44388186 #
10. sho_hn ◴[] No.44372864{4}[source]
I see where you are coming from and it makes sense.

I will also say though that in the most common cases where people request mixed scale factor support from us (laptop vs. docked screen, screen vs. TV) there are also other form factor differences, such as viewing distance, that mean folks don't want to match DPI, and "I want things bigger/smaller there" is difficult to respond to with "calculate what that means to you in terms of DPI".

For the case "I have two 27" monitors side-by-side and only one of them is 4K and I want things to be the same size on them" I feel like the UI offering a "Match scale" action/suggestion and then still offering a single scale slider when it sees that scenario might be a nice approach.

replies(1): >>44373523 #
11. Dylan16807 ◴[] No.44373450[source]
> We had a perfectly fitting metric in DPI, why can't I set the desired DPI for every monitor, but instead need to calculate some arbitrary scale factor?

Because certain ratios work a lot better than others, and calculating the exact DPI to get those benefits is a lot harder than estimating the scaling factor you want.

Also the scaling factor calculation is more reliable.

12. cycomanic ◴[] No.44373523{5}[source]
> I see where you are coming from and it makes sense.

I actually agree (even though I did not express that in my original post) that DPI is probably not a good "user visible" metric. However, I find that a scaling factor relative to some arbitrary value is inferior in every way. Maybe it comes from the fact that we did not have proper fractional scaling support earlier, but we are now in the nonsensical situation that for the same laptop with the same display size (but different resolutions, e.g. one HiDPI, one normal), you have very different UI element sizes, simply because the default is now to scale 100% for normal displays and 200% for HiDPI. Therefore the scale doesn't really mean anything and people just end up adjusting again and again; surely that's even more confusing for non-technical users.

> I will also say though that in the most common cases where people request mixed scale factor support from us (laptop vs. docked screen, screen vs. TV) there are also other form factor differences, such as viewing distance, that mean folks don't want to match DPI, and "I want things bigger/smaller there" is difficult to respond to with "calculate what that means to you in terms of DPI".

From my anecdotal evidence, most (if not all) people using a laptop for work have the laptop next to the monitor and actually adjust scaling so that elements are a similar size. Or, at the other extreme, they simply take the defaults and complain that one monitor makes all their text super small.

But even people who want things bigger or smaller depending on circumstances are, I would argue, better served if the scaling factor is relative to some absolute reference, not to the size of the pixels on the particular monitor.

> For the case "I have two 27" monitors side-by-side and only one of them is 4K and I want things to be the same size on them" I feel like the UI offering a "Match scale" action/suggestion and then still offering a single scale slider when it sees that scenario might be a nice approach.

Considering that we now have proper fractional scaling, we should just make the scale relative to something like 96 DPI, and then have a slider to adjust. This would serve all use cases. We should not really let our designs be governed by choices we made because we could not do proper scaling previously.
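As a sketch of what "relative to something like 96 DPI" would mean in practice (the monitors below are just examples): the per-monitor default scale falls out of the physical pixel density, and the user slider only adjusts from there:

    import math

    # Sketch: derive a default scale so that 100% means "96 DPI sized"
    # on every monitor, regardless of its resolution (example monitors).
    def default_scale(h_px, v_px, diagonal_inches):
        dpi = math.hypot(h_px, v_px) / diagonal_inches
        return dpi / 96  # 100% == the 96 DPI reference

    print(round(default_scale(3840, 2160, 27), 2))  # 27" 4K  -> ~1.7
    print(round(default_scale(1920, 1080, 24), 2))  # 24" FHD -> ~0.96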

replies(2): >>44377842 #>>44379712 #
13. cycomanic ◴[] No.44373536{3}[source]
As I replied to the other poster, I don't think DPI should necessarily be the exposed metric, but I do think that we should use something non-device-dependent as our reference point, e.g. make 100% = 96 dpi.

I can guarantee that it is surprising to non-technical users (and a source of frustration for technical users) that the scale factor and UI element size can be completely different on two of the same laptops (just with a different display resolution, which is quite common). And it's also unpredictable which one will have the larger UI elements. Generally I believe UI should behave as predictably as possible.

14. chrismorgan ◴[] No.44373915[source]
It landed four years ago, but had debilitating problems. Maybe a year ago when I last tried it, it was just as bad—no movement at all. But now, it seems largely fixed, hooray! Just toggled widget.wayland.fractional-scale.enabled and restarted, and although there are issues with windows not synchronising their scale (my screen is 1.5×; at startup, one of two windows stayed 2×; on new window, windows are briefly 2×; on factor change, sometimes chrome gets stuck at the next integer, probably the same issue), it’s all workaroundable and I can live with it.

Ahhhhhhhh… so nice.

15. meindnoch ◴[] No.44375273{4}[source]
What you actually want is not DPI (or PPI, pixels per inch) but PPD (pixels per degree). But that depends on the viewing distance.
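A rough worked example of that (basic trigonometry; the panel and distances are illustrative): pixels per degree is the pixel density times the length one degree of visual angle covers at the given viewing distance.

    import math

    # Sketch: pixels per degree (PPD) from pixel density and viewing
    # distance. One degree of visual angle spans 2 * d * tan(0.5 deg).
    def ppd(ppi, viewing_distance_inches):
        span = 2 * viewing_distance_inches * math.tan(math.radians(0.5))
        return ppi * span

    # The same ~163 PPI panel at two viewing distances (illustrative):
    print(round(ppd(163, 20)))  # ~57 PPD at 20 inches
    print(round(ppd(163, 32)))  # ~91 PPD at 32 inches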
replies(1): >>44377693 #
16. account42 ◴[] No.44377693{5}[source]
Not even that - my mom and I might sit the same distance from screens of the same size but she will want everything to be scaled larger than I do. Ultimately, it's a preference and not something that should strictly match some objective measurement.
17. account42 ◴[] No.44377842{6}[source]
The only place where this is a problem though is the configuration UI. The display configuration could be changed to show a scale relative to the display size (so 100% on all displays means sizes match) while the protocol keeps talking to applications in scale relative to the pixel size (so programs don't need to care about DPI and instead just have one scale factor).
18. account42 ◴[] No.44377852{4}[source]
For display (diagonal) sizes inches have become the default unit everywhere I've been to.
19. kccqzy ◴[] No.44379712{6}[source]
I find that explaining all of the above considerations to the user in a UI is hard. It's better to just let the user pick from several points on a slider for them to see for themselves.
20. Sophira ◴[] No.44384785[source]
I don't run a compositor, and with Qt6, some programs like VirtualBox just don't respect Qt's scaling factor setting. Setting the font DPI instead results in weird bugs, like the display window getting smaller and smaller.

As it happens, VirtualBox does have its own scaling setting, but it's pretty bad, in my opinion. But I'm kind of forced to use it because Qt's own scaling just doesn't work in this case.

21. lproven ◴[] No.44388186{4}[source]
> tell a non-technical user to set two monitors (e.g. laptop and external) to display text/windows at the same size

Tell me, do you not ever use Macs?

This is not even a solved problem on macOS: there is no solution because the problem doesn't happen in the first place. The OS knows the size and the capabilities of the devices and you tell it with a slider what size of text you find comfortable. The end.

It works out the resolutions and the scaling factors. If the user needs to set that individually per device, if they can even see it, then the UI has failed: it's exposing unnecessary implementation details to users who do not need to know and should not have to care.

_Every_ user of macOS can solve this challenge because the problem is never visible. It's a question of stupidly simple arithmetic that I could do with a pocket calculator in less than a minute, so it should just happen and never show up to the user.

22. lproven ◴[] No.44388215{4}[source]
This is true, but there are a few things which just happen to be measured in this obsolete and arbitrary unit around most of the world, and pizzas and computer screens are two of the ones that can be mentioned in polite society. :-)

I speak very bad Norwegian. I use metric for everything. But once I ordered a pizza late at night in Bergen after a few beers, and they asked me how big I wanted in centimetres and it broke my decision-making process badly. I can handle Norwegian numbers and I can handle cm but not pizzas in cm.

I ended up with a vast pizza that was a ridiculous size for one, but what the hell, I was very hungry. I just left the crusts.