
Ancient X11 scaling technology

(flak.tedunangst.com)
283 points by todsacerdoti | 63 comments
pedrocr ◴[] No.44369891[source]
That's probably better than most scaling done on Wayland today because it's doing the rendering directly at the target resolution instead of doing the "draw at 2x scale and then scale down" dance that was popularized by OSX and copied by Linux. If you do it that way you both lose performance and get blurry output. The only corner case a compositor needs to cover is when a client is straddling two outputs. And even in that case you can render at the higher size and get perfect output on one output and the same blurriness downside on the other, so it's still strictly better.

It's strange that Wayland didn't do it this way from the start given its philosophy of delegating most things to the clients. All you really need for arbitrary scaling is to tell apps "you're rendering to an MxN pixel buffer and, as a hint, the scaling factor of the output you'll be composited to is X.Y". After that the client can handle events in real coordinates and scale in the best way possible for its particular context. For a browser, PDF viewer or image processing app that can render at arbitrary resolutions, not being able to do that is very frustrating if you want good quality and performance. Hopefully we'll finally be getting that in Wayland now.
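A minimal sketch of that model (hypothetical names, not the real Wayland API): the compositor hands the client a physical buffer size plus an advisory scale factor, and the client does all layout in real pixel coordinates.

```python
# Hypothetical sketch of "scale is just metadata": the compositor sends a
# physical buffer size and an advisory scale factor; the client renders
# directly at that size with no intermediate 2x buffer and no downscale pass.

def configure(width_px: int, height_px: int, scale_hint: float) -> dict:
    """What a client would receive: real pixels plus a purely advisory hint."""
    return {"buffer_px": (width_px, height_px), "scale_hint": scale_hint}

cfg = configure(2560, 1440, 1.3)

# A browser-style client sizes a 16-logical-px font straight to device pixels:
font_px = round(16 * cfg["scale_hint"])  # 21 device pixels, rendered once
```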

replies(12): >>44370069 #>>44370123 #>>44370577 #>>44370717 #>>44370769 #>>44371423 #>>44371694 #>>44372948 #>>44373092 #>>44376209 #>>44378050 #>>44381061 #
1. kccqzy ◴[] No.44370123[source]
> doing the "draw at 2x scale and then scale down" dance that was popularized by OSX

Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale. The earliest retina MacBook Pro in 2012, for example, had exactly 2x the width and height in pixels of the earlier non-retina MacBook Pro.

Eventually I guess the cost of the hardware made this too hard. I mean for example how many different SKUs are there for 27-inch 5K LCD panels versus 27-inch 4K ones?

But before Apple committed to integer scaling factors and then scaling down, it experimented with more traditional approaches. You can see this in earlier OS X releases such as Tiger or Leopard. The thing is, it probably took too much effort for even Apple itself to implement in its first-party apps so Apple knew there would be low adoption among third party apps. Take a look at this HiDPI rendering example in Leopard: https://cdn.arstechnica.net/wp-content/uploads/archive/revie... It was Apple's own TextEdit app and it was buggy. They did have a nice UI to change the scaling factor to be non-integral: https://superuser.com/a/13675

replies(4): >>44370977 #>>44371108 #>>44374789 #>>44375798 #
2. pedrocr ◴[] No.44370977[source]
> Originally OS X defaulted to drawing at 2x scale without any scaling down because the hardware was designed to have the right number of pixels for 2x scale.

That's an interesting related discussion. The idea that there is a physically correct 2x scale and fractional scaling is a tradeoff is not necessarily correct. First because different users will want to place the same monitor at different distances from their eyes, or have different eyesight, or a myriad other differences. So the ideal scaling factor for the same physical device depends on the user and the setup. But more importantly because having integer scaling be sharp and snapped to pixels and fractional scaling a tradeoff is mostly a software limitation. GUI toolkits can still place all their UI at pixel boundaries even if you give them a target scaling of 1.785. They do need extra logic to do that and most can't. But in a weird twist of fate the most used app these days is the browser, and rendering engines are designed to output at arbitrary factors natively but in most cases can't because the windowing system forces these extra transforms on them. 3D engines are another example, where they can output whatever arbitrary resolution is needed but aren't allowed to. Most games can probably get around that in some kind of fullscreen mode that bypasses the scaling.
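As an illustrative sketch (not any particular toolkit's code) of that extra logic, snapping element edges to the device pixel grid at an arbitrary factor:

```python
def snap(logical: float, scale: float) -> int:
    """Map a logical coordinate to the nearest device pixel boundary."""
    return round(logical * scale)

scale = 1.785
# Edges of three stacked 100-unit elements all land on whole device pixels,
# so every border stays sharp; only the widths wobble by at most one pixel.
edges = [snap(x, scale) for x in (0, 100, 200)]
widths = [b - a for a, b in zip(edges, edges[1:])]
```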

I think we've mostly ignored these issues because computers are so fast and monitors have gotten so high resolution that the significant performance penalty (easily 2x) and the introduced blurriness mostly go unnoticed.

> Take a look at this HiDPI rendering example in Leopard

That's a really cool example, thanks. At one point Ubuntu's Unity had a fake fractional scaling slider that just used integer scaling plus font size changes for the intermediate levels. That mostly works very well from the user's point of view. Because of the current limitations in Wayland I still mostly do that manually. It works great for a single monitor, and can work for multiple monitors if the scaling factors work out, because the font scaling is universal and not per output.

replies(2): >>44371039 #>>44371226 #
3. sho_hn ◴[] No.44371039[source]
What you want is exactly how fractional scaling works (on Wayland) in KDE Plasma and other well-behaved Wayland software: The scale factor can be something quirky like your 1.785, and the GUI code will generally make sure that things nevertheless snap to the pixel grid to avoid blurry results, as close to the requested scaling as possible. No "extra window system transforms".
replies(5): >>44371141 #>>44371886 #>>44371928 #>>44373804 #>>44380686 #
4. cosmic_cheese ◴[] No.44371108[source]
Even today you run into the occasional foreign UI toolkit app that only renders at 1x and gets scaled up. We’re probably still years out from all desktop apps handling scaling correctly.
5. pedrocr ◴[] No.44371141{3}[source]
That's what I referred to with "we'll be finally getting that in Wayland now". For many years the Wayland protocol could only communicate integer scale factors to clients. If you asked for 1.5 what the compositors did was ask all the clients to render at 2x at a suitably fake size and then scale that to the final output resolution. That's still mostly the case in what's shipping right now I believe. And even in integer scaling things like events are sent to clients in virtual coordinates instead of just going "here's your NxM buffer, all events are in those physical coordinates, all scaling is just metadata I give you to do whatever you want with". There were practical reasons to do that in the beginning for backwards compatibility but the actual direct scaling is having to be retrofitted now. I'll be really happy when I can just set 1.3 scaling in sway and have that just mean that sway tells Firefox that 1.3 is the scale factor and just gets back the final buffer that doesn't need any transformations. I haven't checked very recently but it wasn't possible not too long ago. If it is now I'll be a happy camper and need to upgrade some software versions.
replies(2): >>44371184 #>>44371338 #
6. zokier ◴[] No.44371184{4}[source]
> That's still mostly the case in what's shipping right now I believe

All major compositors support fractional scaling extension these days which allows pixel perfect rendering afaik, and I believe Qt6 and GTK4 also support it.

https://wayland.app/protocols/fractional-scale-v1#compositor...

replies(3): >>44371283 #>>44371530 #>>44384785 #
7. astrange ◴[] No.44371226[source]
> But more importantly because having integer scaling be sharp and snapped to pixels and fractional scaling a tradeoff is mostly a software limitation. GUI toolkits can still place all ther UI at pixel boundaries even if you give them a target scaling of 1.785. They do need extra logic to do that and most can't.

The reason Apple started with 2x scaling is because this turned out to not be true. Free-scaling UIs were tried for years before that and never once got to acceptable quality. Not if you want to have image assets or animations involved, or if you can't fix other people's coordinate rounding bugs.

Other platforms have much lower standards for good-looking UIs, as you can tell from eg their much worse text rendering and having all of it designed by random European programmers instead of designers.

replies(1): >>44371572 #
8. pedrocr ◴[] No.44371283{5}[source]
Seems like the support is getting there. I just checked Firefox and it has landed the code but still has it disabled by default. Most users that set 1.5x on their session are probably still getting needless scaling but hopefully that won't last too long.
replies(1): >>44373915 #
9. sho_hn ◴[] No.44371338{4}[source]
In KDE Plasma we've supported the approach you like for quite some years, because Qt is a cross-platform toolkit that already supported fractional scaling on e.g. Windows, and we just went ahead and put the mechanisms in place to make use of that on Wayland.

The standardized protocols are more recent (and of course we heavily argued for them).

Regarding the way the protocol works and something having to be retrofitted, I think you are maybe a bit confused about the way the scale factor and buffer scale work on wl_output and wl_surface?

But in any case, yes, I think the happy camper days are coming for you! I also find the macOS approach atrocious, so I appreciate the sentiment.

replies(2): >>44371467 #>>44372499 #
10. pedrocr ◴[] No.44371467{5}[source]
Thanks! By retrofitting I mean having to have a new protocol with this new opt-in method where some apps will be getting integer scales and go through a transform and some apps will be getting a fractional scale and rendering directly to the output resolution. If this had worked "correctly" from the start the compositors wouldn't even need to know anything about scaling. As far as they knew the scaling metadata could have been an opaque value that they passed from the user config to the clients to figure out. I assume we're stuck forever with all compositors having to understand all this instead of just punting the problem completely to clients.

When you say you supported this for quite some years was there a custom protocol in KWin to allow clients to render directly to the fractionally scaled resolution? ~4 years ago I was frustrated by this when I benchmarked a 2x slowdown from RAW file to the same number of pixels on screen when using fractional scaling and at least in sway there wasn't a way to fix it or much appetite to implement it. It's great to see it is mostly in place now and just needs to be enabled by all the stack.

replies(1): >>44371855 #
11. cycomanic ◴[] No.44371530{5}[source]
That's great, however why do we use a "scale factor" in the first place? We had a perfectly fitting metric in DPI, why can't I set the desired DPI for every monitor, but instead need to calculate some arbitrary scale factor?

I'm generally a strong wayland proponent and believe it's a big step forward over X in many ways, but some decisions just make me scratch my head.

replies(4): >>44371804 #>>44371885 #>>44372290 #>>44373450 #
12. zozbot234 ◴[] No.44371572{3}[source]
> Free-scaling UIs were tried for years before that and never once got to acceptable quality.

The web is a free-scaling UI, which scales "responsively" in a seamless way from feature phones with tiny pixelated displays to huge TV-sized ultra high-resolution screens. It's fine.

replies(2): >>44371721 #>>44373881 #
13. astrange ◴[] No.44371721{4}[source]
That's actually a different kind of scaling. The one at issue here is closer to cmd-plus/minus on desktop browsers, or two-finger zooming on phones. It's hard to make that look good unless you only have simple flat UIs like the one on this website.

They did make another attempt at it for apps with Dynamic Type though.

replies(2): >>44372581 #>>44377973 #
14. sho_hn ◴[] No.44371804{6}[source]
The end-user UIs don't ask you to calculate anything. Typically they have a slider from 100% to, say, 400% and let you set this to something like 145%.

This may take some getting used to if you're familiar with DPI and already know the value you like, but for non-technical users it's more approachable. Not everyone knows DPI or how many dots they want to their inches.

That the 145% is 1.45 under the hood is really an implementation detail.

replies(2): >>44372515 #>>44372836 #
15. sho_hn ◴[] No.44371855{6}[source]
Oh, ok. Yeah, this I agree with, and I think plenty of people do - having integer-only scaling in the core protocol at the start was definitely a regrettable oversight and is a wart on things.

> When you say you supported this for quite some years was there a custom protocol in KWin to allow clients to render directly to the fractionally scaled resolution?

Qt had a bunch of different mechanisms for how you could tell it to use a fractional scale factor, from setting an env var to doing it inside a "platform plugin" each Qt process loads at runtime (Plasma provides one), etc. We also had a custom-protocol-based mechanism (zwp_scaler_dev iirc) that basically had a set_scale with a 'fixed' instead of an 'int'. Ultimately this was all pretty Qt-specific in practice, though. To get adoption outside of just our stack a standard was of course needed; I guess what we can claim is that we were always pretty firm that we wanted proper fractional scaling and put in the work.

16. zokier ◴[] No.44371885{6}[source]
DPI (or PPI) is an absolute measurement. Scale factor is intentionally relative. Different circumstances will want different scale-factor-to-DPI ratios; most software does not care whether a certain UI element is exactly x mm in size, but instead just cares that its UI element scale matches the rest of the system.

Basically scale factor neatly encapsulates things like viewing distance, user eyesight, dexterity, and preference, different input device accuracy, and many others. It is easier to have human say how big/small they want things to be than have gazillion flags for individual attributes and then some complicated heuristics to deduce the scale.

replies(1): >>44372794 #
17. enriquto ◴[] No.44371886{3}[source]
> The scale factor can be something quirky like your 1.785, and the GUI code will generally make sure that things nevertheless snap to the pixel grid to avoid blurry results

This is horrifying! It implies that, for some scaling factors, the lines of text in your terminal will be of different heights.

Not that the alternative (pretend that characters can be placed at arbitrary sub-pixel positions) is any less horrifying. This would make all the lines in your terminal of the same height, alright, but then the same character at different lines would look different.

The bitter truth is that fractional scaling is impossible. You cannot simply scale images without blurring them. Think about an alternating pattern of white and black rows of pixels. If you try to scale it to a non-integer factor the result will be either blurry or aliased.

The good news is that fractional scaling is unnecessary. You can just use fonts of any size you want. Moreover, nowadays pixels are so small that you can simply use large bitmap fonts and they'll look sharp, clean and beautiful.

replies(2): >>44371925 #>>44372537 #
18. sho_hn ◴[] No.44371925{4}[source]
The way it works for your terminal emulator example is that it figures out what makes sense to do for a value of 1.785, e.g. rasterizing text appropriately and making sure that line heights and baselines are at sensible consistent values.
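One way such snapping can behave (an illustrative sketch, not KDE's actual algorithm): each baseline snaps to the grid individually, so every line sits on whole pixels and consecutive line heights differ by at most one pixel.

```python
def snapped_baselines(line_height: float, scale: float, n: int) -> list:
    """Snap each successive baseline to the device pixel grid."""
    return [round(i * line_height * scale) for i in range(1, n + 1)]

baselines = snapped_baselines(16, 1.785, 4)   # [29, 57, 86, 114]
gaps = [b - a for a, b in zip([0] + baselines, baselines)]  # [29, 28, 29, 28]
```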
replies(1): >>44371968 #
19. 0x457 ◴[] No.44371928{3}[source]
Is it actually in Wayland or is it "implementation should handle it somehow" like most of wayland? Because what is probably 90% of wayland install base only supports communicating integer scales to clients.
replies(1): >>44371941 #
20. sho_hn ◴[] No.44371941{4}[source]
It's in Wayland in the same way everything else is, i.e. fractional scaling is now a protocol included in the standard protocol suite.

> Because what is probably 90% of wayland install base only supports communicating integer scales to clients.

As someone shipping a couple of million cars per year running Wayland, the install base is a lot bigger than you think it is :)

replies(1): >>44372129 #
21. enriquto ◴[] No.44371968{5}[source]
the problem is that there's no reasonable thing to do when the height of the terminal in pixels is not an integer multiple of the height of the font in pixels. Whatever "it" does, will be wrong.

(And when it's an integer multiple, you don't need scaling at all. You just need a font of that exact size.)

replies(1): >>44371991 #
22. sho_hn ◴[] No.44371991{6}[source]
You're overthinking things a bit and are also a bit confused about how font sizes work and what "scaling" means in a windowing system context. You are thinking of taking a bunch of pixels and resampling them. In the context we're talking about, "scaling" means telling the software what it's expected to output and giving it an opportunity to render accordingly.

The way the terminal handles the (literal) edge case you mention is no different from any other time its window size is not a multiple of the line height: It shows empty rows of pixels at the top or bottom.

Fonts only have an "exact size" if they're bitmap-based (and when you scale bitmap fonts you are indeed in for sampling difficulties). More typical is a font storing vectors and rasterizing glyphs to the needed size at runtime.

replies(1): >>44372738 #
23. 0x457 ◴[] No.44372129{5}[source]
Hmmm, sorry, but I don't care about the install base of Wayland in a highly controlled environment (the number of different monitor panels you ship is probably smaller than the number of displays with different DPIs in my living room right now).
replies(1): >>44372148 #
24. sho_hn ◴[] No.44372148{6}[source]
90% is still nonsense even in desktop Linux, tho.
25. MadnessASAP ◴[] No.44372290{6}[source]
I'm not privy to what discussions happened during the protocol development. However using scale within the protocol seems more practical to me.

Not all displays accurately report their DPI (or even can, such as projectors). Not all users, such as myself, know their monitor's DPI. Finally, the scaling algorithm will ultimately use a scale factor, so at the protocol level that might as well be what is passed.

There is of course nothing stopping a display management widget/settings page/application from asking for DPI and then converting it to a scale factor, I just don't know of any that exist.

replies(1): >>44373536 #
26. atq2119 ◴[] No.44372499{5}[source]
Thank you for that. The excellent fractional scaling and multi-monitor support is why I finally switched back to KDE full time (after first switching away during the KDE 3 to 4 mess).
27. atq2119 ◴[] No.44372515{7}[source]
Not to mention that only a small fraction of the world uses inches...
replies(2): >>44377852 #>>44388215 #
28. kccqzy ◴[] No.44372537{4}[source]
> The bitter truth is that fractional scaling is impossible.

That's overly prescriptive in terms of what users want. In my experience users who are used to macOS don't mind slightly blurred text. And users who are traditionalists and perhaps Windows users prefer crisper text at the expense of some height mismatches. It's all very subjective.

replies(1): >>44375868 #
29. atq2119 ◴[] No.44372581{5}[source]
I'm certain that web style scaling is what the vast majority of desktop users actually want from fractional desktop scaling.

Thinking that two finger zooming style scaling is the goal is probably the result of misguided design-centric thinking instead of user-centric thinking.

replies(2): >>44374099 #>>44383479 #
30. bscphil ◴[] No.44372738{7}[source]
Given that the context here is talking about terminals, they probably are literally thinking in terms of bitmap based rendering with integer scaling.
replies(1): >>44372840 #
31. cycomanic ◴[] No.44372794{7}[source]
I disagree; I don't want a relative metric. You're saying scale factor neatly encapsulates viewing distance, eyesight, and preference, but relative to what? Scale is meaningless if I don't have a reference point. If I have two different-size monitors, you have now created a metric where a scale of 2x means something completely different on each. So to get things to look the same I either have to manually calculate DPI or use trial and error until it looks right. Same thing if I change monitors: I have to experiment until I get the desired scale, while with DPI I would not have to change a thing.

> It is easier to have human say how big/small they want things to be than have gazillion flags for individual attributes and then some complicated heuristics to deduce the scale.

I don't understand why I need a gazillion flags; I just set the desired DPI (instead of a scale). An absolute metric is almost always better than a relative one, especially if the reference point is device-dependent.

replies(1): >>44375273 #
32. cycomanic ◴[] No.44372836{7}[source]
I don't care about what we call the metric, I argue that a relative metric, where the reference point is device dependent is simply bad design.

I challenge you: tell a non-technical user to set two monitors (e.g. laptop and external) to display text/windows at the same size. I guarantee it will take them a significant amount of time moving those relative sliders around. If we had an absolute metric it would be trivial. Similarly, people who regularly plug into different monitors would simply set a desired DPI, and everywhere they plugged in things would look the same, instead of having to open the scale menu every time.

replies(2): >>44372864 #>>44388186 #
33. sho_hn ◴[] No.44372840{8}[source]
Right, but most users of terminal emulators typically don't use bitmap fonts anymore and haven't for quite some time (just adding this for general clarity, I'm sure you know it).
34. sho_hn ◴[] No.44372864{8}[source]
I see where you are coming from and it makes sense.

I will also say though that in the most common cases where people request mixed scale factor support from us (laptop vs. docked screen, screen vs. TV) there are also other form factor differences, such as viewing distance, that make folks not want to match DPI, and "I want things bigger/smaller there" is difficult to respond to with "calculate what that means to you in terms of DPI".

For the case "I have two 27" monitors side-by-side and only one of them is 4K and I want things to be the same size on them" I feel like the UI offering a "Match scale" action/suggestion and then still offering a single scale slider when it sees that scenario might be a nice approach.

replies(1): >>44373523 #
35. Dylan16807 ◴[] No.44373450{6}[source]
> We had a perfectly fitting metric in DPI, why can't I set the desired DPI for every monitor, but instead need to calculate some arbitrary scale factor?

Because certain ratios work a lot better than others, and calculating the exact DPI to get those benefits is a lot harder than estimating the scaling factor you want.

Also the scaling factor calculation is more reliable.

36. cycomanic ◴[] No.44373523{9}[source]
> I see where you are coming from and it makes sense.

I actually agree (even though I did not express that in my original post) that DPI is probably not a good "user visible" metric. However, I find the scaling factor relative to some arbitrary value inferior in every way. Maybe it comes from the fact that we did not have proper fractional scaling support earlier, but we are now in the nonsensical situation that the same laptop with the same display size (but a different resolution, e.g. one HiDPI and one normal) gets very different UI element sizes, simply because the default is to scale 100% for normal displays and 200% for HiDPI. Therefore the scale doesn't really mean anything and people just end up adjusting again and again; surely that's even more confusing for non-technical users.

> I will also say though that in the most common cases where people request mixed scale factor support from us (laptop vs. docked screen, screen vs. TV) there are also other form factor differences such as viewing distance that doesn't make folks want to match DPI, and "I want things bigger/smaller there" is difficult to respond to with "calculate what that means to you in terms of DPI".

From my anecdotal evidence, most (even all) people using a laptop for work have the laptop next to the monitor and actually adjust scaling so that elements are a similar size. Or, at the other extreme, they simply take the defaults and complain that one monitor makes all their text super small.

But even the people who want things bigger or smaller depending on circumstances, I would argue are better served if the scaling factor is relative to some absolute reference, not the size of the pixels on the particular monitor.

> For the case "I have two 27" monitors side-by-side and only one of them is 4K and I want things to be the same size on them" I feel like the UI offering a "Match scale" action/suggestion and then still offering a single scale slider when it sees that scenario might be a nice approach.

Considering that we now have proper fractional scaling, we should just make the scale relative to something like 96 DPI, and then have a slider to adjust. This would serve all use cases. We should not really let our designs be governed by choices we made because we could not do proper scaling previously.

replies(2): >>44377842 #>>44379712 #
37. cycomanic ◴[] No.44373536{7}[source]
As I replied to the other poster. I don't think DPI should necessarily be the exposed metric, but I do think that we should use something non device-dependent as our reference point, e.g. make 100% = 96 dpi.

I can guarantee that it is surprising to non-technical users (and a source of frustration for technical users) that the scale factor and UI element size can be completely different on two of the same laptops (with just a different display resolution, which is quite common). And it's also unpredictable which one will have the larger UI elements. Generally I believe UI should behave as predictably as possible.

38. chrismorgan ◴[] No.44373804{3}[source]
> The scale factor can be something quirky like your 1.785

Actually, you can’t have exactly 1.785: the scale is a fraction with denominator 120 <https://wayland.app/protocols/fractional-scale-v1#wp_fractio...>. So you’ll have to settle for 1.783̅ or 1.7916̅.
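The representable values around a requested factor follow directly from the 120 denominator in the linked protocol (a small sketch):

```python
from fractions import Fraction

DENOM = 120  # wp-fractional-scale-v1 sends the scale as numerator / 120

def neighbors(requested: float):
    """The representable scales just below and above a requested factor."""
    lo = int(requested * DENOM)
    return Fraction(lo, DENOM), Fraction(lo + 1, DENOM)

below, above = neighbors(1.785)   # 214/120 and 215/120
# float(below) = 1.7833..., float(above) = 1.7916...
```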

replies(1): >>44374331 #
39. roca ◴[] No.44373881{4}[source]
You are correct. I worked on this for years at Mozilla. See https://robert.ocallahan.org/2007/02/units-patch-landed_07.h... and https://robert.ocallahan.org/2014/11/relax-scaling-user-inte... for example. Some of the problems were pretty hard but the Web ended up in a pretty good place --- Web developers pretty much don't think about whether scaling factors are fractional or not, and things just work... well enough that some people don't even know the Web is "free-scaling UI"!
replies(1): >>44377923 #
40. chrismorgan ◴[] No.44373915{6}[source]
It landed four years ago, but had debilitating problems. Maybe a year ago when I last tried it, it was just as bad—no movement at all. But now, it seems largely fixed, hooray! Just toggled widget.wayland.fractional-scale.enabled and restarted, and although there are issues with windows not synchronising their scale (my screen is 1.5×; at startup, one of two windows stayed 2×; on new window, windows are briefly 2×; on factor change, sometimes chrome gets stuck at the next integer, probably the same issue), it’s all workaroundable and I can live with it.

Ahhhhhhhh… so nice.

41. rusk ◴[] No.44374099{6}[source]
> misguided design-centric thinking

More like “let the device driver figure it out” - Apple is after all a hardware company first.

replies(1): >>44378878 #
42. sho_hn ◴[] No.44374331{4}[source]
Aye, the "like" was doing a lot of heavy lifting in that sentence intentionally :).

But it's HN, so I appreciate someone linking the actual business!

43. frizlab ◴[] No.44374789[source]
Completely unrelated but man was Aqua beautiful
44. meindnoch ◴[] No.44375273{8}[source]
What you actually want is not DPI (or PPI, pixels per inch) but PPD (pixels per degree). But that depends on the viewing distance.
replies(1): >>44377693 #
45. trinix912 ◴[] No.44375798[source]
Out of curiosity, do you happen to know why Apple thought that would be the cause for low adoption among 3rd party apps? Isn't scaling something that the OS should handle, that should be completely transparent, something that 3rd party devs can forget exists at all? Was it just that their particular implementation required apps to handle things manually?
replies(1): >>44377447 #
46. jcelerier ◴[] No.44375868{5}[source]
> In my experience users who are used to macOS don't mind slightly blurred text.

It always makes me laugh when Apple users say "oh it's because of the great text rendering!"

The last time text rendering was any good on MacOS was on MacOS 9, since then it's been a blurry mess.

That said, googling for "MacOS blurry text" yields pages and pages of people complaining, so I am not sure it is that subjective; maybe some people simply don't know how good text can look, even on a large 1080p monitor.

replies(1): >>44377464 #
47. kccqzy ◴[] No.44377447[source]
I can only offer a hypothesis. Historically UI sizing was done in pixels, which means sizes were always integers. To support fractional scaling, developers can either update the app to do all calculations in floating point and store all intermediate results in floating point, which is hard, or do the calculations in floating point but round to integers eagerly, which results in inconsistent spacing and other layout bugs.

With 2x scaling there only needs to be points and pixels which are both integers. Developers' existing code dealing with pixels can usually be reinterpreted to mean points, with only small changes needed to convert to and from pixels.

With the 2x-and-scale-down approach the scaling is mostly done by the OS and using integer scaling makes this maximally transparent. The devs usually only need to supply higher resolution artwork for icons etc. This means developers only need to support 1x and 2x, not a continuum between 1.0 and 3.0.
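A tiny illustration of that hypothesis: with an integer factor the existing integer point math maps cleanly onto pixels, while a fractional factor forces a rounding decision somewhere.

```python
def points_to_pixels(points: int, scale: float):
    """Naive point-to-pixel conversion, as legacy integer-based code does it."""
    return points * scale

whole = points_to_pixels(13, 2)    # 26: integer in, integer out
frac = points_to_pixels(13, 1.5)   # 19.5: someone must now round, and two
                                   # neighbors rounding differently yields
                                   # the inconsistent spacing described above
```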

48. kccqzy ◴[] No.44377464{6}[source]
You can only search for complaints because those who enjoy it are the silent majority. You can however also search for pages and pages of discussions and tools to bring Mac style text rendering to Windows including the MacType tool. It is very much subjective.

"Great text rendering" is also highly subjective mind you. To me greatness means strong adherence to the type face's original shape. It doesn't mean crispness.

49. account42 ◴[] No.44377693{9}[source]
Not even that - my mom and I might sit the same distance from screens of the same size but she will want everything to be scaled larger than I do. Ultimately, it's a preference and not something that should strictly match some objective measurement.
50. account42 ◴[] No.44377842{10}[source]
The only place where this is a problem is the configuration UI. The display configuration could be changed to show a scale relative to the display size (so 100% on all displays means sizes match) while the protocol keeps talking to applications in scale relative to the pixel size (so programs don't need to care about DPI and instead just have one scale factor).
51. account42 ◴[] No.44377852{8}[source]
For display (diagonal) sizes inches have become the default unit everywhere I've been to.
52. account42 ◴[] No.44377923{5}[source]
It mostly works, but you can still run into issues when you e.g. want an element's size to match the border of another. Things like that used to work but don't anymore due to the tricks needed to make fractional scaling work well enough for other uses.
replies(1): >>44379756 #
53. account42 ◴[] No.44377973{5}[source]
User scale and device scale are combined into one scale factor as far as the layout / rendering engine is concerned and thus are solved in the same way.
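A minimal sketch of that claim (names are illustrative): by the time the layout engine runs, the two scales have already been multiplied into one factor, so any fix that handles one handles the other.

```python
# The layout/rendering engine only ever sees one combined scale factor.
def effective_scale(device_scale: float, user_zoom: float) -> float:
    return device_scale * user_zoom

print(effective_scale(2.0, 1.25))  # -> 2.5
```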
replies(1): >>44383482 #
54. atq2119 ◴[] No.44378878{7}[source]
In terms of how its business works, Apple is primarily a fashion company.

A deeply technical one, yes, but that's not what drives their decision making.

55. kccqzy ◴[] No.44379712{10}[source]
I find that explaining all of the above considerations to the user in a UI is hard. It's better to just let the user pick from several points on a slider for them to see for themselves.
56. kccqzy ◴[] No.44379756{6}[source]
Why wouldn't it work? The border-size accepts the same kind of length units as height or width, no?
replies(1): >>44394771 #
57. nextaccountic ◴[] No.44380686{3}[source]
What is the status of fractional pixels in GTK? Will GTK5 finally get what KDE/Qt has today?

I recall the issue is that GTK bakes in, deep down, the assumption that pixel scaling is done in integers, while in Qt scale factors are floats

58. astrange ◴[] No.44383479{6}[source]
At the time I was trying and failing to get Java apps to do it properly, so please send this feedback to whoever invented Swing in the 90s.
59. astrange ◴[] No.44383482{6}[source]
The difference is developers are a lot more likely to have tested one than the other. So it's what you call a binary compatibility issue.

Similarly browser developers care deeply if they break a website with the default settings, but they care less if cmd-+ breaks it because that's optional. If it became a mandatory accessibility feature somehow, now they have a problem.

60. Sophira ◴[] No.44384785{5}[source]
I don't run a compositor, and with Qt6, some programs like VirtualBox just don't respect Qt's scaling factor setting. Setting the font DPI instead results in weird bugs, like the display window getting smaller and smaller.

As it happens, VirtualBox does have its own scaling setting, but it's pretty bad, in my opinion. But I'm kind of forced to use it because Qt's own scaling just doesn't work in this case.

61. lproven ◴[] No.44388186{8}[source]
> tell a non-technical user to set two monitors (e.g. laptop and external) to display text/windows at the same size

Tell me, do you not ever use Macs?

This is not even a solved problem on macOS: there is no solution because the problem doesn't happen in the first place. The OS knows the size and the capabilities of the devices and you tell it with a slider what size of text you find comfortable. The end.

It works out the resolutions and the scaling factors. If the user needs to set that individually per device, or can even see it, then the UI has failed: it's exposing unnecessary implementation details to users who do not need to know and should not have to care.

_Every_ user of macOS can solve this challenge because the problem is never visible. It's a question of stupidly simple arithmetic that I could do with a pocket calculator in less than a minute, so it should just happen and never show up to the user.

62. lproven ◴[] No.44388215{8}[source]
This is true, but there are a few things which just happen to be measured in this obsolete and arbitrary unit around most of the world, and pizzas and computer screens are two of the ones that can be mentioned in polite society. :-)

I speak very bad Norwegian. I use metric for everything. But once I ordered a pizza late at night in Bergen after a few beers, and they asked me how big I wanted in centimetres and it broke my decision-making process badly. I can handle Norwegian numbers and I can handle cm but not pizzas in cm.

I ended up with a vast pizza that was a ridiculous size for one, but what the hell, I was very hungry. I just left the crusts.

63. account42 ◴[] No.44394771{7}[source]
The problem is the rounding from fractional sizes due to fractional scaling to whole pixel sizes needed to keep things looking crisp. Browsers try really hard to make sure that during this process all borders of an element remain the same size, but this also means that they end up introducing inconsistencies with other measurements.
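The effect described above can be demonstrated with a toy snap-to-grid calculation (a sketch of the general rounding problem, not any browser's actual algorithm): at a fractional scale, two identical logical lengths can round to different pixel counts depending on where they land on the pixel grid.

```python
# Snap both edges of an element to the device pixel grid, as a renderer
# keeping edges crisp might, and measure the resulting pixel length.
def device_px(logical_start: float, logical_len: float, scale: float) -> int:
    return round((logical_start + logical_len) * scale) - round(logical_start * scale)

scale = 1.25
# The same 1-unit border at two different positions:
print(device_px(0.0, 1.0, scale))  # 1 device pixel
print(device_px(0.3, 1.0, scale))  # 2 device pixels, same logical size
```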