
1680 points | etbusch | 10 comments
petilon ◴[] No.31435505[source]
Still no retina display option. Steve Jobs made the right call over a decade ago... the only scaling that looks good above 100% is 200%. Any in-between scaling will have display artifacts.

This laptop has 150% scaling. What sort of display artifacts can you expect because of this? Go to a web page with a grid of 1-pixel horizontal grid lines. Even though every line is set to 1 pixel, some lines will appear thicker than others.
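
As a rough sketch of where that unevenness comes from (a toy rounding model, not any real browser's rasterizer):

    # Toy model: where 1-CSS-pixel grid lines land on a 150%-scaled display.
    # Each line should be 1.5 device pixels tall, but edges get snapped to
    # whole pixels, so the drawn thickness ends up alternating between 1 and 2.
    import math

    scale = 1.5                                   # 150% scaling

    def snap(x: float) -> int:
        return math.floor(x + 0.5)                # simple round-half-up snapping

    for css_y in range(6):                        # six 1px grid lines, one per CSS row
        top = css_y * scale                       # ideal top edge in device pixels
        bottom = (css_y + 1) * scale              # ideal bottom edge (1 CSS px thick)
        print(f"line {css_y}: {snap(bottom) - snap(top)} device pixel(s) thick")
    # prints 2, 1, 2, 1, 2, 1 -- some lines end up visibly thicker than others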

I blame Microsoft for this mess. Windows supports in-between resolutions (with display artifacts), and hardware manufacturers therefore manufacture in-between resolutions. Framework laptop is limited to what the display manufacturers put out.

replies(9): >>31435534 #>>31435544 #>>31435704 #>>31435840 #>>31435937 #>>31436188 #>>31436195 #>>31436260 #>>31436741 #
1. pkulak ◴[] No.31435704[source]
It's possible for an OS to support fractional scaling properly; just tell applications to render their windows 1.5 times larger, map the inputs properly, and turn off font anti-aliasing. The problem is that it requires every app to be updated, which hasn't happened everywhere yet. Android and iOS, for example, do it perfectly. So does ChromeOS.
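
A minimal sketch of what that means in practice (a made-up API, not any real toolkit's): the OS hands every app a scale factor, the app lays out in logical units and rasterizes at that scale, and the compositor maps input events back the other way.

    # Hypothetical fractional-scaling plumbing, reduced to the two conversions.
    SCALE = 1.5                                    # 150% scale reported by the OS

    def layout_to_device(logical_px: float) -> int:
        """App side: logical layout units -> whole device pixels."""
        return round(logical_px * SCALE)

    def input_to_logical(device_px: int) -> float:
        """Compositor side: pointer event in device pixels -> app's logical space."""
        return device_px / SCALE

    # A 200x100 logical window is rasterized as a 300x150 device-pixel surface,
    print(layout_to_device(200), layout_to_device(100))   # 300 150
    # and a click at device x=225 maps back to logical x=150 inside it.
    print(input_to_logical(225))                           # 150.0
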
replies(3): >>31435915 #>>31435961 #>>31437618 #
2. arinlen ◴[] No.31435915[source]
> (...) just tell applications to render their windows 1.5 times larger, map the inputs properly, and turn off font anti-aliasing.

Doesn't disabling anti-aliasing make things look worse? Unintentional and random jagged lines never look right.

replies(2): >>31436278 #>>31438473 #
3. favadi ◴[] No.31435961[source]
Sadly, Linux doesn't support fractional scaling properly. This is a show stopper for me.
replies(1): >>31438065 #
4. adgjlsfhk1 ◴[] No.31436278[source]
anti-aliasing matters a lot less when you have a high resolution display.
replies(1): >>31436594 #
5. arinlen ◴[] No.31436594{3}[source]
> anti-aliasing matters a lot less when you have a high resolution display.

The original claim was that turning off anti-aliasing would make things look better, not merely that it wouldn't look that bad.

Even on high-resolution displays, isn't it true that anti-aliasing makes things look better?

replies(2): >>31438530 #>>31440110 #
6. tadfisher ◴[] No.31437618[source]
Even Android maps "1dp" to a non-integer number of pixels on most displays.

It looks "perfect" because of a combination of anti-aliasing and high density. But zoom in on a repeating pattern of 1dp lines, and you will see that some are aliased and some are not if your display's density is not an integer multiple of 160dpi (mdpi).

But Android can do this everywhere because everything draws to a Skia canvas under the hood (well, HWComposer/SurfaceFlinger, but basically Skia). Desktop operating systems don't have the same luxury. MacOS and Gnome render at 2x and downscale the entire frame, which produces decent results on high-density displays but looks blurry otherwise. I have no idea what Windows does, but it sounds like it's a mess.
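
Rough numbers for the 2x-then-downscale approach (the panel resolution here is just an example, roughly the Framework's 2256x1504):

    # "Render at 2x, downscale to the panel": the non-integer downscale factor
    # is where the blur on lower-density panels comes from.
    panel_w, panel_h = 2256, 1504                  # example panel (assumption)
    ui_scale = 1.5                                 # desired "looks like 150%" scale

    logical_w, logical_h = panel_w / ui_scale, panel_h / ui_scale    # 1504 x ~1003
    render_w, render_h = 2 * logical_w, 2 * logical_h                # 2x backing store
    downscale = render_w / panel_w                                   # ~1.333 per axis

    print(f"logical:  {logical_w:.0f} x {logical_h:.0f}")
    print(f"rendered: {render_w:.0f} x {render_h:.0f}, downscaled by {downscale:.3f}x")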

7. imilk ◴[] No.31438065[source]
Fractional scaling works pretty seamlessly for me using Pop OS on a Framework.
8. OctopusLupid ◴[] No.31438473[source]
Is it possible GP was talking about sub-pixel font anti-aliasing, which would look wrong when scaled?
9. adgjlsfhk1 ◴[] No.31438530{4}[source]
Anti-aliasing at the wrong resolution looks worse than not anti-aliasing at all. As such, if you tell your applications to render things larger than 1x scaling, anti-aliasing starts to hurt more than it helps.
10. throwaway92394 ◴[] No.31440110{4}[source]
Yes and no. Speaking generally about anti-aliasing: the method used to do it varies a lot in its trade-offs.

Generally anti-aliasing is a trade-off between pixelation, blurriness, and performance. The better the anti-aliasing and the higher the pixel count, the slower the performance - this can be an issue in some GUI applications, like some IDEs at high DPIs. Faster anti-aliasing methods will look worse.

In an ideal world a high enough pixel density would mean the apparent pixelation is so low that anti-aliasing isn't necessary. Generally anti-aliasing means more blur - although the amount of blur might not be an issue for you, it depends. The higher the DPI, the fewer pixels need to be "guessed", which gives you better precision - especially useful for vector graphics like text, which have theoretically infinite precision.

It really depends on how you define "better". For text specifically, I think most people prefer sharpness. Combined with the much higher DPI displays we have nowadays, I think we're at the point where for many people, including myself, text looks better without anti-aliasing. Personally I think it's easier to read.

tl;dr - it depends on how you define "better". At very high DPIs I think we're at a point where many people prefer the sharpness from the lack of AA over the artifacts, which are now relatively tiny thanks to the high DPI. In some applications, like IntelliJ, I've also had performance issues with AA at high DPIs.