
567 points by elvis70 | 1 comment
doright ◴[] No.43525376[source]
I like themes like this. The only thing that hampers the authenticity for me, and this isn't really the author's fault, is the super-high-resolution fonts compared to what was available back then. There's just something charming about low-resolution fonts that are readable enough on screen; probably nostalgia.

I think any type of pixel font authentic to a couple decades ago won't look good on a 4K monitor, unfortunately. It got to the point where I ordered a 1024x768 monitor just to play old games with a period system.

replies(6): >>43525509 #>>43525691 #>>43525925 #>>43528393 #>>43528861 #>>43531812 #
jeroenhd ◴[] No.43528393[source]
Pixel fonts don't accurately represent '90s UIs because we don't use CRTs anymore. The poor souls buying the very first terrible flat-screen monitors may have used computers like that, but most of that era was experienced on smudgy, edge-blurring CRTs.

You could probably create a CRT-filter-based font for high-resolution screens (though you'd probably still need to optimise for subpixel layout for accuracy, even on 4K monitors).

replies(1): >>43534245 #
Gormo ◴[] No.43534245[source]
Most people vastly overstate the effect that CRT displays had on the appearance of legacy software.

Yes, very early on, when people used TVs or cheap composite monitors as the display devices for their computers, there were blurry pixel edges, bloom effects, dot crawl, color artifacting, and all the rest.

But by the '90s, we had high-quality monitors designed for high-resolution graphics with fast refresh rates, with crisp pixel boundaries and minimal artifacting. CRT filters overcompensate for this a lot, and end up making SVGA-era graphics anachronistically look like they're being displayed on composite monitors.

replies(1): >>43536237 #
zozbot234 ◴[] No.43536237[source]
CRT monitors did not have "crisp pixel boundaries". A CRT pixel is a Gaussian-blurred dot, not a "crisp" square as it is on modern displays. What "high-quality" CRT monitors did have was higher resolutions, even as high as 1600x1200, where individual pixels are basically not distinguishable.
replies(1): >>43546237 #
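zozbot234's point, that a CRT "pixel" is a Gaussian-blurred spot rather than a hard-edged square, can be sketched numerically. A minimal illustration in plain Python: the cell size and the beam-spread `sigma` are arbitrary illustrative assumptions, not measured phosphor values.

```python
import math

def crt_pixel(size=32, sigma=6.0):
    """Model one CRT 'pixel' as a 2D Gaussian spot sampled on a fine sub-grid.

    sigma is an illustrative guess at the electron-beam spread; a crisp
    flat-panel pixel would instead be a uniform square of intensity 1.0.
    """
    c = (size - 1) / 2.0  # beam centre of the cell
    return [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
             for x in range(size)]
            for y in range(size)]

spot = crt_pixel()
centre = spot[16][16]  # near the beam centre: close to full intensity
corner = spot[0][0]    # corner of the cell: intensity has almost died off
print(round(centre, 3), round(corner, 5))
```

Raising `sigma` smears neighbouring spots into each other (the smudgy, edge-blurring look described upthread); shrinking it toward zero approaches the crisp square of a modern panel.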
Gormo ◴[] No.43546237{3}[source]
By the early '90s, high-quality CRT displays had low dot pitches or very precise aperture grilles in addition to supporting a wider range of refresh rates, and better clarity of display was a major selling point.

People were typically using 640x480 or 800x600 in GUI environments, and most DOS games ran at 320x200. 1600x1200 was incredibly uncommon, even where the video hardware and monitors supported it -- people were usually on 14" or 15" 4:3 displays, where that resolution was far too high to be usable, and the lower refresh rates it required made flicker unbearable.

At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.

replies(1): >>43546917 #
zozbot234 ◴[] No.43546917{4}[source]
> At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.

Being able to clearly resolve individual pixels (which I agree was possible at resolutions like 640x480 or 800x600, though 1024x768 is already pushing it) is not the same as seeing "crisp" boundaries between them. The latter is what I was objecting to. 320x200 (sometimes also 320x240 or the like) is a special case, since it was pixel-doubled on more modern VGA/SVGA display hardware; that's the one case where a single pixel was genuinely seen as a small square with rather crisp boundaries, as opposed to a blurry dot.
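The pixel-doubling mentioned above amounts to nearest-neighbour 2x replication: each source pixel is scanned out as a block of identical samples, which is why a 320x200 mode reads as little squares even on an otherwise soft display. A toy sketch of that replication (the image here is a made-up 2x2 example, not real frame data):

```python
def pixel_double(image):
    """Nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block.

    Duplicates every sample horizontally, then repeats each scanline,
    mimicking how a low-resolution mode is scanned out on doubled hardware.
    """
    out = []
    for row in image:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate each pixel
        out.append(doubled)
        out.append(list(doubled))  # repeat the whole scanline
    return out

src = [[1, 2],
       [3, 4]]
print(pixel_double(src))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Each source value survives as a uniform 2x2 block with hard edges, in contrast to the Gaussian falloff a single-scanned CRT dot would show.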