I think any type of pixel font authentic to a couple decades ago won't look good on a 4K monitor, unfortunately. It got to the point where I ordered a 1024x768 monitor just to play old games with a period system.
If we're talking about the subjective experience of recreating "a child's bedroom computer" from the mid-'90s to early '00s, a widescreen aspect ratio alone would be jarring, since my conception of a monitor for such a system is a 4:3 CRT. So for me, little else would reach that level except a system with the same aspect ratio and a similar DPI.
Not only that, but UI design itself has undergone many shifts since that era to account for the types of monitors those UIs are being designed for. There's not as much of a need for pixel-perfect design now that vector-based web UIs dominate the desktop application space, relegating the older UI paradigms to enthusiasts who still remember those earlier times. Or maybe to people who develop for fantasy consoles.
I should mention while I'm at it that the sort of faux-pixel-art shaders used in some games come off as quite jarring to me, since I expect UIs to be meticulously laid out for the original screen size, not arbitrary content blown up 2x or 4x on a huge widescreen monitor. I sometimes feel those are meant to evoke a nostalgic feeling of some kind, being pixelated and all, but really they just make me wish there were some alternate reality in which people still designed games and desktop applications for 800x600 or 1024x768 monitors.
It's interesting how at present there's stuff like 4K on one end and the Playdate with its tiny handheld resolution on the other, but relatively little interest in new content for the resolutions in between.
You could probably create a CRT-filter-based font for high resolution screens (though you'd probably still need to optimise for subpixel layout for accuracy, even on 4K monitors).
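To illustrate the idea (not any particular project's implementation), here's a minimal sketch in plain Python: it takes a 1-bit glyph bitmap, upscales it 2x, and dims every second output row to imitate CRT scanlines. The function name and brightness values are my own invented placeholders; a real filter would also model phosphor bloom and subpixel geometry.

```python
# Hypothetical sketch: 2x upscale of a 1-bit glyph with dimmed "scanline"
# rows to mimic a CRT look. Values 255/96 are arbitrary illustration choices.

def crt_scanline_upscale(glyph, on=255, scanline=96):
    """glyph: list of rows of 0/1 pixels. Returns a 2x-scaled grayscale
    bitmap where the lower half of each doubled scanline is dimmed."""
    out = []
    for row in glyph:
        # Duplicate each pixel horizontally at full brightness...
        bright = [on if px else 0 for px in row for _ in (0, 1)]
        # ...and again at reduced brightness for the scanline effect.
        dim = [scanline if px else 0 for px in row for _ in (0, 1)]
        out.append(bright)
        out.append(dim)
    return out

# A 3x3 "T" glyph as a toy example
glyph = [
    [1, 1, 1],
    [0, 1, 0],
    [0, 1, 0],
]
scaled = crt_scanline_upscale(glyph)  # 6x6 grayscale bitmap
```

A real high-DPI version would map each source pixel to many more output pixels and shade them per subpixel, but the structure is the same.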
Yes, very early on, when people used TVs or cheap composite monitors as the display devices for their computers, there were blurry pixel edges, bloom effects, dot crawl, color artifacting, and all the rest.
But by the '90s, we had high-quality monitors designed for high-resolution graphics with fast refresh rates, with crisp pixel boundaries and minimal artifacting. CRT filters overcompensate for this a lot, and end up making SVGA-era graphics anachronistically look like they're being displayed on composite monitors.
Is that what this project is going for? I understood it to be attempting to apply design elements from that era to create a superior UI for a modern "child's bedroom computer".
People were typically using 640x480 or 800x600 in GUI environments, and most DOS games were at 320x200. 1600x1200 was incredibly uncommon, even where the video hardware and monitors supported it -- people were usually using 14" or 15" 4:3 displays, and that resolution was way too high to be usable on displays that size, and the necessarily lower refresh rates made flicker unbearable at higher resolutions.
At the common resolutions and with purpose-built CRT monitors, pixel boundaries were quite clear and distinguishable.
Being able to clearly resolve individual pixels (which I agree was possible at resolutions like 640x480 or 800x600, though 1024x768 is already pushing it) is not the same as seeing "crisp" boundaries between them; the latter is what I was objecting to. 320x200 (and sometimes 320x240 or the like) is a special case, since it was pixel-doubled on more modern VGA/SVGA display hardware, so that's the one case where a single pixel genuinely appeared as a small square with rather crisp boundaries, as opposed to a blurry dot.
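The doubling described above amounts to nearest-neighbour replication: each logical pixel becomes a 2x2 block of identical output pixels, which is why the boundaries stay square and crisp. A toy sketch (my own illustrative code, with a tiny stand-in buffer rather than a real 320x200 framebuffer):

```python
# Illustrative nearest-neighbour 2x upscale: 320x200 -> 640x400 in principle,
# demonstrated here on a tiny 2x2 stand-in buffer. Each pixel value is
# duplicated horizontally, then each output row is duplicated vertically.

def double_pixels(fb):
    """fb: framebuffer as a list of rows of pixel values.
    Returns a buffer twice as wide and twice as tall."""
    doubled = []
    for row in fb:
        wide = [px for px in row for _ in (0, 1)]  # duplicate columns
        doubled.append(wide)
        doubled.append(list(wide))                 # duplicate the row
    return doubled

fb = [[0, 1],
      [2, 3]]
out = double_pixels(fb)
# Each source pixel now occupies a crisp 2x2 block in the output.
```

Since every output pixel in a block has exactly the same value, there's no interpolation and hence no blur, matching the "small square" appearance described.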