
518 points LorenDB | 8 comments
trollbridge ◴[] No.46173936[source]
Not to disrespect this, but it used to be entirely normal to have a GUI environment on a machine with 2MB of RAM and a 40MB disk.

Or 128K of RAM and a 400KB disk, for that matter.

maccard ◴[] No.46174032[source]
A single 1920x1080 framebuffer (which is a low-resolution monitor in 2025, IMO) is about 2MB even at 8 bits per pixel, and around 8MB at typical 32-bit colour. Add any compositing into the mix for multi-window displays and it literally doesn't fit in memory.
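The memory figures in that comment are easy to check; a quick sketch (the bit-depth figures are mine, not from the thread):

```python
# Rough framebuffer memory math for a 1920x1080 display
# (illustrative, not tied to any particular GPU or OS).

def framebuffer_bytes(width, height, bits_per_pixel):
    """Size in bytes of a single uncompressed framebuffer."""
    return width * height * bits_per_pixel // 8

# 8 bpp (256-colour, palette-based): just under 2 MB
mb_8bpp = framebuffer_bytes(1920, 1080, 8) / 2**20
# 32 bpp (typical modern RGBA): roughly 8 MB
mb_32bpp = framebuffer_bytes(1920, 1080, 32) / 2**20

print(f"{mb_8bpp:.2f} MB at 8 bpp, {mb_32bpp:.2f} MB at 32 bpp")
```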
beagle3 ◴[] No.46176381[source]
The Amiga 500 had high-res graphics (or high-colour graphics, but not on the same scanline), multitasking, and roughly 14-bit sound (with a lot of work: the hardware had 4 channels of 8-bit DACs with 6-bit volume control, so two channels could be paired per output).

In 1985, and with 512K of RAM. It was very usable for work.
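The channel-pairing trick mentioned above can be sketched as follows; this is a simplified model of the technique (the function names are mine), in which one channel plays the high byte at volume 64 and a second plays the low 6 bits at volume 1:

```python
# Sketch of the Amiga "14-bit" Paula trick: pair two 8-bit channels,
# one at volume 64 (high byte) and one at volume 1 (low 6 bits).
# Simplified model; names are illustrative, not hardware registers.

def split_14bit(sample):
    """Split a 14-bit sample (0..16383) into the two channel values."""
    assert 0 <= sample < 1 << 14
    high = sample >> 6        # played at volume 64
    low = sample & 0x3F       # played at volume 1
    return high, low

def combine(high, low):
    """What the paired channels sum to at the analog output."""
    return high * 64 + low

high, low = split_14bit(12345)
assert combine(high, low) == 12345
```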

1. mrits ◴[] No.46176430[source]
A 320x200 6-bit color depth wasn't exactly a pleasure to use. I think games could double the res in a certain mode (was it called 13h?)
2. krige ◴[] No.46176576[source]
For OCS/ECS hardware, 2-bit HiRes (640x256 or 640x200, depending on region) was the default resolution for the OS, and you could add interlacing or raise the color depth to 3 or 4 bits at the cost of response lag; starting with OS 2.0 the resolution setting was basically limited only by chip memory and what your output device could actually display. I got my 1200 to display a crisp 1440x550 on my LCD just by sliding the screen parameters to max in the default display driver.
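The "limited by chip memory" point follows from the planar display model: the Amiga stores one bitplane per bit of colour depth. A sketch of the arithmetic (figures are mine, for illustration):

```python
# Chip-RAM cost of an Amiga planar screen: one bitplane per bit of
# colour depth, each width * height / 8 bytes.

def screen_bytes(width, height, bitplanes):
    """Chip RAM consumed by the bitplanes of one screen."""
    return (width * height // 8) * bitplanes

# Default 640x256 HiRes Workbench at 2-bit depth: 40,960 bytes
wb2 = screen_bytes(640, 256, 2)
# The same screen at 4-bit depth doubles the RAM (and DMA) cost:
wb4 = screen_bytes(640, 256, 4)
print(wb2, wb4)
```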

Games used either 320- or 640-wide resolutions, at 4- or 5-bit depth, or the fake 6-bit mode known as Extra Half-Brite: basically 5-bit (32 colors), with the other 32 colors being the same palette at half brightness. The fabled 12-bit HAM mode was also used, even in some games, even for interactive content, but not too often.
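The Half-Brite scheme described above is simple to model: the hardware derives the upper 32 screen colours by halving each 4-bit-per-gun palette entry. A sketch (function name and sample palette are mine):

```python
# Extra Half-Brite sketch: 64 on-screen colours from a 32-entry
# palette; entries 32..63 are entries 0..31 at half brightness.
# OCS palette entries are 4 bits per gun (values 0..15).

def ehb_palette(base32):
    """Expand a 32-colour palette to the 64 Half-Brite screen colours."""
    assert len(base32) == 32
    half = [(r >> 1, g >> 1, b >> 1) for (r, g, b) in base32]
    return base32 + half

pal = ehb_palette([(i % 16, (i * 3) % 16, (i * 5) % 16) for i in range(32)])
assert len(pal) == 64
```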

3. teamonkey ◴[] No.46177038[source]
You might be thinking of DOS mode 13h, which was VGA 320x200, 8 bits per pixel.
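Mode 13h's appeal was its flat layout: one byte per pixel in a single 64,000-byte buffer, so a pixel lives at offset y*320 + x. A sketch of the addressing (simulated here with a bytearray, not real VGA memory at A000:0000):

```python
# Mode 13h addressing sketch: linear 320x200, 8 bpp framebuffer.
# Each byte is a palette index; the buffer fits in one 64KB segment.

WIDTH, HEIGHT = 320, 200
vram = bytearray(WIDTH * HEIGHT)   # 64,000 bytes

def putpixel(x, y, colour):
    """Plot one pixel; colour is a palette index, 0..255."""
    vram[y * WIDTH + x] = colour

putpixel(10, 5, 15)
assert vram[5 * 320 + 10] == 15
```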
4. bananaboy ◴[] No.46177480[source]
And 6 bits per colour component.
5. globalnode ◴[] No.46177603[source]
I remember playing with mode 13h, writing little graphics programs with my Turbo C compiler. Computers were so magical back then.
7. oso2k ◴[] No.46182771{3}[source]
The VGA color palette was 18-bit (256K colors), but input into the palette registers was 8 bits per channel. (63,63,63) is visibly different from (255,255,255).

http://qzx.com/pc-gpe/tut2.txt

http://qzx.com/pc-gpe/

8. bananaboy ◴[] No.46190798{4}[source]
Sorry, I'm not exactly sure what you're saying. I know very well how it works, as I still write a lot of demos and games for mode 13h (see https://www.pouet.net/groups.php?which=1217&order=release) and can program the VGA DAC palette in my sleep. Were you referring to the fact that you write 8 bits to the palette registers? That's true, you do, but only 6 bits are actually used, so it effectively wraps around at 64. There are 6 bits per colour component, which as you pointed out is 18-bit colour depth.

Btw, I was a teenager when those Denthor trainers came out, and I read them all. I loved them! They taught me a lot!
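The wrap-at-64 behaviour described in this exchange can be sketched directly: the DAC keeps only the low 6 bits of each byte written to it, which is why 255 and 63 land on the same intensity, and why converting true 8-bit colour down to the DAC is a right-shift by 2. (Function names here are mine.)

```python
# Sketch of the 6-bit VGA DAC behaviour: only the low 6 bits of an
# 8-bit write are latched, so values wrap at 64.

def dac_value(byte):
    """What the 6-bit DAC actually keeps from an 8-bit write."""
    return byte & 0x3F

assert dac_value(255) == dac_value(63) == 63

def rgb8_to_dac(r, g, b):
    """Scale true 8-bit-per-channel colour down to the DAC's 0..63 range."""
    return (r >> 2, g >> 2, b >> 2)

assert rgb8_to_dac(255, 128, 0) == (63, 32, 0)
```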