
518 points LorenDB | 41 comments
trollbridge ◴[] No.46173936[source]
Not to disrespect this, but it used to be entirely normal to have a GUI environment on a machine with 2MB of RAM and a 40MB disk.

Or 128K of ram and 400 kb disk for that matter.

replies(10): >>46173975 #>>46174032 #>>46174138 #>>46174272 #>>46174291 #>>46174522 #>>46174810 #>>46174831 #>>46179105 #>>46179554 #
1. maccard ◴[] No.46174032[source]
A single 1920x1080 framebuffer (which is a low resolution monitor in 2025 IMO) is 2MB. Add any compositing into the mix for multi window displays and it literally doesn’t fit in memory.
replies(8): >>46174159 #>>46174187 #>>46174618 #>>46174766 #>>46176381 #>>46178650 #>>46179683 #>>46182290 #
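The arithmetic in this subthread is easy to check with a short sketch (note the 2 MB figure corresponds to 8 bits per pixel; a later reply points out that 24-bit color roughly triples it):

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed for one uncompressed framebuffer."""
    return width * height * bits_per_pixel // 8

# 1920x1080 at 8 bits per pixel (256 colors) is the ~2 MB figure:
print(framebuffer_bytes(1920, 1080, 8))   # 2073600 bytes, ~2.0 MB
# At 24-bit true color it grows to ~6.2 MB per buffer:
print(framebuffer_bytes(1920, 1080, 24))  # 6220800 bytes
# Classic 16-color VGA, 4 bits per pixel:
print(framebuffer_bytes(640, 480, 4))     # 153600 bytes, 150 KiB
```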
2. echoangle ◴[] No.46174159[source]
Do you really need the framebuffer in RAM? Wouldn't that be entirely in the GPU RAM?
replies(6): >>46174217 #>>46174228 #>>46174232 #>>46174790 #>>46174992 #>>46175002 #
3. znpy ◴[] No.46174217[source]
Aren’t you cheating by having additional ram dedicated for gpu use exclusively? :)
4. sigwinch ◴[] No.46174228[source]
VGA standard supports up to 256k
5. jerrythegerbil ◴[] No.46174232[source]
To put it in GPU RAM, you need GPU drivers.

For example, NVIDIA GPU drivers are typically around 800M-1.5G.

That math actually goes wildly in the opposite direction for an optimization argument.

replies(3): >>46174310 #>>46174452 #>>46175997 #
6. Rohansi ◴[] No.46174310{3}[source]
> NVIDIA GPU drivers are typically around 800M-1.5G.

They also pack in a lot of game-specific optimizations for whatever reason. Could likely be a lot smaller without those.

replies(1): >>46174400 #
7. monocasa ◴[] No.46174400{4}[source]
Even the open source drivers without those hacks are massive. Each type of card has its own almost 100MB of firmware that runs on the card on Nvidia.
replies(1): >>46174904 #
8. jsheard ◴[] No.46174452{3}[source]
Doesn't the UEFI firmware map a GPU framebuffer into the main address space "for free" so you can easily poke raw pixels over the bus? Then again the UEFI FB is only single-buffered, so if you rely on that in lieu of full-fat GPU drivers then you'd probably want to layer some CPU framebuffers on top anyway.
replies(2): >>46174701 #>>46174783 #
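The "layer some CPU framebuffers on top" idea is just software double buffering — compose a frame off-screen, then blit it in one pass so the single-buffered firmware framebuffer never shows a half-drawn frame. A minimal sketch in plain Python (bytearrays standing in for the memory-mapped buffer; all names made up):

```python
WIDTH, HEIGHT, BPP = 640, 480, 4  # 4 bytes per pixel, XRGB layout

front = bytearray(WIDTH * HEIGHT * BPP)  # stands in for the mapped firmware buffer
back = bytearray(WIDTH * HEIGHT * BPP)   # CPU-side compositing surface

def put_pixel(buf, x, y, rgb):
    """Write a 24-bit color into a 32-bit-per-pixel buffer."""
    off = (y * WIDTH + x) * BPP
    buf[off:off + 3] = rgb.to_bytes(3, "little")

# Compose into the back buffer at leisure...
put_pixel(back, 10, 10, 0xFF0000)
# ...then present the whole finished frame at once.
front[:] = back
```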
9. snek_case ◴[] No.46174618[source]
I had a 386 PC with 4MB of RAM when I was a kid, and it ran Windows 3.1 with a GUI, but that also had a VGA display at 640x480, and only 16-bit color (4 bits per pixel). So 153,600 bytes for the frame buffer.
replies(2): >>46174630 #>>46178683 #
10. Dwedit ◴[] No.46174630[source]
640 * 480 / 2 = 150KB for a classic 16-color VGA screen.
11. the8472 ◴[] No.46174701{4}[source]
well, if you poke framebuffer pixels directly you might as well do scanline racing.
replies(1): >>46174815 #
12. bobmcnamara ◴[] No.46174766[source]
It's so much fun working with systems with more pixels than RAM though. Manually interleaving interrupts. What joy.
replies(1): >>46181605 #
13. throwaway173738 ◴[] No.46174783{4}[source]
Yes if you have UEFI.
14. ◴[] No.46174790[source]
15. jsheard ◴[] No.46174815{5}[source]
Alas, I don't think UEFI exposes vblank/hblank interrupts so you'd just have to YOLO the timing.
16. jsheard ◴[] No.46174904{5}[source]
That's 100MB of RISC-V code, believe it or not, despite Nvidia's ARM fixation.
17. maccard ◴[] No.46174992[source]
You’re assuming a discrete GPU with separate VRAM, and only supporting hardware accelerated rendering. If you have that you almost certainly have more than 2MB of ram
18. ErroneousBosh ◴[] No.46175002[source]
Computers didn't used to have GPUs back then when 150kB was a significant amount of graphics memory.
replies(1): >>46177337 #
19. hinkley ◴[] No.46175997{3}[source]
Someone last winter was asking for help with large docker images and it came about that it was for AI pipelines. The vast majority of the image was Nvidia binaries. That was wild. Horrifying, really. WTF is going on over there?
20. beagle3 ◴[] No.46176381[source]
The Amiga 500 had high res graphics (or high color graphics … but not on the same scanline), multitasking, 15 bit sound (with a lot of work - the hardware had 4 channels of 8 bit DACs but a 6-bit volume, so …)

In 1985, and with 512K of RAM. It was very usable for work.

replies(1): >>46176430 #
21. mrits ◴[] No.46176430[source]
A 320x200 6-bit color depth wasn't exactly a pleasure to use. I think the games could double the res in a certain mode (was it called 13h?)
replies(2): >>46176576 #>>46177038 #
22. krige ◴[] No.46176576{3}[source]
For OCS/ECS hardware 2bit HiRes - 640x256 or 640x200 depending on region - was default resolution for OS, and you could add interlacing or up color depth to 3 and 4 bit at cost of response lag; starting with OS2.0 the resolution setting was basically limited by chip memory and what your output device could actually display. I got my 1200 to display crisp 1440x550 on my LCD by just sliding screen parameters to max on default display driver.

Games used either 320h or 640h resolutions, 4 bit or fake 5 bit known as HalfBrite, because it was basically 4 bit with the other 16 colors being same but half brightness. The fabled 12-bit HAM mode was also used, even in some games, even for interactive content, but it wasn't too often.

23. teamonkey ◴[] No.46177038{3}[source]
You might be thinking of DOS mode 13h, which was VGA 320x200, 8 bits per pixel.
replies(3): >>46177480 #>>46177603 #>>46178666 #
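Part of why mode 13h was so popular for hobbyist graphics: at 320x200 with one byte per pixel, the entire screen fits in a single 64 KiB real-mode segment (the VGA window at A000:0000), so no bank switching is needed:

```python
# Mode 13h framebuffer: 320x200 at 8 bits per pixel.
size = 320 * 200 * 1      # 1 byte per pixel
print(size)               # 64000 bytes
print(size <= 64 * 1024)  # True -- fits under the 65536-byte segment limit
```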
24. trollbridge ◴[] No.46177337{3}[source]
The IBM PGC (1984) was a discrete GPU with 320kB of RAM and slightly over 64kB of ROM.

The EGA (1984) and VGA (1987) could conceivably be considered a GPU although not Turing complete. EGA had 64, 128, 192, or 256K and VGA 256K.

The 8514/A (1987) was Turing complete although it had 512kB. The Image Adapter/A (1989) was far more powerful, pretty much the first modern GPU as we know them and came with 1MB expandable to 3MB.

replies(3): >>46180590 #>>46181872 #>>46191382 #
25. bananaboy ◴[] No.46177480{4}[source]
And 6-bits per colour component.
replies(1): >>46182771 #
26. globalnode ◴[] No.46177603{4}[source]
i remember playing with mode 13h, writing little graphics programs with my turboc compiler. computers were so magical back then.
27. perching_aix ◴[] No.46178650[source]
More like 6.2+ MB, or at least I'd sure hope that a FHD resolution is paired with at least a 24 bit (8 bpc) SDR color. And then there's the triple buffered vsync at play, so it's really more like 18.6+ MB.
replies(1): >>46179560 #
28. ◴[] No.46178666{4}[source]
29. perching_aix ◴[] No.46178683[source]
> and only 16-bit color (4 bits per pixel).

The "high color" (16 bit) mode was 5:6:5 bits per channel, so 16 bits per pixel.

> So 153,600 bytes for the frame buffer.

And so you're looking at 614.4 KB (600 KiB) instead.

replies(1): >>46181007 #
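The 5:6:5 layout mentioned above packs a pixel into one 16-bit word, with green getting the extra bit (the eye is most sensitive to green). A quick sketch of packing and unpacking:

```python
def pack_rgb565(r, g, b):
    """Pack into a 16-bit "high color" word: r, b in 0-31; g in 0-63."""
    return (r << 11) | (g << 5) | b

def unpack_rgb565(px):
    return (px >> 11) & 0x1F, (px >> 5) & 0x3F, px & 0x1F

white = pack_rgb565(31, 63, 31)
print(hex(white))             # 0xffff
print(unpack_rgb565(0xF800))  # (31, 0, 0) -- pure red
```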
30. BobbyTables2 ◴[] No.46179560[source]
My video card then didn’t have video ram for 256 color SVGA…
31. SoftTalker ◴[] No.46179683[source]
Those old computers were 640x480 or so. Maybe smaller.
32. ErroneousBosh ◴[] No.46180590{4}[source]
Neither EGA nor VGA were "GPUs", they were dumb framebuffers. Later VGA chipsets had rudimentary acceleration, basically just blitters - but that was a help.

The PGC was kind of a GPU if you squint a bit. It didn't work the way a modern GPU does where you've got masses of individual compute cores working on the same problem, but it did have a processor roughly as fast as the host processor that you could offload simple drawing tasks to. It couldn't do 3D stuff like what we'd call a GPU today does, but it could do things like solid fills and lines.

In today's money the PGC cost about the same as an RTX PRO 6000, so no-one really had them.

33. snek_case ◴[] No.46181007{3}[source]
"Windows 3.1 primarily used palette-based color modes, common modes included 16 colors (VGA/EGA) and 256 colors (SuperVGA)"
replies(1): >>46181122 #
34. perching_aix ◴[] No.46181122{4}[source]
Right, so 16 color, not 16 bit color.

To be frank, I wasn't aware such a mode was a thing, but it makes sense.

replies(1): >>46184560 #
35. em3rgent0rdr ◴[] No.46181605[source]
If you use a tile-based hardware renderer, such as on the original Nintendo chip, then pixels are rendered on the fly to the screen by the hardware automatically pulling pixels based on the tile map.
36. Yeask ◴[] No.46181872{4}[source]
A video card is not a GPU.
37. AshamedCaptain ◴[] No.46182290[source]
You do not really need a framebuffer to drive a GUI, though. That's very much a PC thing.
38. oso2k ◴[] No.46182771{5}[source]
VGA color palette was 18-bits/256K, but input into the palette was 8-bit per channel. (63,63,63) is visibly different from (255,255,255).

http://qzx.com/pc-gpe/tut2.txt

http://qzx.com/pc-gpe/

replies(1): >>46190798 #
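Since the VGA DAC takes 6-bit components (0-63), a common way to map modern 8-bit values down is a simple right shift by two bits — a small illustrative sketch:

```python
def to_vga_dac(r8, g8, b8):
    """Map 8-bit-per-channel color (0-255) to 6-bit VGA DAC values (0-63)."""
    return r8 >> 2, g8 >> 2, b8 >> 2

print(to_vga_dac(255, 255, 255))  # (63, 63, 63) -- full white at the DAC
print(to_vga_dac(128, 64, 0))     # (32, 16, 0)
```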
39. mananaysiempre ◴[] No.46184560{5}[source]
I recently installed NT4 (including Plus!) in an emulator with a VESA video driver, and was greatly surprised when about half of the icons that I thought of as “Windows 2000” (including the memorable “My Computer” one with the bulbous sky-blue screen) turned out to be available even there, provided a non-indexed mode. The rest were the more familiar 16-color-compatible 95/NT4 ones, making for an incongruous result overall. I guess what I want to say is that 16-color compatibility is a large part of the 95/NT4 look from which 2000 very carefully departed.
40. bananaboy ◴[] No.46190798{6}[source]
Sorry I'm not exactly sure what you're saying. I know very well how it works as I write a lot of demos and games (still today) for mode 13h (see https://www.pouet.net/groups.php?which=1217&order=release) and I can program the VGA DAC palette in my sleep. Were you referring to the fact that you write 8-bits to the palette registers? That's true, you do, but only 6-bits is actually used so it effectively wraps around at 64. There are 6-bits per colour component which as you pointed out is 18-bits colour depth.

Btw I was a teenager when those Denthor trainers came out and I read them all, I loved them! They taught me a lot!

41. lproven ◴[] No.46191382{4}[source]
> The 8514/A (1987) was Turing complete

WTF? Tell me more!

I have one, but I have no matching screen so I never tried it... Maybe it's worth finding a converter.