Or 128K of RAM and 400 KB of disk, for that matter.
For example, NVIDIA GPU drivers are typically around 800 MB to 1.5 GB.
That math actually goes wildly in the opposite direction for an optimization argument.
In 1985, and with 512K of RAM, it was very usable for work.
Games used either 320- or 640-pixel-wide resolutions, with 4- or 5-bit color, or the fake 6-bit mode known as Extra Half-Brite (EHB), which was basically 5-bit with the other 32 colors being the same palette at half brightness. The fabled 12-bit HAM mode was also used, even in some games, even for interactive content, but not too often.
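As a rough sketch of how that half-brightness trick works (my own illustration, not actual Amiga code; the palette layout and function names are assumptions): a 6-bit EHB pixel selects one of 32 palette entries, and if the sixth bitplane is set, each 4-bit channel of that entry is halved:

    #include <stdint.h>
    #include <stdio.h>

    /* Amiga 12-bit color: 4 bits per R, G, B channel, packed as 0x0RGB. */
    static uint16_t ehb_color(const uint16_t palette[32], uint8_t pixel) {
        uint16_t c = palette[pixel & 31];   /* base palette entry (0-31)  */
        if (pixel & 32) {                   /* sixth bitplane set:        */
            c = (c >> 1) & 0x0777;          /* halve each 4-bit channel   */
        }
        return c;
    }

    int main(void) {
        uint16_t pal[32] = { [0] = 0x0FFF };               /* entry 0: white */
        printf("color 0:  0x%03X\n", ehb_color(pal, 0));   /* 0x0FFF */
        printf("color 32: 0x%03X\n", ehb_color(pal, 32));  /* 0x0777 */
        return 0;
    }

The shift-and-mask is the whole trick: shifting right by one halves every channel at once, and the mask throws away the bit that spills from one nibble into the next.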
The EGA (1984) and VGA (1987) could conceivably be considered GPUs, although they were not Turing complete. The EGA had 64, 128, 192, or 256 KB of video memory; the VGA had 256 KB.
The 8514/A (1987) was Turing complete, although it had only 512 KB. The Image Adapter/A (1989) was far more powerful, pretty much the first modern GPU as we know them, and came with 1 MB expandable to 3 MB.
The "high color" (16 bit) mode was 5:6:5 bits per channel, so 16 bits per pixel.
> So 153,600 bytes for the frame buffer.
That 153,600 bytes is 640×480 at 4 bpp; at 16 bpp the same resolution takes 640×480×2 = 614,400 bytes, so you're looking at 614.4 KB (600 KiB) instead.
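To make the arithmetic concrete, here's a minimal C sketch (the function name is mine, just for illustration) that packs an RGB triple into the 5:6:5 layout and computes the framebuffer sizes above:

    #include <stdint.h>
    #include <stdio.h>

    /* Pack 8-bit R, G, B into a 16-bit 5:6:5 pixel by keeping the
       top 5, 6, and 5 bits of each channel, respectively. */
    static uint16_t pack565(uint8_t r, uint8_t g, uint8_t b) {
        return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }

    int main(void) {
        /* 640x480 at 4 bpp: 307,200 pixels / 2 = 153,600 bytes. */
        printf("4 bpp:  %u bytes\n", 640u * 480u * 4u / 8u);
        /* 640x480 at 16 bpp: 614,400 bytes = 600 KiB. */
        printf("16 bpp: %u bytes\n", 640u * 480u * 16u / 8u);
        printf("white = 0x%04X\n", pack565(255, 255, 255)); /* 0xFFFF */
        return 0;
    }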
The PGC was kind of a GPU if you squint a bit. It didn't work the way a modern GPU does, with masses of individual compute cores working on the same problem, but it did have a processor roughly as fast as the host processor that you could offload simple drawing tasks to. It couldn't do the 3D work we'd expect of a GPU today, but it could handle things like solid fills and lines.
In today's money the PGC cost about the same as an RTX PRO 6000, so no-one really had them.
To be frank, I wasn't aware such a mode was a thing, but it makes sense.
Btw I was a teenager when those Denthor trainers came out and I read them all, I loved them! They taught me a lot!