
331 points willm | 3 comments
emrah No.41216266
> The first trick is "overwrite, don't clear"

This is how games were written back in the day, before DirectX was a thing. You'd write directly to the frame buffer, and instead of clearing and redrawing everything, you'd redraw only what had changed, plus whatever was around and under it (because there was no time to redraw the entire view every frame on top of everything else you needed to do).
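
To make that concrete, here's a minimal sketch in C, assuming VGA mode 13h under DOS (320x200, 8 bits per pixel, framebuffer at 0xA0000); the names (fb, background, blit) are mine, not from any particular game. Instead of clearing the screen, you repaint the backdrop only under the sprite's old rectangle, then draw it at its new position:

    /* Sketch of "overwrite, don't clear", assuming VGA mode 13h. */
    #include <stdint.h>

    #define SCREEN_W 320
    #define SCREEN_H 200

    static uint8_t *fb = (uint8_t *)0xA0000;        /* video memory */
    static uint8_t background[SCREEN_W * SCREEN_H]; /* saved backdrop */

    static void blit(const uint8_t *src, int x, int y, int w, int h) {
        for (int row = 0; row < h; row++)
            for (int col = 0; col < w; col++)
                fb[(y + row) * SCREEN_W + (x + col)] = src[row * w + col];
    }

    /* Move a sprite: repaint the backdrop over its old rectangle,
       draw it at the new position, and touch nothing else. */
    static void move_sprite(const uint8_t *sprite, int w, int h,
                            int old_x, int old_y, int new_x, int new_y) {
        for (int row = 0; row < h; row++)
            for (int col = 0; col < w; col++) {
                int px = old_x + col, py = old_y + row;
                fb[py * SCREEN_W + px] = background[py * SCREEN_W + px];
            }
        blit(sprite, new_x, new_y, w, h);
    }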

replies(2): >>41217354 #>>41218952 #
1. moring No.41218952
There were at least two other techniques back then.

The first was to write to another buffer (possibly in normal RAM, not video RAM), then, when the frame was done, copy the whole buffer at once, so every visible pixel changed only once.
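
A minimal sketch of this one in C, again assuming VGA mode 13h under DOS (the buffer names are mine):

    /* Technique one: render into a back buffer in normal RAM,
       then copy the finished frame to video RAM in one pass,
       so every visible pixel changes exactly once. */
    #include <stdint.h>
    #include <string.h>

    #define SCREEN_W 320
    #define SCREEN_H 200

    static uint8_t backbuf[SCREEN_W * SCREEN_H];  /* normal RAM */
    static uint8_t *vram = (uint8_t *)0xA0000;    /* video RAM  */

    static void render_frame(void) {
        memset(backbuf, 0, sizeof backbuf);
        /* ... draw sprites, tiles, etc. into backbuf here; clearing
           and overdraw are cheap because they never touch the
           (slower) video RAM ... */
    }

    static void present(void) {
        memcpy(vram, backbuf, sizeof backbuf);    /* one bulk copy */
    }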

The second was to write to another buffer that also had to live in video RAM, then change the registers of the graphics hardware so that it generated the pixels for the monitor from that buffer instead.
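
A sketch of the register change in C, assuming an unchained VGA mode ("Mode X", 320x240) with two pages in video RAM; outb() here stands in for whatever port-output primitive your compiler provided (outp, outportb, and so on):

    /* Technique two: page flipping. Reprogram the CRTC start-address
       registers (index 0x0C/0x0D via ports 0x3D4/0x3D5) so the VGA
       scans out the other page; no pixels are copied at all. */
    #include <stdint.h>

    #define CRTC_INDEX 0x3D4
    #define CRTC_DATA  0x3D5

    extern void outb(uint16_t port, uint8_t value); /* assumed primitive */

    /* Two 320x240 planar pages: 320*240/4 = 19200 = 0x4B00 bytes each. */
    static uint16_t page_offset[2] = { 0x0000, 0x4B00 };
    static int visible_page = 0;

    static void flip_pages(void) {
        int hidden = 1 - visible_page;   /* the page we just drew into */
        uint16_t off = page_offset[hidden];
        /* A real loop would wait for vertical retrace (port 0x3DA)
           here so the flip doesn't tear mid-frame. */
        outb(CRTC_INDEX, 0x0C);          /* start address, high byte */
        outb(CRTC_DATA, (uint8_t)(off >> 8));
        outb(CRTC_INDEX, 0x0D);          /* start address, low byte */
        outb(CRTC_DATA, (uint8_t)(off & 0xFF));
        visible_page = hidden;           /* render into the other page next */
    }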

They had different tradeoffs. Copying the whole buffer when done was expensive; changing an address register was cheap. But the details of the register were possibly hardware-dependent, and there was no real graphics driver framework in place. Also, even with "flipping buffers" (as changing the address register was called), rendering to the off-screen buffer still meant writing pixels to video RAM, which was (IIRC) slower to access than normal RAM (basically a NUMA architecture). So depending on how often each pixel got overdrawn, rendering in normal RAM could be faster overall, even with the final copy taken into account.
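
To put made-up numbers on that tradeoff: if a video RAM write costs 3 units, a normal RAM write costs 1, and each pixel is overdrawn twice per frame on average, then rendering straight into video RAM costs 2 x 3 = 6 units per pixel, while rendering in normal RAM plus the final copy costs 2 x 1 + 3 = 5. The gap widens as overdraw goes up; with no overdraw at all, the copy makes normal RAM the slower option (1 + 3 = 4 vs. 3).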

replies(1): >>41219232 #
2. CBarkleyU No.41219232
> But the details of the register were possibly hardware-dependent, and there was no real graphics driver framework in place

Did this change with 3dfx's Glide (or subsequently Direct3D, once Windows got a foothold in the gaming industry)?

replies(1): >>41220478 #
3. eropple No.41220478
It's been a while, but as I recall, DirectX introduced a hardware abstraction layer from the get-go, though support was spotty for the first few years. DirectX 7 was a big step forward and coincided with important 3D features like hardware T&L and vertex buffer allocation. They got there before OpenGL, so the vendors mostly oriented around D3D7 and D3D8, but vendors' implementations were still pretty wonky and bespoke. Shaders hit with D3D9, and you had to pick between HLSL (on Direct3D) and GLSL (on OpenGL), so the gap was widening then. But I think the first time you can really describe a rigorous framework for graphics drivers, as opposed to a stack of shims of varying height, would be Windows 2000/XP bringing along XDDM (which then begat WDDM in Vista; WDDM has changed over the years, but is still recognizable in Windows 11).