> Something went wrong here, at least once
The higher-level view of data-over-time is inherently serial, and I don't think there's anything anyone can do about it: Serial is the correct/best abstraction.
That being said, I acknowledge the problem you're referring to, because serial *what*, exactly? Every time we go around that circle it gets a little smaller, and our collective understanding of what is actually going on improves.
It should make a certain amount of sense if you look at it this way: use a serial decoder to fill a memory bank of character cells (or pixels, or whatever), share that memory bank with a video encoder, and the software guys will say: well, how about I just fill that memory bank directly? Then you decide you don't want character cells but triangles, and soon you're DMA'ing into that shared buffer, working around the timing of another process reading it, and your ring-based memory buffer has turned back into a serial protocol before you know it.
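Here's a minimal sketch of that last step, with invented names and sizes: a single-producer/single-consumer ring buffer in shared memory. The coordination is just a pair of indices, but the consumer still sees bytes strictly in the order the producer wrote them, so what actually travels through it is a serial byte stream.

```c
/* Illustrative SPSC ring buffer (names/sizes invented). Two processes share
 * this struct; the "protocol" between them is just ordered bytes again. */
#include <stdatomic.h>
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define RING_SIZE 4096  /* power of two so index wrapping is a cheap mask */

struct ring {
    uint8_t       data[RING_SIZE];
    atomic_size_t head;  /* next slot the producer will write */
    atomic_size_t tail;  /* next slot the consumer will read  */
};

/* Producer side: returns false when the buffer is full (consumer is behind). */
static bool ring_put(struct ring *r, uint8_t byte)
{
    size_t head = atomic_load_explicit(&r->head, memory_order_relaxed);
    size_t tail = atomic_load_explicit(&r->tail, memory_order_acquire);
    if (head - tail == RING_SIZE)
        return false;                       /* would overwrite unread data */
    r->data[head & (RING_SIZE - 1)] = byte;
    atomic_store_explicit(&r->head, head + 1, memory_order_release);
    return true;
}

/* Consumer side: returns false when nothing new has arrived yet. */
static bool ring_get(struct ring *r, uint8_t *out)
{
    size_t tail = atomic_load_explicit(&r->tail, memory_order_relaxed);
    size_t head = atomic_load_explicit(&r->head, memory_order_acquire);
    if (tail == head)
        return false;                       /* empty: wait for the producer */
    *out = r->data[tail & (RING_SIZE - 1)];
    atomic_store_explicit(&r->tail, tail + 1, memory_order_release);
    return true;
}
```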
I think the main problem is that programming languages are terrible at serial, so programmers keep trying to work around it, and then convert to/from serial at the edges.
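By "convert to/from serial at the edges" I mean roughly the following pattern (a sketch with an invented frame layout): decode the byte stream into a struct the moment it arrives, work on the struct, and flatten it back to bytes on the way out, so the serial nature never shows up in the middle of the program.

```c
/* Sketch of the "convert at the edges" pattern; the frame layout is made up. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct msg {            /* the comfortable in-memory form */
    uint16_t id;
    uint16_t len;
    uint8_t  payload[256];
};

/* Edge 1: serial bytes -> struct. Fails on a short or oversized frame. */
static bool decode(const uint8_t *buf, size_t n, struct msg *m)
{
    if (n < 4)
        return false;
    m->id  = (uint16_t)(buf[0] | (buf[1] << 8));
    m->len = (uint16_t)(buf[2] | (buf[3] << 8));
    if (m->len > sizeof m->payload || n < 4u + m->len)
        return false;
    memcpy(m->payload, buf + 4, m->len);
    return true;
}

/* Edge 2: struct -> serial bytes. Everything between the two edges gets to
 * pretend the data was never serial at all. */
static size_t encode(const struct msg *m, uint8_t *buf, size_t cap)
{
    if (cap < 4u + m->len)
        return 0;
    buf[0] = (uint8_t)(m->id  & 0xff);
    buf[1] = (uint8_t)(m->id  >> 8);
    buf[2] = (uint8_t)(m->len & 0xff);
    buf[3] = (uint8_t)(m->len >> 8);
    memcpy(buf + 4, m->payload, m->len);
    return 4u + m->len;
}
```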
> it's another thing to claim that this is the peak tech to power our modern CLIs, or a solid foundation for portable UIs.
I can't explain all of it, but terminal UIs remain without match: even today (2025) I use a terminal because there's no better human-computer interface for doing intellectual work.
A picture of a terminal is not a terminal, so I understand why stuff like this confuses people.