Wow, and that ruler on the right side, even with the sound.
One of the nicest pages I have been on.
And the landing page... https://www.makingsoftware.com/
It just keeps on giving.
What is astonishing about LCDs? I don't mean to diminish the difficulty of scaling up the process, but if you think of early LCD displays, they don't seem far-fetched as something that could be shipped to consumers.
I read something interesting recently, but I'm not sure if it's true or not: that as you age, your integration frame rate decreases.
It's all engineering but it's surprisingly hard to move things from the lab to manufacturing at scale. Years and years and lots of problem solving. Some efforts/approaches fail and you never hear of them.
So yes, any image was extremely ephemeral at the time.
PS: Apparently it’s called a Noddy, it’s a video camera controlled by a servomotor to pan and tilt (or 'nod', hence the name Noddy): https://en.wikipedia.org/wiki/Noddy_(camera)
The problem in that video is that the exact location the beam is hitting is momentarily very bright, so they calibrated the exposure to that and everything else looks really dark.
In a sense, all vision is.
Phosphorescent blue OLEDs should reduce current OLED display energy usage by 20-30%. But it still seems to be way off for phones and mass usage.
A CRT - to name one - is a device whose actual understanding will challenge people in profound ways. To ask “how does a screen even work?” and to begin to answer this question will require a bit more than a summary form of “thing goes from point A to point B”. The history of this discovery is a stack of books and in and of itself is fascinating - the experiments and expectations and failures and theories as to why and how. I suppose I just expect more of the site. The illustrations are nice. Oh and my moniker is just a coincidence.
[0] https://antiqueradio.org/art/RCACTC-11ConvergBoardNewRC.jpg
> How do you make the illustrations?
> By hand, in Figma. There's no secret - it's as complicated as it looks
But OLEDs just have too many advantages where it actually matters. Much lower power consumption, physically more compact (no need for backlight layers), etc.
The exact sizes, shapes, and positions of the pigment dot triples (and/or the mask holes) are presumably chosen so that this holds even away from the main axis. Also, the shape of the deflecting field is probably tuned to keep the rays as well-focused as possible. Similarly to how photographic lenses are carefully designed to minimize aberrations and softness even far from the optical axis.
(*) Simplifying a bit by assuming that the beam gets deflected immediately as it leaves the gun, which is of course inaccurate.
The first LCD products I remember were things like 7 segment digital watches and calculators where the LCD was passive and the "pixels" were large. I am not super familiar with how that went from lab to consumer product but I imagine even there it was non-trivial.
It took a long time to progress to modern LCD displays. It took years to get from small black and white displays, to small color, to larger and larger displays. Productizing this stuff includes building machines, factories, ASICs, and figuring out a lot of technology as you go along.
Some interesting history here: https://www.varjukass.ee/Kooli_asjad/Ylikool/telekom/displei...
Even "digital RGB" isn't digital in terms of the CRT. It's only "digital" because each color channel has a nominal on and off voltage, with no in-between (outside of the separate intensity pin). However, the electron gun still has a rise and fall time that is not instant.
Displays didn't truly become digital for the masses until the LCD era, with DVI and HDMI signals. Even analog HD CRTs could accept these digital signals and display them.
I was thrilled when my computer let me choose a resolution of 848x480, and it worked perfectly.
Back in those days, the web was usable at that resolution.
Each individual pixel is driven by a transistor and capacitor that actively maintain the pixel state? Insane manufacturing magic.
Dead pixels used to be a big problem with LCD displays. Haven’t thought about that in at least twenty years.
[0] https://blurbusters.com/wp-content/uploads/2018/01/crt-phosp...
[1] https://www.researchgate.net/figure/Phosphor-persistence-of-...
[2] https://www.researchgate.net/figure/Stimulus-succession-on-C...
As a result, monochrome terminal text has this surprising sharpness to it (surprising if you are used to color displays). But the real visual treat is the long-persistence phosphor radar scopes.
https://i.sstatic.net/5K61i.png
The brightly-lit band is the part of the frame scanned by the beam while the shutter was open. The part above is the afterimage, which, while not as bright, is definitely there.
Even apart from that, a lot of laptops still have 1280x800 as the default resolution, and that's only double the width of 640x480. Honestly, I'd actually be more worried about OS and browser chrome eating up the space than websites themselves being unusable.
Try browsing on your phone in landscape mode.
I believe that their point wasn't that "the web" has intrinsically changed, it was that too many sites are not well designed in this respect.
edit: they actually replied just before me and it seems that wasn't their point, but it would be my point (though I personally don't care about being able to use such a low resolution).
Color composite video, as far as I understand, does have a limit to the horizontal resolution because in all three standards the color information is encoded as a high-frequency signal added to the main (luminance) one, so that frequency is your upper limit on how quickly the luminance can change.
S-video, VGA, and component should, in theory, allow infinite horizontal resolution and color.
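As a rough, back-of-the-envelope illustration of that limit (assuming NTSC numbers; the exact usable luminance bandwidth depends on the filtering in the receiver):

    # One cycle of the highest usable luminance frequency is one light/dark
    # pixel pair, so the number of distinguishable samples per line is about
    # 2 x bandwidth x active line time. Numbers below are approximate.
    subcarrier_hz = 3.579545e6   # NTSC color subcarrier
    active_line_s = 52.6e-6      # visible portion of one scanline

    max_luma_samples = 2 * subcarrier_hz * active_line_s
    print(f"~{max_luma_samples:.0f} distinguishable luminance samples per line")
    # -> roughly 377, which is why composite looks soft next to S-video or RGB
    #    on the same CRT.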
Genuine question: why do you think CRTs are better?
> Genuine question: why do you think CRTs are better?
CRTs are worse in most aspects than modern displays, but they are better in motion clarity. As to why I think that: I used both in parallel for many years. The experience for moving objects is very different. It is a well-known drawback of sample-and-hold display technologies. And it is supported by the more systematic analyses done by the likes of Blur Busters.
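A minimal sketch of the sample-and-hold reasoning (the numbers are illustrative, not measurements): while the eye tracks a moving object, a frame that stays lit gets smeared across the retina for as long as it persists.

    def persistence_blur_px(speed_px_per_s: float, persistence_s: float) -> float:
        """Approximate blur trail length, in pixels, during smooth-pursuit tracking."""
        return speed_px_per_s * persistence_s

    speed = 960.0  # px/s, a typical motion-test speed
    print(persistence_blur_px(speed, 1 / 60))  # 60 Hz sample-and-hold -> ~16 px
    print(persistence_blur_px(speed, 0.001))   # ~1 ms CRT-style pulse  -> ~1 px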
Wonderful content and website otherwise!
> modern displays don't paint the image line-by-line (...) They light up each pixel simultaneously, refreshing the entire display at once.
The entire screen area is lit all the time now, yes, but refresh still typically happens line by line, top to bottom [0], left to right [0], for both LCDs and OLEDs. It's a scanning refresh, not a global refresh (sadly).
You can experimentally confirm this using a typical smartphone. Assuming a 60 Hz screen refresh, recording in slow motion will give you enough extra frames that the smartphone camera, which itself also likely operates in a scanning fashion (rolling shutter), won't skew the experiment. On the recording, you should see your screen refreshing in the aforementioned fashion.
[0] actual refresh direction depends on the display, this is for a typical desktop monitor
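Some back-of-the-envelope numbers for that experiment, assuming a 60 Hz, 1080-row panel and a 240 fps slow-motion mode (actual scan-out is a bit quicker than the full frame period because of blanking):

    refresh_hz = 60
    rows = 1080
    camera_fps = 240

    frame_time_ms = 1000 / refresh_hz
    print(f"{frame_time_ms:.2f} ms per refresh")                        # ~16.67 ms
    print(f"{frame_time_ms / rows * 1000:.1f} us per row")              # ~15.4 us
    print(f"{camera_fps / refresh_hz:.0f} camera frames per refresh")   # 4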
But you're right: both LCD and OLED refresh a stored voltage on the cell (or caps) on a roughly line-by-line basis (OLED can easily take 5 clocks on the GIP to cancel internal transistor offset voltages).
I was mostly annoyed that they didn't mention the circular polarizer on OLEDs. Although there is discussion of going to color filters with Quantum Dot OLED, the circular polarizer is what makes the blacks so black on mobile OLED devices.
Also, it didn't really mention the PenTile RGGB sub-pixel pattern, which is dominant in mobile OLED (more than 50% of devices). Now they're moving to "tandem" stacked OLED for higher brightness and lower current density, but with no change to the lateral sub-pixel pattern.
Regarding CRTs, in the vector CRTs section they mention "they were mostly monochrome and so the phosphor dots could be tightly packed" - this is not true either, I believe; monochrome CRTs had a uniform phosphor coat on the inside, with no subpixel patches. I'd also have liked it if they had delved a bit into the decay times of the various phosphor chemistries used for color CRTs, and how they compare to LCDs and OLEDs. It's an entertaining comparison, and it grounds motion-performance discussions really well.
Regarding LCDs, I missed the mention of multi-layer LCDs, especially since they bring up tandem OLEDs.
Regarding OLEDs, now that you mention it, the subpixel layouts were left unaddressed.
Regarding quantum dots, I missed both the mention of QDEL as a somewhat promising future contender, and a mention of the drawback of their typical implementation: external light also provides them with energy to activate, which I believe is at least partially the cause behind the relatively poor black levels of QD-OLEDs in environments with significant ambient light (+ something about it not being possible to put a polarizer in front of them?).
Given the title, I was also generally expecting a more in-depth look; I would have loved to learn about the driving electronics, or why OLEDs aren't run anywhere near as fast as their full potential (I'd assume throughput limitations), etc. Overall, it basically only covers as much as my own enthusiast-but-not-in-the-field self has gathered over the years anyway.
This is one of the reasons why emulated versions of Asteroids (arcade game) can never match the real thing: the razor-sharp, perfectly straight lines with zero aliasing used to paint the display. The computer also has fine-grained control of how bright to make the electron beam that raster displays typically don't allow (this is perhaps as simple as holding the beam in place, or drawing back and forth over the same line segment), meaning that your ship's projectiles and enemy shots appear as super-bright points with a phosphor bloom around them, glittering in the dark. Most emulators simply draw them as nondescript pixels. I suppose with some effort a CRT simulator can be hooked up to the emulator... but it still wouldn't be the same.
I'm glad I got to play an authentic Asteroids before I died. Working machines are getting rarer. Some of those who come after me may not get that chance.
YES, OLEDs consume less power, offer truer color reproduction, and are physically more compact.
BUT, they are prone to CRT-like burn-in.
SSDs, the same thing.
YES, SSDs are much faster and immune to mechanical failure.
BUT, they tend not to last as long as HDDs due to limited write cycles, and their price per GiB is still much higher.
Monochromatic CRTs were well and truly resolution-agnostic: there were legitimately no pixels or subpixels or anything similar to speak of. That said, the driving signal still had to be modulated to produce an image, so it's not magic either. You can conceivably represent [0] all the available information in them using just 720 samples per line, which is exactly why DVDs had that as their horizontal resolution (720 pixels).
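For what it's worth, a sketch of where the 720 figure comes from, assuming ITU-R BT.601 numbers:

    # BT.601 samples luma at 13.5 MHz; the analogue active line is ~52.6 us,
    # which gives ~710 samples. The standard rounds this up to 720 so the same
    # grid covers both 525- and 625-line systems with a little margin.
    luma_sample_rate_hz = 13.5e6
    analogue_active_line_s = 52.6e-6

    print(luma_sample_rate_hz * analogue_active_line_s)  # ~710, standardized as 720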
This story changes a bit though with color CRTs, where you did have discrete sets of patches of different phosphor chemistries called triads. There was absolutely a fixed number of them on a glass, so you could conceivably consider that as the native resolution for that given display, with each triad being a pixel, and each patch being a subpixel. The distance between these was the aperture pitch, much like how you have a pixel pitch on a typical flatpanel display.
The kicker then is that, as you say, there's no strict addressing. From what I understand, there were multiple electron guns scanning across the screen simultaneously, each only able to hit the specific color it was assigned, but the patch being hit wasn't addressed; the guns just scanned across the screen like the single electron gun did in monochromatic CRTs. You'd then get resolution invariance simply from the natural emission spread providing oversampling/undersampling, without any kind of digital computational effort. It's not really true resolution independence like with the monochrome ones, I'd say. I even recall articles testing freshly released CRT monitors and discussing how sharp the beam was, and what kind of resolution adherence that resulted in.
[0] an earlier version of this comment said "extract from" here; for various reasons you might already know, that's a different thing, and would not actually be true.
Still, that vector CRT that I saw perhaps a dozen years ago was quite a surprise. Lack of rastering and the utterly insane brightness sent me down a rabbit hole. I ultimately concluded I'm not ever likely to own a basement Asteroids cabinet.
Not necessarily. For example on VR headsets the LCD/OLED will only hold the picture for 10% of the frame.
The Noddy was used since it was a live broadcast and “allowed the idents to be of no fixed length as the clock symbols could continue for many minutes at a time”.
So, it’s not really because they couldn’t store video. It’s because they needed an indefinite amount of video for the clock idents and couldn’t generate them digitally.
The difference between LCDs and CRTs in this regard then, is that on a CRT you only ever got light during that chase section. The initial state is full darkness, and the final state is full darkness too. It's a pulse.
The input is roughly serial, so it takes a massive serial to parallel conversion.
DVI (and thus older HDMI) being essentially "VGA that skipped the digital-to-analog conversion", you're still riding the beam, including the porches.
They have many disadvantages, but an advantage is that CRTs mostly remove the "persistence blur" induced by smooth pursuit eye movements on sample-and-hold displays like LCD and OLED. Here is an explanation:
Look at the connector pinout of the panel itself. There's only 50 pins or so, and a lot of them are grounds. Whether the scaler-to-panel format is eDP, or LVDS (FPD-Link), or V-by-One, it's all still differential serial lanes at that point.
Around the perimeter of the panel, then, are the actual TCON and row/column driver chips, bonded right to the ITO traces on the glass, flip-chip-on-glass style. These have an outrageous number of pins, and directly connect to the gate (row) and source (column) traces. It's here that the serial becomes parallel, and the next stop is the transistors themselves (hence the MOSFET signal terminology of gate and source) in the individual pixels.
Older displays would have basically a bunch of serial-to-parallel register chips, with each one's SO connected to the next one's SI. I ran across a fascinating video of replacing a bad chip in such a display, which happens to be a gas plasma one, so the voltages involved are pretty high too.
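A toy model of that chained serial-to-parallel arrangement (the class name and sizes are made up, just to show how the bits flow):

    class ShiftRegisterChain:
        """Daisy-chained serial-in/parallel-out registers: each chip's SO feeds
        the next chip's SI, and a latch strobe presents the whole row at once."""

        def __init__(self, chips: int, bits_per_chip: int):
            self.stages = [0] * (chips * bits_per_chip)
            self.latched = list(self.stages)

        def clock_in(self, bit: int) -> None:
            # One clock: every stage takes the previous stage's value,
            # and the new bit enters the first stage.
            self.stages = [bit] + self.stages[:-1]

        def latch(self) -> list[int]:
            # Strobe: copy shift-register contents to the parallel outputs.
            self.latched = list(self.stages)
            return self.latched

    chain = ShiftRegisterChain(chips=4, bits_per_chip=8)
    row = [1, 0, 1, 1] * 8
    for bit in reversed(row):       # clock in back-to-front so stage order matches row order
        chain.clock_in(bit)
    print(chain.latch() == row)     # True: the serial stream is now parallel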
One likely problem for battery powered headsets is the (I believe) relatively high CRT power draw. Another is probably the fact that they aren't used for anything else anymore, meaning CRT development has stopped a long time ago. There were quite small CRTs in the past for special applications, but probably not as small as is optimal for modern VR headsets. Both for optics and weight and space reasons.
Yes, it's there, but it's much less bright than the scanned area, so it will be hardly perceptible relative to the bright part. The receptors in the eye will hardly respond to it after being excited so strongly by the bright part.
The only annoying thing is every couple hours it asks me to run a 7 minute pixel refresh cycle to avoid burn in, but according to the dashboard I run it every 2.5 hours or so when I go on breaks, so I think I’m good.
Overall the monitor is just fantastic, my LAN party buddies and I dreamed about OLEDs like this back in 2003 and kept saying it was “just around the corner”. The biggest thing is in dark scenes in games there’s absolutely zero noticeable smearing.
[0] https://www.microcenter.com/product/689939/asus-pg27ucdm-265...
It's phosphor chemistry dependent. Different color patches on the same glass would decay at different rates even. But yeah, 1 ms is a good lower bound, although when I last researched this, it was definitely the best case scenario for CRTs. I'm fairly sure the ~500 Hz OLEDs that are already floating around are beating the more typical CRTs of old already.
> That’s why you would need a 1000 Hz LCD/OLED screen with really high brightness (and strobing logic) to approximate CRT motion clarity.
At 1000 Hz you wouldn't need the strobing anymore (I believe?), that's the whole point of going that fast. We're kinda getting there btw! Hopefully with HDMI 2.2 out, we'll see something cool.
> On a traditional NTSC/PAL CRT, 1 ms is just under 16 lines, but the latest line is already much brighter than the rest.
That doesn't really math for me. NTSC would be 480 visible lines at 60 Hz, and so 480 lines / ~16.6 ms = 28.8 lines/ms (6% of the screen). Note that of course PAL works out to the same number: 576 lines / 20 ms = 28.8 lines/ms (just 5% of the screen here though!).
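For what it's worth, a sketch of where the two figures might come from (assuming standard timings); the gap is mostly about whether you count progressive frames or interlaced fields:

    # Progressive 480p60 / 576p50 counting gives the ~28.8 lines/ms above;
    # standard interlaced NTSC/PAL scan rates (all lines, including blanking)
    # work out to just under 16 lines per millisecond.
    cases = [
        ("480p60 (progressive)", 480 * 60),
        ("576p50 (progressive)", 576 * 50),
        ("NTSC interlaced scan rate", 15_734),  # 525 lines x ~29.97 Hz
        ("PAL interlaced scan rate", 15_625),   # 625 lines x 25 Hz
    ]
    for name, lines_per_second in cases:
        print(f"{name}: {lines_per_second / 1000:.1f} lines/ms")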
In contrast, pointing a TV camera at a spinning globe was much easier. And for showing the time, pointing at a physical clock was much easier than, what, having twelve hours of film footage available and having to synch the right frame?
I think what's maybe more surprising for people than moving station idents typically being in-camera props is that broadcasting even a static image pre-digital was also much more easily accomplished by just pointing a camera at a piece of card. Even repeating a single frame over and over again was not something that could easily be produced some other way; having a camera continually capture and immediately broadcast the frame was just much easier.
Video tape, once it came in, allowed freeze frames but continually reading from the same spot on a tape caused wear so you couldn’t rely on being able to show a single frame from tape indefinitely.
Digital freeze frame machines that could capture a frame of video and repeatedly play it back from a memory buffer only started showing up in the 1980s.
Everyone knows the obvious reasons we don't use CRTs any more.
It's true, but it's a most uninteresting observation that only cares about practical aspects. Practical aspects matter (none of my own desks has a CRT), but they do not define life itself.
Those facts do not at all invalidate the point about the desirable aspects which have been lost, or the fact that the merely interesting and remarkable aspects are interesting and remarkable.
The desirable and/or interesting and remarkable features of a CRT are still cool, impressive, fun, and desirable, even though we all voluntarily choose to use something else basically everywhere we want a screen, because the practical reasons just happen to overwhelm.
Oh yes! The bright projectiles really add to the game impact. And on some cabinets, the lines were not perfectly straight. It looked for all the world like the phosphor coating had a bit of texture to it. Now being older, I realize an effect like that could just be a marginal DAC too.
IMHO the best vector experiences, in order are:
STAR WARS
This is a color vector display cranked up to the nines! The processor handling the vector drawing is fast! Tons of vectors are possible with only subtle impact on display refresh speed and overall quality. There are some global image-size artifacts that happen when some of the brightest objects occupy a significant percentage of the display.
And that is a feature! Love it. Get into a sit down cabinet if you ever get the chance.
TEMPEST
This game is not for everyone. Most of these games drive people to their limits, but TEMPEST ramps up to and beyond normal human limits! Not everyone can play this game at its peak. The same can be said of nearly everything on this list, but without that aggressive ramp-up.
ASTEROIDS
I prefer the original cabinet with the somewhat slower object motion. That one is a bit easier to play. Depending on the operator and how hard they drive the CRT, image brightness ranges from a bit old, washed out, and tired looking to WOW! How do those tiny projectiles not just carve a line right into the phosphors?
Cinematronics games: TAIL GUNNER, STAR CASTLE, RIP OFF.
These use overlays for a bit of color. Oh, I forgot ARMOR ATTACK, which uses large ones like STAR CASTLE.
The quality of the vectors is not quite as good as on the ATARI displays, and this too is a feature. It gives Cinematronics a bit of charm I find quite enjoyable.
And sound! Hoo boy! STAR CASTLE has great, loud --> I mean loud sounds with full bass notes able to rumble you and the cabinet!
OTHER COLOR VECTOR GAMES
I like playing all of these, but they simply were not peak experiences. Still damn good, if you ask me:
MAJOR HAVOC, GRAVITAR, and a two-player tandem Asteroids game whose name I cannot recall. Fun though!
And last place: QUANTUM played with the Trackball. You circle atoms over and over. This game looks cool and is hard.
GRAVITAR uses the Asteroids movement dynamics to great effect! A fun thing in this game is that massive changes in scale happen often. Rare to see.
Vector gaming delivered many of my very highly cherished arcade gaming experiences for sure.
ATARI and Tektronix deserve special mention in this context:
Atari made color vector games work! Did anyone else? Those look amazing! And hold up today in my view.
Tektronix invented the pure storage-tube CRT. Their graphics terminals often doubled as minicomputers programmable in Tek BASIC. The large ones offered a 4K vector space! Crazy good detail for the '70s. And one in good condition, operating in a reduced-light room, is beautiful to use.
My first manufacturing CAM software experience was on one of these. It used a fixed-record-length cassette so that many "files" could be accessed almost like on a floppy disk drive. User data went right to the paper tape punch/reader: 1200 baud for punching; reads could be faster, up to 9600, if one had a good reader unit.
One ran applications from that cassette and stored and used user data from the paper tape.
But I digress!
Right near the end, Tek managed to get both storage graphics and dynamic refresh-capable graphics on the same tube, each in a different color. I only got to use one of those one time. I loved it because many different workflows were possible.
Man, for the chance to code a UI on one today!
One last thought: in my view vector displays are best on a CRT, mostly because of the image contrast and speed possible, but great vector experiences can also be had on a wall, or perhaps a screen with some coating to bring out the best possible.
We may yet see vectors appear from time to time in these and other ways, simply because of how great they are. I hope so, and building a small, color-capable one using a low-power laser and a screen with coatings sure to deliver motion trails is on my bucket list.
Given that, all things being equal, there is no way for an LCD to equal the efficiency of a self-emissive display; at best it's a question of when the luminous efficiency of OLED will exceed that of white/blue backlight LEDs... and honestly we're likely already at or past that point.
The backlight LEDs are just much more efficient than OLEDs. The power consumption of TVs / monitors is a well known quantity.
The problem of a finite dot pitch interfering with image quality, especially on small displays where the dots were necessarily larger relative to the image size, is what motivated Tektronix to develop field-sequential color CRTs which they used in their digital oscilloscopes in the 80s and 90s. JVC also used the technology in some professional broadcast video monitors. Basically it was a B&W CRT with a changeable (liquid crystal) color filter in front of it. The R, G, and B channels would be shown one after another with the corresponding filter activated, in a similar manner to a color wheel DLP projector.
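A toy illustration of the field-sequential idea, assuming nothing about the actual Tektronix/JVC hardware beyond what's described above (the frame size and helper name are made up):

    import numpy as np

    def to_sequential_fields(rgb_frame: np.ndarray):
        """Split an (H, W, 3) RGB frame into three monochrome fields, paired
        with the filter color that must be active while each one is shown."""
        for channel, filter_name in enumerate(("red", "green", "blue")):
            yield rgb_frame[..., channel], filter_name

    frame = np.random.randint(0, 256, size=(240, 320, 3), dtype=np.uint8)
    for field, filter_name in to_sequential_fields(frame):
        # In the real hardware the field rate is 3x the color frame rate,
        # so the eye fuses the three monochrome images into one color picture.
        print(filter_name, field.shape)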