
358 points by ofalkaed | 2 comments

Just curious, and who knows: maybe someone will adopt it or develop something new based on its ideas.
Animats:
- Photon, the graphical interface for QNX. Oriented more toward real time (the widget set included gauges), but good enough to support two different web browsers. No delays; this was a real-time operating system.

- MacOS 8. Not the version that eventually shipped, but Copland. This was a modernized version of the original MacOS, continuing the tradition of no command line. Not having a command line forces everyone to get their act together about how to install and configure things. It probably would have eased the transition to mobile. A version was actually shipped to developers, but it had to be killed to justify Apple's bailout of NeXT to get Steve Jobs.

- Transaction processing operating systems. The first was IBM's Customer Information Control System (CICS). A transaction processor is a kind of OS where everything works like a CGI program: load program, do something, exit program (see the first sketch after this list). Unix and Linux are, underneath, terminal-oriented time-sharing systems.

- IBM MicroChannel. Early minicomputer and microcomputer designers thought in terms of a "bus", where peripherals can talk to memory and look like memory to the CPU. Mainframes, though, had "channels": simple processors which connected peripherals to the CPU. Channels could run simple channel programs and managed device access to memory (second sketch below). IBM tried to introduce that with the PS/2, but they made it proprietary, and it failed in the marketplace. Today everything has something like channels, but they're not a unified interface concept that simplifies the OS.

- CPUs that really hypervise properly; that is, virtual execution environments look just like real ones. IBM did that in VM, and it worked well because channels are a good abstraction for both a real machine and a VM; storing into device registers to make things happen is not (third sketch below). x86 has added several layers below the "real machine" layer, and they're all hacks.

- The Motorola 680x0 series. It should have been the foundation of the microcomputer era, but it took far too long to get the MMU out the door. The original 68000 came out in 1979, but then Motorola fell behind.

- Modula. Modula-2 and Modula-3 were reasonably good languages. Oberon was a flop. DEC was into Modula, but Modula went down with DEC.

- XHTML. Have you ever read the parsing rules for HTML 5, where the semantics of bad HTML were formalized? Browsers should just punt at the first error, display an error message, and render the rest of the page in Times Roman (fourth sketch below). Would it kill people to have to close their tags properly?

- Word Lens. Look at the world through your phone, and text is translated, standalone, on the device. No Internet connection required. Killed by Google in favor of hosted Google Translate.
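
A minimal sketch of the transaction-processing model described above, in Python (the names and the request format are invented for illustration; this is not the CICS API): each transaction program is loaded, does one unit of work, and exits, rather than living inside a long-running terminal session.

    import sys

    def handle_transaction(request: dict) -> dict:
        """One unit of work: look up a record, apply the update, reply."""
        account = request["account"]
        amount = request["amount"]
        balance = 100 - amount  # a real monitor would fetch the record here
        return {"account": account, "new_balance": balance, "status": "OK"}

    if __name__ == "__main__":
        # The transaction monitor hands the program one request, collects
        # the response, and reclaims everything when the program exits:
        # the same lifecycle as a CGI program.
        print(handle_transaction({"account": "12345", "amount": 25}))
        sys.exit(0)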
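
A toy illustration of the channel idea (the encoding is invented; real S/360 channel command words look nothing like this): the CPU builds a small program and hands it to a channel processor, which performs the memory transfers itself.

    from dataclasses import dataclass

    @dataclass
    class ChannelCommand:
        op: str        # e.g. "SEEK", "READ", "WRITE"
        address: int   # memory address the channel transfers to or from
        count: int     # number of bytes to transfer

    def run_channel_program(program: list[ChannelCommand],
                            memory: bytearray) -> None:
        """The channel, not the CPU, walks the command list and moves data."""
        for cmd in program:
            if cmd.op == "READ":
                # Stand-in for a device-to-memory transfer.
                memory[cmd.address:cmd.address + cmd.count] = bytes(cmd.count)
            # Other operations elided.

    # The CPU's only job: build the program, start the channel, and take an
    # interrupt when it completes. No polling of device registers.
    memory = bytearray(4096)
    run_channel_program([ChannelCommand("SEEK", 0, 0),
                         ChannelCommand("READ", 512, 256)], memory)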
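
The same toy model suggests why VM-style hypervision worked so cleanly (again an invented API, only a sketch): a hypervisor can translate a guest's entire channel program in one pass and hand it to the real hardware, whereas memory-mapped device registers force it to trap and emulate every individual store.

    from dataclasses import dataclass

    @dataclass
    class Cmd:
        op: str
        address: int   # guest-physical address, as the guest wrote it
        count: int

    def translate_for_host(program: list[Cmd], guest_base: int) -> list[Cmd]:
        """Rewrite guest addresses to host addresses in one pass; the real
        channel then runs the translated program unmodified."""
        return [Cmd(c.op, c.address + guest_base, c.count) for c in program]

    # With device registers, there is no whole-request unit to translate:
    # the hypervisor must trap on every store and emulate the device.
    print(translate_for_host([Cmd("READ", 512, 256)], guest_base=0x10000))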
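
And the XHTML point in a few lines of Python: an XML parser punts at the first error, exactly as proposed above, while an HTML parser recovers and keeps going.

    import xml.etree.ElementTree as ET
    from html.parser import HTMLParser

    broken = "<p><b>bold text</p>"   # the <b> is never closed

    # XML (and therefore XHTML) parsing is draconian: first error, game over.
    try:
        ET.fromstring(broken)
    except ET.ParseError as err:
        print("XML parser refused it:", err)   # mismatched tag

    # HTML parsing is forgiving: events keep flowing despite the bad nesting.
    class Dumper(HTMLParser):
        def handle_starttag(self, tag, attrs):
            print("open:", tag)
        def handle_endtag(self, tag):
            print("close:", tag)

    Dumper().feed(broken)   # no exception; recovery rules apply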

bazoom42:
> Would it kill people to have to close their tags properly?

Probably not, but what would be the benefit of having more pages fail to render? If XHTML had been coupled with some cool features that only worked in XHTML mode, it might have become successful, but on its own it does not provide much value.

defanor:
> but what would be the benefit of having more pages fail to render?

I think those benefits are quite similar to having more programs fail to run (due to static and strong typing, other static analysis, or the elimination of undefined behavior) or more data fail to be read (due to integrity checks and simply strict parsing): as a user, you get documents closer to valid ones (at least roughly well-formed), if anything at all, and that discourages developers from shipping a mess. Parsers (not just those in viewers, but anything that does processing) then have a better chance of reading and interpreting those documents consistently, so even more things work predictably.

bazoom42:
Sure, authoring tools should help authors avoid mistakes and produce valid content. But the browser is a tool for the consumer of content, and there is no benefit for the user if it fails to render some existing pages.

It is like Windows jumping through hoops to preserve backward compatibility even with buggy software: the customer's interest is that the software runs.

lucketone:
Suppose a developer accidentally left an opening comment at the start of the HTML.

Rhetorical question: should the browser display the page even if it is all commented out?

There is some bar for what is expected to work.

If all browsers consistently errored out on unclosed tags, it would definitely force developers to close their tags; it would become common knowledge, second nature.
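
For what it's worth, the unclosed-comment case can be checked mechanically. A toy sketch of the HTML5 rule that an unterminated comment runs to end-of-file (illustration only, not a real tokenizer):

    def visible_text(html: str) -> str:
        """Strip comments; an unclosed comment swallows everything to EOF."""
        out, i = [], 0
        while i < len(html):
            start = html.find("<!--", i)
            if start == -1:
                out.append(html[i:])
                break
            out.append(html[i:start])
            end = html.find("-->", start + 4)
            if end == -1:        # unclosed: the comment runs to EOF
                break
            i = end + 3
        return "".join(out)

    print(repr(visible_text("<!--<html><body>Hello</body></html>")))
    # -> '' : the entire document is swallowed by the stray comment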

bazoom42:
As for your “rhetorical” question, you can find the answer in the HTML spec.