
617 points EvgeniyZh | 2 comments
zabzonk No.43576999
I've written an Intel 8080 emulator that was portable between the DEC-10, VAX, and IBM VM/CMS. That was easy - the 8080 can be done quite simply with a 256-value switch - and I did mine in FORTRAN 77.
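For the curious, here is a minimal sketch of that 256-way dispatch style, in C rather than the original FORTRAN 77. The State struct, the handful of opcodes, and the omission of flag handling are illustrative only:

    /* Illustrative only: a tiny register struct and four opcodes; condition
       flags and the remaining ~250 cases are omitted. */
    #include <stdint.h>

    typedef struct {
        uint8_t  a, b;           /* two of the 8080's registers */
        uint16_t pc;             /* program counter */
        uint8_t  mem[65536];     /* 64 KiB address space */
    } State;

    static void step(State *s)
    {
        uint8_t op = s->mem[s->pc++];     /* fetch one opcode byte */
        switch (op) {                     /* one case per value, 0x00..0xFF */
        case 0x00: /* NOP      */                              break;
        case 0x04: /* INR B    */ s->b++;                      break;
        case 0x3E: /* MVI A,d8 */ s->a = s->mem[s->pc++];      break;
        case 0x80: /* ADD B    */ s->a += s->b;                break;
        default:   /* the other opcodes follow the same shape */ break;
        }
    }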

Writing a BASIC interpreter, with floating point, is much harder. Gates, Allen, and their collaborators' BASIC was pretty damned good.

teleforce No.43580146
Fun fact: according to Jobs, for some unknown reason Wozniak refused to add floating-point support to Apple BASIC, so they had to license a BASIC with floating-point numbers from Microsoft [1].

[1] Bill & Steve (Jobs!) reminisce about floating point BASIC:

https://devblogs.microsoft.com/vbteam/bill-steve-jobs-remini...

WalterBright No.43585252
Writing a floating point emulator (I've done it) is not too hard. First, write it in a high level language and debug the algorithm. Then hand-assembling it is straightforward.

What is hard is skipping the high-level-language step and trying to do it all in assembler in one go.
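To make the "high level language first" step concrete, here is a sketch of a software floating-point add in C. It assumes IEEE-754 single precision with both operands positive and normal, and it skips rounding, overflow, subnormals, and NaN/Inf; the historical Microsoft format was different, so treat this purely as an illustration of the debug-in-HLL-then-hand-assemble workflow:

    #include <stdint.h>

    /* Simplified software add of two IEEE-754 singles passed as raw bits.
       Assumes both operands are positive, normal numbers; truncates instead
       of rounding. */
    uint32_t soft_fadd(uint32_t x, uint32_t y)
    {
        uint32_t ex = (x >> 23) & 0xFF, ey = (y >> 23) & 0xFF;
        uint32_t mx = (x & 0x7FFFFF) | 0x800000;   /* restore implicit 1 */
        uint32_t my = (y & 0x7FFFFF) | 0x800000;

        if (ex < ey) {                    /* make x the larger-exponent operand */
            uint32_t t;
            t = ex; ex = ey; ey = t;
            t = mx; mx = my; my = t;
        }
        my >>= (ex - ey);                 /* align the smaller mantissa */

        uint32_t m = mx + my;             /* add mantissas */
        if (m & 0x1000000) {              /* carry out: renormalize */
            m >>= 1;
            ex++;
        }
        return (ex << 23) | (m & 0x7FFFFF);   /* repack (sign bit = 0) */
    }

Once an algorithm like that is debugged in C (or FORTRAN), translating it by hand into 8080 or 6502 assembler is mostly mechanical.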

kragen No.43586743
Also, though, how big was Apple Integer BASIC? As I understand it, you had an entire PDP-10 at your disposal when you wrote the Fortran version of Empire.
1. WalterBright No.43587156
I did learn how to program on the -10. A marvelous experience.

Looking back, writing an integer BASIC is a trivial exercise. But back in the 70s, I had no idea how to write such a thing.

Around 1978, Hal Finney (yes, that guy) wrote an integer BASIC for the Mattel Intellivision (with its wacky 10-bit microprocessor) that fit in a 2K EPROM. Of course, Hal was (a lot) smarter than the average bear.
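As a rough idea of why an integer BASIC is considered trivial today, here is a sketch of its expression core: a recursive-descent evaluator for integer arithmetic, in C. A real interpreter adds variables, a line-number table, and statements like PRINT/GOTO/IF, but the core is about this small (illustrative only, not Woz's or Finney's code):

    #include <ctype.h>

    static const char *p;                  /* cursor into the source text */
    static int expr(void);                 /* forward declaration */

    static void skipsp(void) { while (*p == ' ') p++; }

    static int factor(void)                /* number or parenthesized expr */
    {
        skipsp();
        if (*p == '(') {
            p++;
            int v = expr();
            skipsp();
            if (*p == ')') p++;
            return v;
        }
        int v = 0;
        while (isdigit((unsigned char)*p)) /* integer literal */
            v = v * 10 + (*p++ - '0');
        return v;
    }

    static int term(void)                  /* handles * and /, no div-by-zero check */
    {
        int v = factor();
        for (skipsp(); *p == '*' || *p == '/'; skipsp()) {
            char op = *p++;
            int r = factor();
            v = (op == '*') ? v * r : v / r;
        }
        return v;
    }

    static int expr(void)                  /* handles + and - */
    {
        int v = term();
        for (skipsp(); *p == '+' || *p == '-'; skipsp()) {
            char op = *p++;
            int r = term();
            v = (op == '+') ? v + r : v - r;
        }
        return v;
    }

Pointing p at "2+3*4" and calling expr() returns 14; operator precedence falls out of the call structure.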

2. kragen No.43588695
Interesting, I didn't know that! I didn't know him until the 90s, and didn't meet him in person until his CodeCon presentation.

What I was trying to express—perhaps poorly—is that maybe floating-point support would have been more effort than the entire Integer BASIC. (Incidentally, as I understand it, nobody has found a bug in Apple Integer BASIC yet, which makes it a nontrivial achievement from my point of view.)