
244 points rbanffy | 22 comments
POSSIBLE_FACT ◴[] No.44603645[source]
Absolutely loved it when I randomly caught an episode of Computer Chronicles back in the day.
replies(2): >>44603765 #>>44608696 #
rbanffy ◴[] No.44603765[source]
I think that, by now, I have watched every episode. He was the Bill Gates we needed.
replies(3): >>44603804 #>>44603845 #>>44604028 #
whobre ◴[] No.44603845[source]
He was nothing like BG. Gary was an inventor, an educator, and most of all a visionary. He hated running a business, even though he started DRI after failing to convince Intel to buy CP/M.

Yes, there are quite a few videos on YouTube about him titled “The man who should have been Bill Gates”, but that’s just clickbait. Watch the special episode of “The Computer Chronicles” about Gary Kildall and see what his friends and business associates say about him.

replies(7): >>44603943 #>>44603983 #>>44604145 #>>44604163 #>>44604595 #>>44604601 #>>44604876 #
terabyterex ◴[] No.44604145[source]
This paints Bill Gates as not a tech person but a business-first person, which is not true. He got a BASIC compiler running on the Altair, which MITS thought couldn't be done. He helped Wozniak implement a version of BASIC supporting floating-point numbers. Gates didn't even want to take Microsoft public; they had to convince him. Ballmer was the biggest businessman in the bunch. Hell, he was the one who suggested Kildall, since Microsoft wasn't in the OS business.
replies(3): >>44604243 #>>44604605 #>>44604689 #
1. rbanffy ◴[] No.44604605[source]
> BASIC compiler

Interpreter - an entirely different kind of animal. Microsoft didn't get a BASIC compiler until much later.

> He helped Wozniak implement a version of BASIC supporting floating point numbers.

No. He sold Apple a BASIC, then used it as leverage to prevent Apple from making a BASIC for the Macintosh.

> Ballmer was the biggest businessman in the bunch.

He suggested cutting Paul Allen's family off when Allen was battling cancer.

replies(1): >>44606194 #
2. WalterBright ◴[] No.44606194[source]
Um, it is necessary to compile a program before being able to interpret it. I don't know how early BASICs were implemented, but the usual method is to compile it to some sort of intermediate representation, and then interpret that representation.

D's compile-time function execution engine works that way. So does the JavaScript compiler/interpreter engine I wrote years ago, and the Java compiler I wrote eons ago.

The point of going all the way to generating machine code is that the result often runs 10x faster.
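
A toy Python sketch of the compile-to-IR-then-interpret idea described above (illustrative only; not how D's CTFE engine or any particular BASIC actually works):

```python
# "Compile" an infix expression to a tiny postfix IR, then interpret that IR.
import operator

def compile_expr(tokens):
    """Shunting-yard-style pass: infix tokens -> list of IR instructions."""
    prec = {"+": 1, "-": 1, "*": 2, "/": 2}
    out, ops = [], []
    for t in tokens:
        if isinstance(t, (int, float)):
            out.append(("PUSH", t))
        else:
            while ops and prec[ops[-1]] >= prec[t]:
                out.append(("OP", ops.pop()))
            ops.append(t)
    while ops:
        out.append(("OP", ops.pop()))
    return out

def run(ir):
    """Interpret the IR with a simple operand stack."""
    fns = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}
    stack = []
    for kind, arg in ir:
        if kind == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(fns[arg](a, b))
    return stack[0]

print(run(compile_expr([3, "+", 2, "*", 4])))  # 11
```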

replies(5): >>44607379 #>>44607380 #>>44607445 #>>44608493 #>>44609120 #
3. eichin ◴[] No.44607379[source]
> necessary to compile

Um, no? Your experience is probably at least two decades after the time period in question. The more advanced versions of, for example, TRS-80 BASIC (part of the "microcomputer BASICs that all share a common set of bugs" family) did no more than tokenize - so `10 PRINT "Hello"` would be stored as a binary representation of the line number, a single-byte token for PRINT, the literal characters `"Hello"`, and an end-of-line marker. Actually interpreting the code involved just reading it linearly; `GOTO linenumber` involved scanning the entire program in memory for that line number (and yes, people really did optimize things by putting GOTO and GOSUB targets earlier in the program so the interpreter would find them faster :-)
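
A rough Python sketch of that storage scheme and the linear GOTO scan (byte layout and token values are illustrative, not TRS-80-exact):

```python
# Each stored line: (line number, bytes) where keywords are single-byte tokens
# and everything else (strings, numbers) is kept as plain characters.
KEYWORDS = {0x80: "PRINT", 0x81: "GOTO", 0x82: "GOSUB"}

program = [
    (10, bytes([0x80]) + b' "Hello"'),
    (20, bytes([0x81]) + b" 10"),
]

def find_line(target):
    """GOTO/GOSUB: scan the whole program front to back for the line number.
    This linear search is why people put hot targets early in the program."""
    for number, body in program:
        if number == target:
            return number, body
    raise RuntimeError(f"UNDEF'D STATEMENT ERROR: {target}")

print(find_line(10))
```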

replies(2): >>44607465 #>>44608213 #
4. stevekemp ◴[] No.44607380[source]
It is not necessary to compile a program, in the general case, before executing it.

Many programming languages parse their program into an AST and then walk that AST, interpreting as they go. But for BASIC you can parse/execute statement by statement - no need to parse the whole program ahead of time, and certainly zero need to compile to either machine code or any internal representation.

Remember, at the time we're talking about, 64K was a lot of RAM. Some machines had less.
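
A minimal sketch of that statement-at-a-time style in Python (a toy dialect, not a real BASIC):

```python
# No AST, no IR: look at the current line, act on it, move on.
program = {
    10: 'PRINT "HELLO"',
    20: 'GOTO 10',
}

def run(program, max_steps=5):
    lines = sorted(program)          # line numbers in order
    pc, steps = 0, 0
    while pc < len(lines) and steps < max_steps:
        steps += 1
        number = lines[pc]
        keyword, _, rest = program[number].partition(" ")
        if keyword == "PRINT":
            print(rest.strip('"'))
            pc += 1
        elif keyword == "GOTO":
            pc = lines.index(int(rest))   # jump = scan the line-number list
        else:
            raise SyntaxError(f"?SYNTAX ERROR IN {number}")

run(program)   # prints HELLO a few times, then stops at max_steps
```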

replies(1): >>44608195 #
5. ◴[] No.44607445[source]
6. EvanAnderson ◴[] No.44607465{3}[source]
I was going to post this, but you beat me to it.

It's a VM of a sort, and the p-code the VM executes is tokenized input.

7. WalterBright ◴[] No.44608195{3}[source]
The parsing, even if done line by line as needed, is still compiling.
replies(1): >>44608788 #
8. WalterBright ◴[] No.44608213{3}[source]
Tokenizing it and interpreting the token stream is still a compilation process, even if it re-tokenizes each line every time it executes it.
replies(1): >>44609173 #
9. tasty_freeze ◴[] No.44608493[source]
You have an idiosyncratic definition of "compiler" then. Many BASICs, including the MS family of BASICs, did tokenize keywords to save memory.

But 99.9% of people take "compiler" to mean translating source code to either a native CPU instruction set or a VM instruction set. In any tutorial on compilers, tokenization is only one aspect of compilation, as you know very well. And unlike some of the tricky tokenization cases that crop up in languages like C++, BASIC interpreters simply had a table of keywords with the MSB set to mark the boundaries between keywords. The tokenizer did a greedy match - the first keyword in the table matching the next few characters won - and encoded the Nth entry of that table as the token (0x80 + N).

When LIST'ing a program, the same table was used: if the byte was >= 0x80, then the first N-1 keywords in the table were skipped over and the next one was printed out.

There were also BASIC implementations that did not tokenize anything; every byte was simply interpreted on every execution of the line. And there were tiny BASICs where, instead of the full keyword, "PR" meant "PRINT", "GO" meant "GOTO", etc.
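
A small Python sketch of the keyword-table scheme described above (simplified: a plain list stands in for the MSB-terminated table real ROMs used, and indexing is 0-based):

```python
KEYWORDS = ["PRINT", "GOTO", "GOSUB", "IF", "THEN", "FOR", "NEXT"]

def tokenize(line):
    """Greedy match: the first table entry matching the upcoming characters
    wins and is stored as a single byte 0x80 + index."""
    out, i = bytearray(), 0
    while i < len(line):
        for n, kw in enumerate(KEYWORDS):
            if line[i:i + len(kw)].upper() == kw:
                out.append(0x80 + n)
                i += len(kw)
                break
        else:
            out.append(ord(line[i]))
            i += 1
    return bytes(out)

def list_line(tokens):
    """LIST: any byte >= 0x80 indexes back into the same keyword table."""
    return "".join(KEYWORDS[b - 0x80] if b >= 0x80 else chr(b) for b in tokens)

t = tokenize('PRINT "HI":GOTO 10')
print(t.hex(" "))     # keyword bytes show up as 80, 81, ...
print(list_line(t))   # round-trips back to readable source
```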

10. vidarh ◴[] No.44608788{4}[source]
In 45 years of writing software, I've never before seen anyone call tokenizing a BASIC program compilation. It's decidedly not common usage.
replies(1): >>44608967 #
11. WalterBright ◴[] No.44608967{5}[source]
I've been writing compilers for 45 years now. Tokenizing is a big part of every textbook on compilers. To resolve expressions (which are recursive in nature) it would have had to do more than just tokenizing. While this isn't hard at all, it's "parsing", which also qualifies it as a compiler.

I.e., the BASIC interpreter was lexing and parsing. It's a compiler. A very simple one, sure, but a compiler.

replies(2): >>44609101 #>>44609849 #
12. vidarh ◴[] No.44609101{6}[source]
Yes, but tokenization on its own is not compilation any more than whiskers are a cat just because a cat has them.

"Nobody" uses it that way, and language is defined by use.

13. wvenable ◴[] No.44609120[source]
Early BASICs didn't compile a program before interpreting it. The interpreter read the code as written and executed it step by step. There was some tokenization; keywords were turned into single or double bytes, and that happened literally when you pressed enter on the keyboard. Your source code was those tokenized bytes. On the Commodore 64, you could type abbreviated forms of keywords as a shortcut, and they tokenized to the same bytes as the full keyword. Even numbers were not transformed into bytes ahead of time.

This was done to save memory; there wasn't much room to hold both the source code and an intermediate form. But it also wasn't really necessary: with the keywords tokenized and the syntax so simple, there wouldn't have been much savings in space or performance.
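
A tiny illustration of the "numbers stay as text" point (hypothetical byte layout, not Commodore-exact): the stored line keeps the digits as characters, so every execution re-parses the constant.

```python
stored_line = b"\x99 3.14159"      # single-byte token standing for PRINT
                                   # (the 0x99 value is illustrative)

def execute(line):
    value = float(line[1:].decode())   # re-parse the digits at run time
    print(value)

for _ in range(3):    # a GOTO loop would repeat this conversion every pass
    execute(stored_line)
```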

14. wvenable ◴[] No.44609173{4}[source]
Tokenizing is necessary but not sufficient for compilation. I could tokenize this comment to store it efficiently in a database, but that would have nothing to do with compilation.
replies(1): >>44611430 #
15. jacquesm ◴[] No.44609849{6}[source]
Compilers generate code in another, usually lower-level, language, and they do so by first reading all of the code that could be executed. Interpreters (such as the BASIC interpreters we are discussing here) read only the part of the code that actually gets executed, and they typically call functions rather than generate code (never mind JIT). Tokenization prior to interpretation is technically an optional step (it's just an efficiency boost) and is not normally confused with compilation, even if there are some superficial similarities.

You of all people should know this, come on.

replies(1): >>44611393 #
16. WalterBright ◴[] No.44611393{7}[source]
You and I have a different point of view.
replies(1): >>44612856 #
17. WalterBright ◴[] No.44611430{5}[source]
Recognizing `3+x*(2+y)` is compilation - even if the program is being executed while compiling it.
replies(1): >>44611928 #
18. wvenable ◴[] No.44611928{6}[source]
You can continue to argue the point, but that goes against every definition of compilation that exists. Compilation is the transformation of a program from one form into another. For example, taking `3+x*(2+y)` and transforming it into a series of bytecodes, machine-language instructions, an AST, or even C code would be compilation.

The BASIC interpreter doesn't recognize `3+x*(2+y)`, nor does it compile it; instead, it evaluates that expression using a pair of stacks. You've expanded the definition of compilation to cover almost all computation. It's compilers all the way down to the electrons.
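
A Python sketch of that two-stack evaluation (an operand stack plus an operator stack), producing no bytecode or IR along the way; the tokenizing regex and variable lookup here are just for illustration:

```python
import operator, re

OPS = {"+": (1, operator.add), "-": (1, operator.sub),
       "*": (2, operator.mul), "/": (2, operator.truediv)}

def evaluate(expr, env):
    vals, ops = [], []   # operand stack and operator stack

    def apply_top():
        op = ops.pop()
        b, a = vals.pop(), vals.pop()
        vals.append(OPS[op][1](a, b))

    for tok in re.findall(r"\d+|[A-Za-z]\w*|[-+*/()]", expr):
        if tok.isdigit():
            vals.append(int(tok))
        elif tok in OPS:
            while ops and ops[-1] in OPS and OPS[ops[-1]][0] >= OPS[tok][0]:
                apply_top()
            ops.append(tok)
        elif tok == "(":
            ops.append(tok)
        elif tok == ")":
            while ops[-1] != "(":
                apply_top()
            ops.pop()
        else:
            vals.append(env[tok])   # variable lookup
    while ops:
        apply_top()
    return vals[0]

print(evaluate("3+x*(2+y)", {"x": 4, "y": 5}))  # 31
```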

replies(2): >>44612604 #>>44613266 #
19. WalterBright ◴[] No.44612604{7}[source]
Well, all my compilers use recursive descent for expressions, meaning the call stack is used to maintain the current state. Whether you evaluate the expression while doing this or produce an IR is a trivial difference.
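
A Python sketch of that point: one recursive-descent walk over `3 + 4 * 5`, where the only difference between evaluating immediately and emitting an IR is which "leaf" and "combine" actions are passed in (a toy grammar without parentheses):

```python
def parse_expr(toks, pos, leaf, combine):
    val, pos = parse_term(toks, pos, leaf, combine)
    while pos < len(toks) and toks[pos] in "+-":
        op = toks[pos]
        rhs, pos = parse_term(toks, pos + 1, leaf, combine)
        val = combine(op, val, rhs)
    return val, pos

def parse_term(toks, pos, leaf, combine):
    val, pos = leaf(toks[pos]), pos + 1
    while pos < len(toks) and toks[pos] == "*":
        rhs, pos = leaf(toks[pos + 1]), pos + 2
        val = combine("*", val, rhs)
    return val, pos

toks = ["3", "+", "4", "*", "5"]

# Direct evaluation while parsing:
print(parse_expr(toks, 0, int,
                 lambda op, a, b: {"+": a + b, "-": a - b, "*": a * b}[op])[0])  # 23

# Same walk, but producing an IR (a nested tuple) instead of a value:
print(parse_expr(toks, 0, str, lambda op, a, b: (op, a, b))[0])  # ('+', '3', ('*', '4', '5'))
```
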
replies(1): >>44612725 #
20. wvenable ◴[] No.44612725{8}[source]
It might be a trivial difference, but it's literally the thing that makes something a compiler. That should make sense: how a piece of software works internally doesn't make it a compiler or not; it's the input and output of the software that define it as a compiler. That's true of almost any broad category of software.
21. jacquesm ◴[] No.44612856{8}[source]
Of all the hills you could die on this one seems really silly.
22. eichin ◴[] No.44613266{7}[source]
No problem calling it parsing, but yeah, "compilation" feels like a huge stretch. And they didn't do recursive descent - just tokenizing for compactness (when you only have 4K or 16K of RAM you do things like that) - and you could still get syntax errors at runtime. In some interpreters it also served to normalize abbreviations to save typing.