POSSIBLE_FACT
Absolutely loved it when I randomly caught an episode of Computer Chronicles back in the day.
rbanffy
I think that, by now, I have watched every episode. He was the Bill Gates we needed.
whobre
He was nothing like BG. Gary was an inventor, an educator, and most of all a visionary. He hated running a business, even though he started DRI after failing to convince Intel to buy CP/M.

Yes, there are quite a few videos on YouTube about him titled “The man who should have been Bill Gates”, but that’s just clickbait. Watch the special episode of “The Computer Chronicles” about Gary Kildall and see what his friends and business associates say about him.

terabyterex
This paints Bill Gates as not a tech person but a business-first person, which is not true. He got a BASIC compiler on the Altair, which MITS thought couldn't be done. He helped Wozniak implement a version of BASIC supporting floating-point numbers. Gates didn't even want to take Microsoft public; they had to convince him. Ballmer was the biggest businessman in the bunch. Hell, he was the one that suggested Kildall, since Microsoft wasn't in the OS business.
rbanffy
> BASIC compiler

Interpreter - an entirely different kind of animal. Microsoft didn't get a BASIC compiler until much later.

> He helped Wozniak implement a version of BASIC supporting floating point numbers.

No. He sold Apple a BASIC, then used it as leverage to prevent Apple from making a BASIC for the Macintosh.

> Ballmer was the biggest businessman in the bunch.

He suggested cutting Paul Allen's family off when Allen was battling cancer.

WalterBright
Um, it is necessary to compile a program before being able to interpret it. I don't know how early BASICs were implemented, but the usual method is to compile it to some sort of intermediate representation, and then interpret that representation.

D's compile-time function execution engine works that way. So does the JavaScript compiler/interpreter engine I wrote years ago, and the Java compiler I wrote eons ago.

The point of going all the way to generating machine code is that the result often runs 10x faster.
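
As a minimal sketch of that two-phase shape in C: "compile" the source to a tiny stack-machine IR, then interpret the IR. The opcodes, the compile()/run() names, and the RPN input form are all invented for illustration, not any particular BASIC's scheme:

    #include <ctype.h>
    #include <stdio.h>

    /* IR opcodes for a tiny stack machine (names are made up). */
    enum { OP_PUSH, OP_ADD, OP_MUL, OP_HALT };

    /* Phase 1: "compile" an RPN expression of single digits,
       '+' and '*' (e.g. "2 3 4 * +") into the IR. */
    static int compile(const char *src, int *ir)
    {
        int n = 0;
        for (; *src; src++) {
            if (isdigit((unsigned char)*src)) {
                ir[n++] = OP_PUSH;
                ir[n++] = *src - '0';
            } else if (*src == '+') {
                ir[n++] = OP_ADD;
            } else if (*src == '*') {
                ir[n++] = OP_MUL;
            }
        }
        ir[n++] = OP_HALT;
        return n;
    }

    /* Phase 2: interpret the IR with a small operand stack. */
    static int run(const int *ir)
    {
        int stack[64], sp = 0;
        for (;;) {
            switch (*ir++) {
            case OP_PUSH: stack[sp++] = *ir++; break;
            case OP_ADD:  sp--; stack[sp - 1] += stack[sp]; break;
            case OP_MUL:  sp--; stack[sp - 1] *= stack[sp]; break;
            case OP_HALT: return stack[sp - 1];
            }
        }
    }

    int main(void)
    {
        int ir[64];
        compile("2 3 4 * +", ir);
        printf("%d\n", run(ir));  /* prints 14: 2 + 3*4 */
        return 0;
    }

The same IR could instead be fed to a code generator that emits machine instructions; that last step is where the 10x shows up.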

tasty_freeze
You have an idiosyncratic definition of "compiler", then. Many BASICs, including the MS family of BASICs, did tokenize keywords to save memory.

But 99.9% of people take "compiler" to mean translating source code to either a native CPU instruction set or a VM instruction set. In any tutorial on compilers, tokenization is only one aspect of compilation, as you know very well. And unlike some of the tricky tokenization cases that crop up in languages like C++, BASIC interpreters simply had a table of keywords with the MSB set to indicate the boundaries between keywords. The tokenizer was greedy: the first table entry that matched the next few characters of source won, and the Nth entry was encoded as token (0x80 + N).

When LIST'ing a program, the same table was used: if the byte was >= 0x80, then the first N-1 keywords in the table were skipped over and the next one was printed out.
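
Here is a minimal C sketch of that scheme in both directions, tokenizing and LISTing. The table contents and the names (ktab, match_keyword, print_token) are illustrative; real MS BASIC tables were much larger and the details varied between versions:

    #include <stdio.h>

    /* Illustrative packed keyword table: the MSB marks the last
       character of each keyword, i.e. the boundary described above.
       Entry N (counting from 0) encodes as token 0x80 + N. */
    static const unsigned char ktab[] = {
        'P','R','I','N','T'|0x80,
        'G','O','T','O'|0x80,
        'L','I','S','T'|0x80,
        0  /* end of table */
    };

    /* Tokenize: greedy scan; the first entry matching the source
       text wins. Returns the token and matched length, or 0. */
    static int match_keyword(const char *src, int *len)
    {
        const unsigned char *p = ktab;
        for (int n = 0; *p; n++) {
            for (int i = 0; src[i] == (p[i] & 0x7F); i++) {
                if (p[i] & 0x80) {       /* hit the boundary char */
                    *len = i + 1;
                    return 0x80 + n;
                }
            }
            while (!(*p & 0x80)) p++;    /* skip to the next entry */
            p++;
        }
        return 0;
    }

    /* LIST: for token b, skip the first b - 0x80 entries in the
       same table and print the next one. */
    static void print_token(int b)
    {
        const unsigned char *p = ktab;
        for (int n = b - 0x80; n > 0; n--) {
            while (!(*p & 0x80)) p++;
            p++;
        }
        do putchar(*p & 0x7F); while (!(*p++ & 0x80));
    }

    int main(void)
    {
        int len, tok = match_keyword("GOTO 10", &len);
        printf("token 0x%02X, %d chars -> ", tok, len);
        print_token(tok);  /* prints GOTO */
        putchar('\n');
        return 0;
    }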

There were also BASIC implementations that did not tokenize anything; every byte was simply re-interpreted on each execution of the line. And there were Tiny BASICs where, instead of full keywords, "PR" meant "PRINT", "GO" meant "GOTO", etc.