
288 points by Twirrim | 1 comment | source
pjdesno ◴[] No.41874875[source]
During an internship in 1986 I wrote C code for a machine with 10-bit bytes, the BBN C/70. It was a horrible experience, and the existence of the machine in the first place was due to a cosmic accident of the negative kind.
replies(6): >>41874970 #>>41875234 #>>41875248 #>>41875733 #>>41875834 #>>41876076 #
Isamu ◴[] No.41876076[source]
I wrote code on a DECSYSTEM-20; the C compiler was not officially supported. It had a 36-bit word and a 7-bit byte. Yep, when you packed bytes into a word there were bits left over.

And I was tasked with reading a tape with binary data in 8-bit format. Hilarity ensued.
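
The mismatch is easy to see with a little arithmetic: five 7-bit bytes fill 35 of the 36 bits, and 8-bit tape bytes don't divide 36 at all. A minimal sketch of the packing, simulating a 36-bit word in the low bits of a uint64_t on a modern machine (the helper name is made up for illustration):

    #include <stdint.h>
    #include <stdio.h>

    /* Pack five 7-bit bytes into a simulated 36-bit word: 5 * 7 = 35
       bits used, bit 35 unused -- hence "bits left over". */
    static uint64_t pack7(const uint8_t b[5]) {
        uint64_t w = 0;
        for (int i = 0; i < 5; i++)
            w = (w << 7) | (b[i] & 0x7F);
        return w;
    }

    int main(void) {
        uint8_t chars[5] = { 'H', 'E', 'L', 'L', 'O' };
        printf("packed word: %012llo (octal)\n",
               (unsigned long long)pack7(chars));
        /* 8-bit tape data is worse: 36 % 8 = 4, so four tape bytes
           waste 4 bits per word, or bytes must straddle word
           boundaries -- either way, hilarity ensues. */
        printf("bits left over per word with 8-bit data: %d\n", 36 % 8);
        return 0;
    }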

replies(2): >>41876112 #>>41876915 #
bee_rider ◴[] No.41876112[source]
Hah. Why did they do that?
replies(1): >>41877442 #
mjevans ◴[] No.41877442[source]
Which part of it?

8-bit tape? Probably the format the hardware worked in... not actually sure, I haven't used real tapes, but it's plausible.

36-bit-per-word computer? Sometimes 0 to ~4 billion isn't enough. 4 more bits gets you to roughly 68 billion, or +/- 34 billion.

As it turns out, my guess was ALMOST correct

https://en.wikipedia.org/wiki/36-bit_computing

Paraphrasing: legacy keying systems were based on records of up to 10 printed decimal digits of accuracy for input. 35 bits would be required to match that signed input, but 36 works better as a machine word and divides evenly into six 6-bit (yuck?) characters; some 'smaller' related machines used 12- or 18-bit words, simple fractions of the 36-bit word. Why the yuck? 6 bits gives only 64 characters total, so these systems supported only UPPERCASE letters, numeric digits, and some other characters.
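
As a rough check on those numbers (a back-of-the-envelope sketch, not from the article): 10 decimal digits of magnitude need 34 bits, 35 with a sign, and 36 bits split exactly into six 6-bit characters, which is where the 64-symbol uppercase-only character sets come from.

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        /* Bits for the magnitude of a 10-decimal-digit number,
           plus one sign bit. */
        double magnitude_bits = ceil(log2(1e10));   /* 34 */
        printf("10 digits: %.0f bits + sign = %.0f bits\n",
               magnitude_bits, magnitude_bits + 1);

        /* A 36-bit word holds six 6-bit characters exactly,
           and 6 bits give only 64 distinct codes. */
        printf("6-bit chars per 36-bit word: %d\n", 36 / 6);
        printf("distinct 6-bit codes: %d\n", 1 << 6);
        return 0;
    }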