To my naive eye, it seems like moving to 10 bits per byte would both be logical and make learning the trade a little bit easier?
On the other hand, if computing had settled on a three-valued logic (e.g. 0/1/«something», where «something» has variously been proposed as -1, «undefined»/«unknown»/«undecided», or a «shade of grey»), we would have ended up with 9-trit bytes (9 being a power of three).
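To make the «-1» variant concrete, here is a minimal Python sketch of balanced ternary, where each digit (trit) is -1, 0, or +1; the function names are purely illustrative:

```python
# Balanced ternary: each digit (trit) is -1, 0, or +1.
def to_balanced_ternary(n: int) -> list[int]:
    """Return the trits of n, least significant first."""
    trits = []
    while n != 0:
        r = n % 3
        if r == 2:           # represent 2 as -1 plus a carry into the next trit
            r = -1
        n = (n - r) // 3
        trits.append(r)
    return trits or [0]

def from_balanced_ternary(trits: list[int]) -> int:
    return sum(t * 3**i for i, t in enumerate(trits))

# A 9-trit "tryte" covers 3**9 = 19683 values, symmetric around zero:
# from -(3**9 - 1)//2 to +(3**9 - 1)//2, i.e. -9841..+9841.
assert from_balanced_ternary(to_balanced_ternary(-9841)) == -9841
```

A nice side effect of the symmetric digit set: negation is just flipping the sign of every trit, so no separate «signed» representation is needed.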
10 was tried numerous times at the dawn of computing, and… it was found too unwieldy in circuit design.
Is this true? A ternary digit combined with two binary digits gives you the really convenient base 12 (3 × 4 = 12), which has a lot of desirable properties for things like multiplication and fixed point. Though I have no idea what ternary building blocks would look like, so it's hard to visualize potential hardware.
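One way to see the fixed-point appeal (a hedged sketch, not tied to any real hardware): fractions with denominators 2, 3, 4, and 6 all terminate in base 12, whereas binary fixed point cannot represent 1/3 or 1/6 exactly. The helper below is illustrative:

```python
from fractions import Fraction

def terminates(frac: Fraction, base: int, max_digits: int = 20) -> bool:
    """True if frac has a finite expansion in the given base."""
    f = frac
    for _ in range(max_digits):
        if f == 0:
            return True
        f = (f * base) % 1   # shift one digit left, keep the fractional part
    return False

for d in (2, 3, 4, 5, 6, 8, 12):
    f = Fraction(1, d)
    b2 = "terminates" if terminates(f, 2) else "repeats"
    b12 = "terminates" if terminates(f, 12) else "repeats"
    print(f"1/{d}: base 2 {b2}, base 12 {b12}")
# 1/3 and 1/6 repeat forever in base 2 but are single digits in base 12.
```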
I have certainly heard the argument that ternary logic would have been the better choice had it won out, but that is history now, and we are left with a vestige of ternary logic in SQL (NULL, which semantically means «no value» / «undefined»).
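To make the SQL connection concrete, here is a small Python sketch of Kleene's three-valued logic, using None for «unknown»; this is the same truth table SQL applies when a predicate involves NULL:

```python
# Kleene three-valued logic: True, False, or None ("unknown"),
# matching how SQL evaluates comparisons involving NULL.
def and3(a, b):
    if a is False or b is False:
        return False             # False dominates, even paired with unknown
    if a is None or b is None:
        return None
    return True

def or3(a, b):
    if a is True or b is True:
        return True              # True dominates, even paired with unknown
    if a is None or b is None:
        return None
    return False

def not3(a):
    return None if a is None else not a

# "NULL = NULL" is unknown, not true -- the classic SQL surprise:
assert and3(None, True) is None
assert or3(None, True) is True
assert not3(None) is None
```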