To my naive eye, it seems like moving to 10 bits per byte would be both logical and would make learning the trade just a little bit easier?
On the other hand, if computing had settled on a three-valued logic (e.g. 0/1/«something», where «something» has been proposed as -1, «undefined»/«unknown»/«undecided», or a «shade of grey»), we would have had 9-trit bytes (9 being a power of three).
10 was tried numerous times at the dawn of computing and… it was found too unwieldy for circuit design.
Is this true? Base 12 has a lot of desirable properties for things like multiplication and fixed point (though note that four trits actually give you base 81, not 12). I have no idea what ternary building blocks would look like, so it's hard to visualize potential hardware.
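For a feel of what a ternary "building block" might mean at the numeric level, here is a minimal sketch of balanced ternary (digits -1/0/+1), the scheme the Soviet Setun machine actually used; the function names are my own invention:

```python
def to_balanced_ternary(n):
    """Return the balanced-ternary digits of n, least significant first."""
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:            # a '2' digit becomes -1 with a carry into the next trit
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits or [0]

def from_balanced_ternary(digits):
    # Each digit d at position i contributes d * 3**i.
    return sum(d * 3**i for i, d in enumerate(digits))

# Balanced ternary represents negative numbers without a sign bit:
print(to_balanced_ternary(5))    # [-1, -1, 1]  i.e. -1 - 3 + 9 = 5
print(to_balanced_ternary(-7))   # [-1, 1, -1]  i.e. -1 + 3 - 9 = -7
```

One often-cited upside of this encoding is that negation is just flipping every trit's sign, so no separate two's-complement-style convention is needed.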
Another part of it is that it's a lot easier to represent values in hex if the bytes line up with hex digits.
I can represent 255 as 0xFF, which fits nice and neat in 1 byte. However, if a byte is 10 bits, hex no longer really works: you have 1024 values to represent, and the max value would be 0x3FF, which just looks funky.
Coming up with an alphanumeric system to represent 2^10 values cleanly just ends up weird and unintuitive.
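The alignment point above can be shown in a couple of lines: one hex digit covers exactly 4 bits, so an 8-bit byte is always exactly 2 hex digits, while a 10-bit byte needs "2.5" of them:

```python
# An 8-bit byte always formats as exactly two hex digits (one per nibble).
for value in (0, 255):
    print(f"{value:#04x}")    # 0x00 ... 0xff

# A 10-bit "byte" holds 1024 values; its maximum is 0x3FF, so the
# leading hex digit only ever reaches 3 -- the digits no longer pack cleanly.
print(hex(2**10 - 1))         # 0x3ff
```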
I have certainly heard the argument that ternary logic would have been the better choice had it won out, but that is history now, and we are left with a vestige of ternary logic in SQL: NULL, which semantically means «no value» / «unknown».