One thing TI (Extended) Basic had going for it that was almost unique among early home computers was its use of decimal floating point with 13 digits of precision. It was so useful for maths. I used it a lot at that time in high school. When I switched to an Apple II with its 5-byte binary floats, man, was it a disappointment. It was faster, yes, but, boy oh boy, what a catastrophic loss of precision.
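
(For anyone who wants to see the difference concretely: the TI stored numbers in an 8-byte Radix-100 format, good for 13-14 decimal digits, while Applesoft packed a binary exponent and mantissa into 5 bytes, roughly 9 digits. Below is a rough modern stand-in in Python, not period code: the decimal module at 13 significant digits plays the TI, and rounding through float32 stands in for a short binary mantissa. It's only a proxy, since the Apple's mantissa was actually 32 bits rather than 24, but the flavour of the problem is the same.)

    from decimal import Decimal, getcontext
    import struct

    # 13 significant decimal digits, roughly what TI Extended Basic gave you
    getcontext().prec = 13

    # 0.1 is exact in a decimal format, so ten of them sum to exactly 1
    d = sum(Decimal("0.1") for _ in range(10))

    # A short binary mantissa can't represent 0.1 exactly; round through
    # float32 after every addition to show the error accumulating
    def as_f32(x):
        return struct.unpack("f", struct.pack("f", x))[0]

    b = 0.0
    for _ in range(10):
        b = as_f32(b + as_f32(0.1))

    print(d)  # 1.0
    print(b)  # ~1.0000001, binary rounding error after only ten additions
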