
50 points fagnerbrack | 1 comment | source
sxp ◴[] No.42189133[source]
Do programmers use those weird font ligatures in practice? E.g., rendering `a != b` as `a≠b`. I've only seen them used by people who want to show how far they can push coding style away from the standard monospace low ASCII, but I haven't seen any good justification for them.
replies(5): >>42189260 #>>42192111 #>>42192596 #>>42194228 #>>42197475 #
1. adrian_b ◴[] No.42192596[source]
People who have little experience in mathematics are content with symbols like "!=".

On the other hand, people experienced in mathematics find it silly to use a whole bunch of arbitrarily composed symbols in place of the traditional mathematical symbols, some of which have been in use for centuries, only because half a century ago the designers of programming languages were forced to substitute whatever characters were available for the classic mathematical symbols. The ASCII character set standardized by the Americans, and before it the even more restricted character sets implemented by IBM peripherals, prioritized suitability for commercial correspondence written in English, not mathematical formulae or programming languages.

Now that Unicode is available, it is possible to design a new programming language that uses any desired symbols.

However, the most important legacy programming languages are still specified to use only ASCII, with the possible exception of identifiers. Ligatures allow one to see whatever symbols one prefers while still delivering plain ASCII text to the compiler.
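
To make the contrast concrete, here is a minimal sketch in Haskell (an assumption on my part: GHC, which accepts Unicode symbol characters in user-defined operator names). A ligature leaves the ASCII `/=` in the file and only changes how the font draws it; a Unicode-aware language lets the source itself define and use the mathematical symbol:

    -- Sketch only: define ≠ in the source as an alias for the ASCII operator /=.
    infix 4 ≠
    (≠) :: Eq a => a -> a -> Bool
    (≠) = (/=)

    main :: IO ()
    main = print (1 ≠ 2)  -- prints True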

This tradition of using very restricted character sets was imposed on the world by the USA, and especially by IBM, to reduce manufacturing costs in the early years of computer technology. Its consequences have been the difficulty of using, in computer applications, the letters with diacritics that are essential for most non-English languages, and equally the difficulty of writing mathematical formulae.

In the beginning, the programming languages partially or wholly designed in Europe, such as ALGOL 60 or CPL, used many more mathematical symbols than were available on American computers. Because of this, they eventually had to be transliterated so that compilers for them could run on IBM machines and the like.

replies(1): >>42192838 #