On the other hand, people experienced in mathematics find it silly to use a jumble of arbitrarily composed symbols in place of the traditional mathematical notation, some of which has been in use for centuries, merely because half a century ago the designers of programming languages were forced to substitute whatever characters were available for the classic mathematical symbols. The constraint came from the ASCII character set standardized in the USA, whose priority was suitability for commercial correspondence written in English rather than for mathematical formulae or programming languages, and before that from the even more restricted character sets implemented by IBM peripherals.
Now that Unicode is available, it is possible to design new programming languages that use any desired symbols.
However, the most important legacy programming languages are still specified to use only ASCII, with the possible exception of identifiers. Ligatures allow one to see whatever symbols are preferred, while still delivering plain ASCII text to the compiler.
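As a concrete illustration (the choice of Haskell and of a ligature-capable font such as Fira Code is merely one common combination, not anything prescribed by a standard), the file below is stored as plain ASCII; an editor using such a font can render the operator digraphs as single mathematical glyphs, while the compiler still receives only the ASCII characters:

```haskell
-- Stored on disk as plain ASCII; a ligature-capable font may render
-- '->' as an arrow, '=>' as a double arrow, '<=' and '>=' as the
-- usual inequality signs, and '/=' as the not-equal sign.
module Main where

-- ASCII '=>' and '->' in the signature, '<=' and '>=' in the guards.
clamp :: (Ord a, Num a) => a -> a -> a -> a
clamp lo hi x
  | x <= lo   = lo
  | x >= hi   = hi
  | otherwise = x

main :: IO ()
main = do
  print (clamp 0 10 (42 :: Int))   -- prints 10
  print (3 /= (4 :: Int))          -- prints True
```

The source remains valid for any standard Haskell compiler; the symbolic appearance exists only in the editor's rendering.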
This tradition of very restricted character sets was imposed on the world by the USA, and especially by IBM, to reduce manufacturing costs in the early years of computer technology. Its consequences have been the difficulty of using, in computer applications, the letters with diacritics that are essential for most non-English languages, and likewise the difficulty of writing mathematical formulae.
In the beginning, the programming languages designed partially or entirely in Europe, such as ALGOL 60 and CPL, used many more mathematical symbols than were available on American computers. Because of this, they eventually had to be transliterated so that compilers for them could be written to run on IBM computers and the like.