> where a mathematician writes "Σ", a programmer simply writes "sum".
Communities develop shorthand and terms of art for things they write a lot. Mathematicians need to write lots of sums; programmers have their own shorthand and terminology.
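For instance, the sum a mathematician writes as Σ with index bounds maps directly onto a language's built-in shorthand. A minimal sketch in Python (the list `x` is made up for illustration):

```python
x = [3, 1, 4, 1, 5]   # x_1 .. x_n

# Sigma-notation sum, using the language's shorthand:
total = sum(x)

# The same sum spelled out explicitly:
total = 0
for x_i in x:
    total += x_i
```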
I think the difference is that you are a programmer and not a mathematician (I'm guessing), and are effectively claiming that the notation you are subjectively familiar with is objectively the more universally understood one.
Are you saying special symbols aren't more common in mathematics than in programming? I simply disagree. Mathematicians hardly use multi-character names at all, e.g. for functions or variables, while they are ubiquitous in programming. Mathematicians mostly use single letters from the Roman or Greek alphabets, sometimes in exotic styles like fraktur, double-struck letters, etc.
No, I agree that programming uses more ASCII. I'm saying that using a smaller alphabet (e.g., hex) doesn't make something easier to understand. Programming is just as arcane and difficult: even programmers have trouble understanding each other's code, and it's generally accepted that understanding it requires documentation in English.