As far as I'm concerned, universities lost the moral high ground when they prioritized ideology over truth-seeking, elevated identity over excellence, ostracized political outsiders, and lost all viewpoint diversity.
Which are not things they did.
Does it matter whether they did or didn't? Universities have indisputably lost the mandate of heaven, have they not? Arguing over whether they actually did any of those things is beside the point if a politically powerful group of people believes they did! None of those charges has an objective definition, so it comes down to values, and universities/academics as a class have alienated themselves from a substantial portion of the population.