>>Show me metrics that move something tangible, like conversion rates. If you can't do that, we both know why.
Personally, I'm pretty cynical about metrics like CTR as well. People in this discussion have presented plenty of reasonable criticism of the metrics used in the article, and I don't really disagree. But Google's culture of design by metrics originates with metrics like conversion rates. I'm personally fond of Material 1, but I don't think the extensive A/B testing made any real improvements to the design system. What's more, design by testing can really make a mess of things. You can easily end up chasing noise, p-hacking your way to irrelevant results, pursuing local optima, and tanking the long-term experience in favor of short-term gains. And you are blinding yourself to these problems with a very precise number.
In market research the concept of the NPS score is instructive: a 100-question survey tells you very little, for a variety of methodological reasons. Instead, just ask the basic question "would you recommend this product to a friend - yes, no, or maybe?" It's about as precise as the thing it's trying to measure. Something in that style is probably the best way to quantitatively evaluate UX design. More practically though, qualitative research is probably going to tell you a lot more than a single number (unless you are actually making ads, in which case, get the CTR as high as possible).
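For concreteness, a yes/no/maybe question can be scored the same way classic NPS is (percent promoters minus percent detractors). A minimal sketch - the mapping of "yes"/"maybe"/"no" to promoter/passive/detractor is my own illustration, not standard NPS, which uses a 0-10 scale:

```python
from collections import Counter

def nps_style_score(responses):
    """Score "would you recommend?" answers: % yes minus % no.
    "maybe" responses dilute the denominator but count neither way,
    mirroring how classic NPS treats passives (7-8 ratings)."""
    counts = Counter(responses)
    return 100 * (counts["yes"] - counts["no"]) / len(responses)

print(nps_style_score(["yes", "yes", "maybe", "no", "yes"]))  # → 40.0
```

One coarse number, but its coarseness is honest about how coarse the underlying question is.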