"Attention Is All You Need" - I've always wondered if the authors of that paper used such a casual and catchy title because they knew it would be groundbreaking and massively cited in the future....
The transformer was a major breakthrough in NLP, and it was clear at the time of publication that it would have a major impact. But I will add that it is common in the deep learning field to give papers catchy titles (off the top of my head: all the YOLO papers, ViT, DiT, textual inversion). The transformer paper is just one in a long line of seminal papers with funny names.