Besides, often you're lucky and there's a trivial perfect hash like modulo.
I also don't understand your first point. We can run n^2 algorithms on massive inputs, since it's still just a polynomial. Are you thinking of 2^n perhaps?
Rockstar infamously wasted several minutes of loading time in GTA V because they had an O(n^2) algorithm in the startup sequence.
In practice, n^2 causes surprising slowdowns well before that. In the 10k-100k range you can already be spending minutes of processing time, and if each pairwise step costs 10 ms, only ~77 elements are needed to take a minute.
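A quick back-of-the-envelope check of those numbers (the cost figures are the hypothetical ones from the comment, not measurements):

```python
# Cost model for an O(n^2) loop: every one of the n*n pairs
# costs cost_per_pair seconds.
def quadratic_seconds(n: int, cost_per_pair: float) -> float:
    return n * n * cost_per_pair

# At 10 ms per pairwise step, ~77 elements already take about a minute:
print(quadratic_seconds(77, 0.010))        # ~59.3 s
# Even at 10 ns per step, n = 100_000 means 10^10 steps:
print(quadratic_seconds(100_000, 10e-9))   # 100.0 s
```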
Well, it doesn't seem that obvious, at least not to everyone. Several times I've seen people rewrite things to make them O(1) when the result was actually slower for the vast majority of cases, or even all cases.
And in many cases it's assumed to be small, but not guaranteed. That's where the trouble lies.
A classic example is Windows using a quadratic algorithm for arranging desktop icons that caused severe performance issues when users had many icons on their desktop.
https://github.com/microsoft/Windows-Dev-Performance/issues/...
It's a perfect hash function for the case where you work with ints and know the min-max range beforehand; then you can modulo by the size of the range as long as it's not too large. In your example the range 21..33 contains 33 - 21 + 1 = 13 values --> mod 13.
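A minimal sketch of that modulo perfect hash, using the 21..33 range from the example (any m consecutive integers are distinct mod m, so a plain array replaces the hashtable):

```python
lo, hi = 21, 33
size = hi - lo + 1          # 13 buckets for 13 possible keys

table = [None] * size       # plain array instead of a hashtable

def put(x: int, value) -> None:
    table[x % size] = value

def get(x: int):
    return table[x % size]

# Every key in the range lands in its own bucket -- no collisions:
assert len({x % size for x in range(lo, hi + 1)}) == size
```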
This comes up for me surprisingly often, though obviously it depends a lot on what you work on. It's tempting to reach for a hashtable, but in this case the hashtable is a lot less efficient than the plain modulo-indexed array.