O(1) lookup in many cases involves a hashing function, which is a non-trivial but constant cost. For smaller values of N, it can be outperformed in wall-clock time even by algorithms with O(n^2) worst-case behavior.
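A minimal sketch of the point, assuming CPython and a toy key set: for a tiny collection, a linear scan over a list can rival a hash-based set lookup because hashing the key is itself a constant cost. The numbers below are machine-dependent, so no winner is asserted.

```python
import timeit

# Hypothetical micro-benchmark: membership test in a tiny collection.
# The list scan is O(n), the set lookup is O(1) on average, but for
# small n the hashing overhead can dominate.
items = list(range(8))
as_list = items
as_set = set(items)
needle = 7

t_list = timeit.timeit(lambda: needle in as_list, number=100_000)
t_set = timeit.timeit(lambda: needle in as_set, number=100_000)
print(f"list scan: {t_list:.4f}s  set lookup: {t_set:.4f}s")
```

Which one wins depends on the interpreter, the key type's hash cost, and n; the only safe claim is that asymptotic complexity alone doesn't decide it.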
replies(2):
Besides, often you're lucky and there's a trivial perfect hash, like taking the key modulo the table size.
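A sketch of that lucky case, using a hypothetical fixed key set chosen so that modulo happens to be collision-free: lookup becomes a single modulo plus an array index, with no collision handling at all.

```python
# Hypothetical fixed key set where `key % size` already maps every
# key to a distinct slot -- i.e. modulo acts as a perfect hash.
keys = [16, 25, 34, 43]
size = 8
slots = [k % size for k in keys]
assert len(set(slots)) == len(keys)  # no collisions for this key set

table = [None] * size
for k in keys:
    table[k % size] = f"value-for-{k}"

def lookup(k):
    # One modulo, one index: constant time with a very small constant.
    return table[k % size]

print(lookup(34))  # -> value-for-34
```

This only works because the key set is fixed and known in advance; with arbitrary keys you'd be back to a general hash function and collision handling.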
Well, it doesn't seem that obvious, at least not to everyone: several times I've seen people rewrite code to make it O(1) when the result was actually slower for the vast majority of cases (or even all of them).
And in many cases N is assumed to be small, but nothing guarantees it stays that way. That's where the trouble lies.
A classic example is Windows using a quadratic algorithm to arrange desktop icons, which caused severe performance issues for users with many icons on their desktop.
https://github.com/microsoft/Windows-Dev-Performance/issues/...