Human civilisation means that intelligence and memory are collective, externalised, persistent, and communicable. There's also a layer of symbolic abstraction (science and math) that makes it possible to predict useful consequences with some precision.
Individuals die but their inventions and insights remain. Individuals can also specialise, which is a kind of civilisational divide-and-conquer strategy.
Most animals don't have that. Some do train their young to a limited extent, but without writing, the knowledge doesn't persist. And without abstraction, it only evolves extremely slowly, if at all.
They have to reinvent the wheel over and over, which means they never invent the wheel at all.
We actually have this problem with politics and relationships. We keep making the same mistakes because the humanities provide some limited memory, but there's no symbolic abstraction or prediction - just storytelling, which is far less effective.
Bonus points: I often wonder if there's a level of complexity beyond our kind of intelligence, and what it might look like. Abstraction of abstraction would be meta-learning - symbolic systems that manipulate the creation and distribution of civilisational learning.
AI seems to be heading in that direction.
There may be further levels, but we can't imagine them. We could be embedded in them and not see them for what they are.