
133 points by ksec | 1 comment
1. haolez No.41925395
I have a question about vector search. Does increasing the number of dimensions always improve performance? Or is there an optimal (application-dependent) point beyond which performance starts to degrade?

I'm asking because most vector databases have tight limits on the maximum number of dimensions, and OpenAI's largest embeddings model has something like 3,000 dimensions. I was wondering whether there would be something to gain if, for example, OpenAI released a new embeddings model with 8,000 dimensions or more.
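
Not from the thread, but a minimal sketch of one way to probe the question empirically: truncate a set of embeddings to fewer dimensions, re-normalize, and measure how much the nearest-neighbor results diverge from a full-dimensional search. The array here is random placeholder data just to make the script runnable; with real embeddings (especially from models trained so that vector prefixes stay meaningful, Matryoshka-style), the overlap numbers would hint at how much the extra dimensions actually contribute for your application.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for real embeddings: (n_docs, 3072), unit-normalized.
emb = rng.normal(size=(10_000, 3072)).astype(np.float32)
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

def top_k(query_vecs, corpus, k=10):
    # Cosine similarity equals the dot product on unit-normalized vectors.
    sims = query_vecs @ corpus.T
    return np.argsort(-sims, axis=1)[:, :k]

queries = emb[:100]                 # reuse some corpus vectors as queries
full = top_k(queries, emb)          # "ground truth" at full dimensionality

for d in (256, 512, 1024, 2048, 3072):
    q = queries[:, :d] / np.linalg.norm(queries[:, :d], axis=1, keepdims=True)
    c = emb[:, :d] / np.linalg.norm(emb[:, :d], axis=1, keepdims=True)
    approx = top_k(q, c)
    overlap = np.mean([len(set(a) & set(b)) / 10 for a, b in zip(full, approx)])
    print(f"{d:5d} dims: top-10 overlap with full-dimensional search = {overlap:.2f}")
```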