
76 points | unixpickle | 1 comment

I made this website with my wife in mind; it lets you browse for similar fashion products across many different retailers at once.

The backend is written in Swift and is hosted on a single Mac Mini. It performs nearest-neighbor search on the GPU over ~3M product images.

No vector DB, just pure matrix multiplications. Since we aren't doing approximate nearest neighbors but rather sorting all results by distance, it's possible to show different "variety" levels by changing the stride over the sorted search results.
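
Conceptually the search loop is just: embed the query, take a dot product against every stored embedding, sort everything, and stride over the ranking. Below is a minimal CPU sketch in Swift of that idea; the function and parameter names are my own, and it uses Accelerate on the CPU rather than the GPU path the site actually runs.

    import Accelerate

    // Minimal sketch of exact nearest-neighbor search by brute-force dot
    // products, plus the stride trick for "variety". Names here are my own;
    // the real backend does the equivalent matrix multiplication on the GPU.
    func search(vectors: [[Float]],   // item embeddings, assumed L2-normalized
                query: [Float],       // query embedding, assumed L2-normalized
                k: Int,
                strideStep: Int = 1) -> [Int] {
        // Cosine similarity reduces to a plain dot product for unit vectors.
        let scores = vectors.map { vDSP.dot($0, query) }
        // Exact search: rank *every* item by similarity, highest first.
        let ranked = scores.indices.sorted { scores[$0] > scores[$1] }
        // A larger stride skips over clusters of near-duplicates at the top,
        // trading a little relevance for more visual variety.
        return stride(from: 0, to: min(k * strideStep, ranked.count), by: strideStep)
            .map { ranked[$0] }
    }

With strideStep = 1 this is the plain top-k; larger values spread the returned items further down the full ranking, which is only possible because the whole result set is sorted rather than approximated.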

Nearest neighbors are computed in a latent vector space. The model which produces the vectors is also something I trained in pure Swift.

The underlying data is about 2TB scraped from https://www.shopltk.com/.

All the code is at https://github.com/unixpickle/LTKlassifier

fredophile No.43376657
Out of curiosity, what's the size of vectors you're using (# of dimensions) and what distance metric are you using? Euclidean?
unixpickle No.43376847
To optimize for fast nearest neighbors, I chose 256 dims. Notably, this actually hurt some of the pre-training classification losses pretty severely compared to 2k dims, so it definitely has a quality cost.

The site uses cosine distance. The code itself implements Euclidean distance, but I decided to normalize the vectors last minute out of FUD that some unusually small vectors would appear as neighbors for an abnormal number of examples.
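
For what it's worth, normalizing makes the two metrics agree on ordering: for unit vectors a and b, ||a - b||^2 = 2 - 2*dot(a, b), so sorting by Euclidean distance produces the same ranking as cosine similarity. A small sketch of the normalization step (the helper name is mine, not from the repo):

    import Accelerate

    // Scale a vector to unit length so Euclidean distance between embeddings
    // becomes a monotonic function of cosine similarity.
    func normalized(_ v: [Float]) -> [Float] {
        let norm = sqrt(vDSP.sumOfSquares(v))
        return norm > 0 ? vDSP.divide(v, norm) : v
    }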