
86 points alexop | 1 comment
raverbashing ◴[] No.43329726[source]
Meanwhile, in Python this is just:

    (a @ b.T)/(np.linalg.norm(a)*np.linalg.norm(b))
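
For reference, a minimal runnable version of that one-liner (the example vectors here are made up, not from the thread):

    import numpy as np

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([4.0, 5.0, 6.0])

    # Cosine similarity: dot product divided by the product of the norms.
    # (.T is a no-op on 1-D arrays; it only matters for 2-D row/column vectors.)
    cos_sim = (a @ b.T) / (np.linalg.norm(a) * np.linalg.norm(b))
    print(cos_sim)  # ~0.9746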
replies(1): >>43330264 #
forgotpwd16 ◴[] No.43330264[source]
Because you're using numpy. You could just as well import cosine_similarity from sklearn.
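
A minimal sketch of the sklearn route; note that cosine_similarity (from sklearn.metrics.pairwise) expects 2-D arrays and returns a similarity matrix, so the made-up example below uses shape (1, 3) inputs:

    import numpy as np
    from sklearn.metrics.pairwise import cosine_similarity

    a = np.array([[1.0, 2.0, 3.0]])  # shape (1, 3): one sample, three features
    b = np.array([[4.0, 5.0, 6.0]])

    # Returns a (1, 1) matrix; [0, 0] pulls out the single similarity value.
    print(cosine_similarity(a, b)[0, 0])  # ~0.9746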
replies(1): >>43331468 #
Etherlord87 ◴[] No.43331468[source]
You could also normalize both vectors (divide all components by the magnitude) and simply take the dot product?
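
A minimal sketch of that approach, using the same made-up example vectors as above. Pre-normalizing pays off when one vector is compared against many others, since each vector only needs to be normalized once:

    import numpy as np

    a = np.array([1.0, 2.0, 3.0])
    b = np.array([4.0, 5.0, 6.0])

    # Normalize each vector to unit length; the dot product of unit vectors
    # is exactly the cosine similarity.
    a_unit = a / np.linalg.norm(a)
    b_unit = b / np.linalg.norm(b)
    print(a_unit @ b_unit)  # ~0.9746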