Vector Similarity Metrics
The vector search metric defines the distance measure between vectors during a search. From a mathematical point of view, the K nearest neighbors are the vectors with the smallest distance from the search query vector.
In vector search, a higher score reflects greater similarity, i.e. a smaller distance, so an algebraic conversion from distance to score is required. This relation is given in the Hyperspace Score column.
Metric | Distance Measure | Hyperspace Score |
---|---|---|
l2 | $d(x, y) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}$ | |
ip | $d(x, y) = -\sum_{i=1}^{n} x_i y_i$ | |
hamming | $d(x, y) = \sum_{i=1}^{n} \left(1 - \delta_{x_i, y_i}\right)$ | |
Euclidean Metric (l2)
The metric quantifies the similarity as the distance between two points in a Euclidean space. The metric is derived from the Pythagorean theorem and represents the length of the shortest path between the two points:

$$d(x, y) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}$$
Note that the $l_2$ metric is a special case of the $l_p$ metric with $p = 2$:

$$d_p(x, y) = \left( \sum_{i=1}^{n} |x_i - y_i|^p \right)^{1/p}$$
Hyperspace uses the squared $l_2$ metric for calculation efficiency. This does not affect the ordering of the candidates, since the square root is a monotonic function.
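As a rough illustration (a minimal sketch, not Hyperspace's internal code; the function name and sample vectors are made up), the squared $l_2$ distance can be computed like this:

```python
import numpy as np

def squared_l2_distance(x: np.ndarray, y: np.ndarray) -> float:
    """Squared Euclidean distance: the sum of squared coordinate differences."""
    diff = x - y
    return float(np.dot(diff, diff))

# Dropping the square root keeps the candidate ordering, because sqrt is monotonic.
query = np.array([0.1, 0.4, 0.8])
candidate = np.array([0.2, 0.3, 0.9])
print(squared_l2_distance(query, candidate))  # ~0.03
```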
Inner Product (ip)
The inner product metric quantifies the similarity between two vectors as $-1$ times the projection of one vector onto the other, which is $-1$ if the vectors are parallel and $0$ if they are perpendicular. The minus sign ensures that minimal distance corresponds to maximal similarity. In a Euclidean vector space, the inner product distance is:

$$d(x, y) = -\sum_{i=1}^{n} x_i y_i$$
Vectors must be normalized before using the IP metric.
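The sketch below shows how normalization interacts with the ip metric (illustrative Python only, not the Hyperspace client API; the helper names are hypothetical):

```python
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    """Scale a vector to unit length before indexing or querying with ip."""
    return v / np.linalg.norm(v)

def ip_distance(x: np.ndarray, y: np.ndarray) -> float:
    """Inner-product distance: -1 times the dot product.

    For unit vectors this equals -cos(theta): parallel vectors give -1
    (minimal distance, maximal similarity) and perpendicular vectors give 0.
    """
    return float(-np.dot(x, y))

query = normalize(np.array([3.0, 4.0]))
candidate = normalize(np.array([6.0, 8.0]))  # parallel to the query
print(ip_distance(query, candidate))         # -1.0
```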
Hamming Distance (hamming)
The Hamming distance metric quantifies the similarity between two lists by measuring the number of positions at which the corresponding symbols differ. The metric operates over binary strings and counts the bits that differ between two binary vectors; the lower the Hamming distance, the more similar the vectors are considered to be.
The Hamming distance is given by

$$d(x, y) = \sum_{i=1}^{n} \left( 1 - \delta_{x_i, y_i} \right)$$

where $\delta_{x_i, y_i}$ is the Kronecker delta. The above formula counts the number of positions at which $x$ and $y$ differ.
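A small Python sketch of one way such a bit-level count can be computed (an illustrative assumption, not the engine's implementation):

```python
def hamming_distance(x: bytes, y: bytes) -> int:
    """Number of bit positions at which two binary vectors differ."""
    if len(x) != len(y):
        raise ValueError("binary vectors must have the same length")
    # XOR marks each differing bit with a 1; counting the 1s gives the distance.
    return sum(bin(a ^ b).count("1") for a, b in zip(x, y))

# 0b1010 and 0b0011 differ in two bit positions, so the distance is 2.
print(hamming_distance(bytes([0b1010]), bytes([0b0011])))  # 2
```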