Releases: alexklibisz/elastiknn
0.1.0-PRE28
- Introduced multiprobe L2 LSH. It's a small change; search for `probes` in the API docs.
- Bug fix for an edge case in approximate queries.
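Multiprobe LSH reduces the number of hash tables needed by probing not just the bucket a query vector hashes to, but also nearby buckets. The following Python sketch illustrates the general idea for a single scalar L2 hash; the hash function and the simple ±1, ±2, … probing sequence are simplifying assumptions, not elastiknn's implementation:

```python
def l2_hash(v, a, b, w):
    """Scalar L2 LSH hash: floor((a . v + b) / w)."""
    return int((sum(ai * vi for ai, vi in zip(a, v)) + b) // w)

def probes(v, a, b, w, n_probes):
    """Yield the query's own bucket, then neighboring buckets at
    increasing distance, so fewer tables achieve similar recall."""
    h = l2_hash(v, a, b, w)
    yield h
    for d in range(1, n_probes + 1):
        yield h + d
        yield h - d
```

With `n_probes = 1` the query checks three buckets per hash function instead of one, trading extra candidate lookups for fewer hash tables.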
0.1.0-PRE27
- Changed to GPLv3 license. See: https://github.com/alexklibisz/elastiknn/blob/master/LICENSE.txt.
0.1.0-PRE26
- Added a permutation LSH model and query, based on the paper Large Scale Image Retrieval with Elasticsearch by Amato et al.
- Several internal improvements, including support for LSH models with repeated hashes.
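The core idea of the permutation technique, roughly as described by Amato et al., is to represent each vector by the ranked indices of its most prominent dimensions, which can then be indexed and matched as terms in an inverted index. An illustrative Python sketch (not the plugin's implementation):

```python
def permutation_features(v, k):
    """Return the top-k dimension indices of v, ordered by descending
    absolute value. Vectors whose largest dimensions agree (and agree
    on rank) share more terms, approximating vector similarity with
    ordinary term matching."""
    ranked = sorted(range(len(v)), key=lambda i: -abs(v[i]))
    return ranked[:k]
```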
0.1.0-PRE25
- Performance improvements for LSH queries. 1.5-2x faster on regular benchmarks with randomized data. See PR #114.
0.1.0-PRE24
- Fixed an error with KNN queries against vectors stored in nested fields, e.g. `outer.inner.vec`.
0.1.0-PRE23
- Switched LSH parameter names to more canonical equivalents: `bands` -> `L`, `rows` -> `k`, based on the LSH Wikipedia article and material from Indyk et al., e.g. these slides.
- Added a `k` parameter to the Hamming LSH model, which lets you concatenate more than one bit to form a single hash value.
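Under this naming, `k` is the number of elementary hash functions concatenated per table (an AND-construction, making each hash more selective) and `L` is the number of tables (an OR-construction, recovering recall). A minimal Python sketch of the idea for Hamming LSH over bit vectors, assuming random bit sampling — not the plugin's actual code:

```python
import random

def make_hamming_lsh(dim, L, k, seed=0):
    """Build L hash functions, each concatenating k randomly sampled
    bit positions of a dim-length bit vector into one hash value."""
    rng = random.Random(seed)
    tables = [[rng.randrange(dim) for _ in range(k)] for _ in range(L)]
    def hashes(bits):
        # One hash value per table: the tuple of the k sampled bits.
        return [tuple(bits[i] for i in idxs) for idxs in tables]
    return hashes
```

Larger `k` shrinks each bucket (fewer false candidates); larger `L` gives a vector more chances to collide with its true neighbors.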
0.1.0-PRE22
- Switched the Scala client to store the ID as a doc-value field. This avoids decompressing the document source when reading results, which is about 40% faster in benchmarks for both exact and approximate search.
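For context, Elasticsearch can return doc-values fields directly via `docvalue_fields` while skipping `_source` entirely, which avoids the decompression cost mentioned above. A hedged sketch of what such a request body could look like (the `id` field name and `match_all` query are illustrative placeholders):

```python
# Hypothetical Elasticsearch search body: read the "id" doc-values
# field and suppress _source so stored fields are never decompressed.
search_body = {
    "query": {"match_all": {}},  # an elastiknn query would go here
    "docvalue_fields": ["id"],
    "_source": False,
}
```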
0.1.0-PRE21
- Re-implemented LSH and sparse-indexed queries using an optimized custom Lucene query based on TermInSetQuery. This is 3-5x faster on LSH benchmarks.
- Updated L1 and L2 similarities so that they're bounded in [0, 1].
0.1.0-PRE20
- Added an option for LSH queries to use more-like-this heuristics to pick a subset of LSH hashes for retrieving candidate vectors. This uses Lucene's MoreLikeThis class to select hashes based on index statistics. It's generally much faster than using all of the hashes and yields comparable recall, but it's still disabled by default.
0.1.0-PRE19
- Omitted norms in LSH and sparse indexed queries. This shaves ~15% off the runtime of a sparse indexed benchmark. The results for LSH weren't as meaningful, unfortunately.