ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction


Abstract: Neural information retrieval (IR) has greatly advanced search and other knowledge-intensive language tasks. While many neural IR methods encode queries and documents into single-vector representations, late interaction models produce multi-vector representations at the granularity of each token and decompose relevance modeling into scalable token-level computations. This decomposition has been shown to make late interaction more effective, but it inflates the space footprint of these models by an order of magnitude. In this work, we introduce ColBERTv2, a retriever that couples an aggressive residual compression mechanism with a denoised supervision strategy to simultaneously improve the quality and space footprint of late interaction. We evaluate ColBERTv2 across a wide range of benchmarks, establishing state-of-the-art quality within and outside the training domain while reducing the space footprint of late interaction models by 6--10$\times$.
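The "scalable token-level computations" the abstract refers to are the MaxSim operation of the ColBERT family: each query token embedding is matched against its most similar document token embedding, and these maxima are summed. The following is a minimal NumPy sketch of that scoring rule under the assumption of L2-normalized embedding rows; the function name and shapes are illustrative, not the authors' implementation.

```python
import numpy as np

def late_interaction_score(Q: np.ndarray, D: np.ndarray) -> float:
    """ColBERT-style MaxSim scoring (a sketch).

    Q: (num_query_tokens, dim) query token embeddings, rows L2-normalized.
    D: (num_doc_tokens, dim) document token embeddings, rows L2-normalized.
    Returns the sum over query tokens of the max cosine similarity
    against any document token.
    """
    sim = Q @ D.T                  # token-by-token similarity matrix
    return float(sim.max(axis=1).sum())  # max over doc tokens, sum over query tokens
```

Because each document's token embeddings can be precomputed and indexed, relevance scoring reduces to cheap dot products and maxima at query time; it is these stored multi-vector representations that ColBERTv2's residual compression shrinks.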

From: Omar Khattab [view email]
[v1] Thu, 2 Dec 2021 18:38:50 UTC (570 KB)
[v2] Thu, 16 Dec 2021 05:34:49 UTC (573 KB)
[v3] Sun, 10 Jul 2022 17:28:51 UTC (627 KB)

Summary
The article introduces ColBERTv2, a neural information retrieval model that enhances late interaction by combining residual compression and denoised supervision. This approach improves retrieval quality while reducing the space footprint of late interaction models by 6--10$\times$. ColBERTv2 achieves state-of-the-art results on a wide range of benchmarks, both within and beyond the training domain.