
SPLADE: Sparse Lexical and Expansion Model for First Stage Ranking

Abstract: In neural Information Retrieval, ongoing research is directed towards improving the first retriever in ranking pipelines. Learning dense embeddings to conduct retrieval using efficient approximate nearest neighbors methods has proven to work well. Meanwhile, there has been a growing interest in learning sparse representations for documents and queries that could inherit the desirable properties of bag-of-words models, such as the exact matching of terms and the efficiency of inverted indexes. In this work, we present a new first-stage ranker based on explicit sparsity regularization and a log-saturation effect on term weights, leading to highly sparse representations and competitive results with respect to state-of-the-art dense and sparse methods. Our approach is simple and trained end-to-end in a single stage. We also explore the trade-off between effectiveness and efficiency by controlling the contribution of the sparsity regularization.

CCS Concepts: • Information systems → Language models.
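The abstract's two key ingredients, a log-saturation effect on term weights and an explicit sparsity regularization, can be illustrated with a short PyTorch sketch. This is a minimal approximation, not the paper's exact configuration: it assumes a BERT masked-language-model backbone loaded via HuggingFace transformers, a log(1 + ReLU(·)) transform of the MLM logits aggregated over token positions, and a FLOPS-style sparsity penalty; the model name, pooling choice, and scoring setup below are illustrative assumptions.

```python
# Sketch of a SPLADE-style sparse representation (assumed setup, not the official code).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

def splade_representation(texts):
    """Map texts to |vocab|-dimensional vectors via log-saturated MLM logits."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    logits = model(**batch).logits                    # (batch, seq_len, vocab)
    weights = torch.log1p(torch.relu(logits))         # log-saturation of positive term weights
    mask = batch["attention_mask"].unsqueeze(-1)      # ignore padding positions
    return (weights * mask).sum(dim=1)                # aggregate over token positions

def flops_regularizer(reps):
    """Sparsity penalty: sum over vocabulary of the squared mean activation in the batch."""
    return (reps.mean(dim=0) ** 2).sum()

# Ranking score: dot product between query and document representations.
with torch.no_grad():
    q = splade_representation(["what is sparse retrieval"])
    d = splade_representation(["SPLADE learns sparse lexical and expansion representations"])
    print((q * d).sum().item())
```

During training, the regularizer would be added to the ranking loss with a weight that controls the contribution of the sparsity term, which is the effectiveness/efficiency trade-off knob mentioned in the abstract.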
Document type: Conference papers

https://hal.sorbonne-universite.fr/hal-03290774
Contributor: HAL Sorbonne Université Gestionnaire
Submitted on: Monday, July 19, 2021 - 2:51:04 PM
Last modification on: Wednesday, July 21, 2021 - 3:48:35 AM

File: 3404835.3463098.pdf
Publication funded by an institution

Identifiers
HAL Id: hal-03290774
DOI: 10.1145/3404835.3463098

Citation

Thibault Formal, Benjamin Piwowarski, Stéphane Clinchant. SPLADE: Sparse Lexical and Expansion Model for First Stage Ranking. SIGIR '21: The 44th International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul 2021, Virtual Event, Canada. pp.2288-2292, ⟨10.1145/3404835.3463098⟩. ⟨hal-03290774⟩
