Affiliation:
1. University of Delaware, Newark, DE, USA
2. University of Waterloo, Waterloo, ON, Canada
Abstract
This work tackles the perennial problem of reproducible baselines in information retrieval research, focusing on bag-of-words ranking models. Although academic information retrieval researchers have a long history of building and sharing systems, these systems are primarily designed to facilitate the publication of research papers. As such, they are often incomplete, inflexible, poorly documented, difficult to use, and slow, particularly in the context of modern web-scale collections. Furthermore, the growing complexity of modern software ecosystems and the resource constraints most academic research groups operate under make maintaining open-source systems a constant struggle. However, except for a small number of companies (mostly commercial web search engines) that deploy custom infrastructure, Lucene has become the de facto platform in industry for building search applications. Lucene has an active developer base, a large audience of users, and diverse capabilities to work with heterogeneous collections at scale. However, it lacks systematic support for ad hoc experimentation using standard test collections. We describe Anserini, an information retrieval toolkit built on Lucene that fills this gap. Our goal is to simplify ad hoc experimentation and allow researchers to easily reproduce results with modern bag-of-words ranking models on diverse test collections. With Anserini, we demonstrate that Lucene provides a suitable framework for supporting information retrieval research. Experiments show that our system efficiently indexes large web collections, provides modern ranking models that are on par with research implementations in terms of effectiveness, and supports low-latency query evaluation to facilitate rapid experimentation.
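To illustrate the abstract's claim that Lucene can serve as a framework for ad hoc retrieval experiments with bag-of-words models, the following minimal Java sketch indexes a toy collection and ranks it with Lucene's BM25Similarity. This is not Anserini's code: the field name "contents", the BM25 parameters (k1 = 0.9, b = 0.4), and the toy documents are illustrative assumptions.

// Minimal sketch (not Anserini itself): BM25 ranking over a toy collection with stock Lucene.
import org.apache.lucene.analysis.en.EnglishAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.TextField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.similarities.BM25Similarity;
import org.apache.lucene.store.ByteBuffersDirectory;
import org.apache.lucene.store.Directory;

public class Bm25Demo {
  public static void main(String[] args) throws Exception {
    Directory dir = new ByteBuffersDirectory();
    EnglishAnalyzer analyzer = new EnglishAnalyzer();

    // Index a toy "collection" of two documents; Anserini does this at web scale.
    IndexWriterConfig cfg = new IndexWriterConfig(analyzer)
        .setSimilarity(new BM25Similarity(0.9f, 0.4f));  // k1, b: assumed illustrative values
    try (IndexWriter writer = new IndexWriter(dir, cfg)) {
      for (String text : new String[] {
          "reproducible ranking baselines for information retrieval",
          "bag-of-words models such as BM25 remain strong baselines"}) {
        Document doc = new Document();
        doc.add(new TextField("contents", text, Field.Store.YES));  // "contents" is an assumed field name
        writer.addDocument(doc);
      }
    }

    // Evaluate an ad hoc query, ranking documents with the same BM25 parameters.
    try (DirectoryReader reader = DirectoryReader.open(dir)) {
      IndexSearcher searcher = new IndexSearcher(reader);
      searcher.setSimilarity(new BM25Similarity(0.9f, 0.4f));
      ScoreDoc[] hits = searcher.search(
          new QueryParser("contents", analyzer).parse("ranking baselines"), 10).scoreDocs;
      for (ScoreDoc hit : hits) {
        System.out.printf("%.4f %s%n", hit.score, searcher.doc(hit.doc).get("contents"));
      }
    }
  }
}

In this sketch, the same similarity is set at both indexing and search time so that index-time norms and query-time scoring agree, which is the standard way to swap ranking models in Lucene.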
Funder
National Science Foundation
Natural Sciences and Engineering Research Council of Canada
Publisher
Association for Computing Machinery (ACM)
Subject
Information Systems and Management, Information Systems
Cited by
118 articles.
1. A novel re-ranking architecture for patent search;World Patent Information;2024-09
2. Query Variability and Experimental Consistency: A Concerning Case Study;Proceedings of the 2024 ACM SIGIR International Conference on Theory of Information Retrieval;2024-08-02
3. Multi-granular Adversarial Attacks against Black-box Neural Ranking Models;Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval;2024-07-10
4. Resources for Brewing BEIR: Reproducible Reference Models and Statistical Analyses;Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval;2024-07-10
5. Utilizing passage‐level relevance and kernel pooling for enhancing BERT‐based document reranking;Computational Intelligence;2024-06