Affiliation:
1. University of Chinese Academy of Sciences, Beijing, China
2. Institute of Software, Chinese Academy of Sciences, Beijing, China
Abstract
Dense retrieval (DR) extends the use of pre-trained language models, such as BERT, to text ranking. However, recent studies have raised robustness concerns for DR models under query variations, such as queries with typos, reporting non-trivial performance losses. Herein, we argue that it is beneficial to allow the DR model to learn to align the relative positions of query-passage pairs in the representation space, since query variations cause the query vector to drift away from its original position, degrading subsequent DR effectiveness. To this end, we propose RoDR, a novel robust DR model that learns to calibrate the in-batch local ranking of a query variation to that of the original query, thereby aligning the DR space. Extensive experiments on the MS MARCO and ANTIQUE datasets show that RoDR significantly improves retrieval results on both the original queries and different types of query variations. Moreover, RoDR provides a general query-noise-tolerant learning framework that boosts the robustness and effectiveness of various existing DR models. Our code and models are openly available at https://github.com/cxa-unique/RoDR.
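To illustrate the core idea of calibrating in-batch local rankings, here is a minimal sketch, assuming the alignment is implemented as a KL divergence between the similarity distributions that the original query and its variation induce over the in-batch passages. The function names and the specific choice of KL divergence here are illustrative assumptions, not the paper's exact loss; see the linked repository for the actual implementation.

```python
import math

def softmax(scores):
    """Turn raw query-passage similarity scores into a probability
    distribution over the in-batch passages (numerically stable)."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def local_ranking_alignment_loss(orig_scores, var_scores):
    """Hypothetical alignment loss: KL(P_orig || P_var), where each
    distribution is the softmax over one query's similarities to the
    in-batch passages. Minimizing it pushes the variation query's
    local ranking toward the original query's local ranking."""
    p = softmax(orig_scores)
    q = softmax(var_scores)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Example: similarities of one original query and its typoed variation
# to four in-batch passages (toy numbers, not from the paper).
orig = [4.2, 1.1, 0.3, 2.5]
typo = [2.0, 1.5, 0.9, 3.1]   # the typo drifted the ranking
loss = local_ranking_alignment_loss(orig, typo)
```

When the two queries score the passages identically, the loss is zero; the more the variation's induced ranking drifts from the original's, the larger it grows, which is the alignment signal the abstract describes.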
Publisher
International Joint Conferences on Artificial Intelligence Organization
Cited by
5 articles.
1. An Empirical Study on Code Search Pre-trained Models: Academic Progresses vs. Industry Requirements;Proceedings of the 15th Asia-Pacific Symposium on Internetware;2024-07-24
2. Robust Information Retrieval;Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval;2024-07-10
3. Typos-aware Bottlenecked Pre-Training for Robust Dense Retrieval;Proceedings of the Annual International ACM SIGIR Conference on Research and Development in Information Retrieval in the Asia Pacific Region;2023-11-26
4. Black-box Adversarial Attacks against Dense Retrieval Models: A Multi-view Contrastive Learning Method;Proceedings of the 32nd ACM International Conference on Information and Knowledge Management;2023-10-21
5. MIRS: [MASK] Insertion Based Retrieval Stabilizer for Query Variations;Lecture Notes in Computer Science;2023