From Distillation to Hard Negative Sampling

Authors:

Thibault Formal¹, Carlos Lassance², Benjamin Piwowarski³, Stéphane Clinchant²

Affiliations:

1. Naver Labs Europe / Sorbonne Université, ISIR, Meylan, France

2. Naver Labs Europe, Meylan, France

3. Sorbonne Université, ISIR / CNRS, Paris, France

Publisher:

ACM

References (32 articles).

1. Ricardo A. Baeza-Yates and Berthier A. Ribeiro-Neto. 1999. Modern Information Retrieval. ACM Press / Addison-Wesley. http://www.ischool.berkeley.edu/~hearst/irbook/glossary.html

2. Nick Craswell, Bhaskar Mitra, Emine Yilmaz, Daniel Campos, and Ellen M. Voorhees. 2020. Overview of the TREC 2019 deep learning track. arXiv preprint arXiv:2003.07820 (2020).

3. Zhuyun Dai and Jamie Callan. 2020. Context-Aware Term Weighting For First Stage Passage Retrieval. Association for Computing Machinery, New York, NY, USA, 1533--1536. https://doi.org/10.1145/3397271.3401204

4. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2018. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. CoRR abs/1810.04805 (2018). arXiv:1810.04805 http://arxiv.org/abs/1810.04805

5. Thibault Formal, Benjamin Piwowarski, and Stéphane Clinchant. 2021. Match Your Words! A Study of Lexical Matching in Neural Information Retrieval. arXiv:2112.05662 [cs.IR]

Cited by 55 articles.

1. Bridging Dense and Sparse Maximum Inner Product Search. ACM Transactions on Information Systems, 2024-08-19.

2. On Adaptive Knowledge Distillation with Generalized KL-Divergence Loss for Ranking Model Refinement. Proceedings of the 2024 ACM SIGIR International Conference on Theory of Information Retrieval, 2024-08-02.

3. Weighted KL-Divergence for Document Ranking Model Refinement. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2024-07-10.

4. Faster Learned Sparse Retrieval with Block-Max Pruning. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2024-07-10.

5. Ranked List Truncation for Large Language Model-based Re-Ranking. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2024-07-10.
