Affiliation:
Indian Statistical Institute
Abstract
The aim of the Forum for Information Retrieval Evaluation (FIRE) is to create an evaluation framework in the spirit of TREC (Text REtrieval Conference), CLEF (Cross-Language Evaluation Forum), and NTCIR (NII Test Collection for IR Systems), for Indian language Information Retrieval. The first evaluation exercise conducted by FIRE was completed in 2008. This article describes the test collections used at FIRE 2008, summarizes the approaches adopted by various participants, discusses the limitations of the datasets, and outlines the tasks planned for the next iteration of FIRE.
Publisher
Association for Computing Machinery (ACM)
Cited by
16 articles.
1. CIRAL: A Test Collection for CLIR Evaluations in African Languages. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2024-07-10.
2. CIRAL at FIRE 2023: Cross-Lingual Information Retrieval for African Languages. Proceedings of the 15th Annual Meeting of the Forum for Information Retrieval Evaluation, 2023-12-15.
3. HC3: A Suite of Test Collections for CLIR Evaluation over Informal Text. Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2023-07-18.
4. Gender tagging of named entities using retrieval-assisted multi-context aggregation: An unsupervised approach. Journal of the Association for Information Science and Technology, 2023-01-27.
5. HC4: A New Suite of Test Collections for Ad Hoc CLIR. Lecture Notes in Computer Science, 2022.