Abstract
This article describes a new TREC Enterprise Track search test collection -- CERC. The collection is designed to represent real-world search activity within the enterprise, using the Commonwealth Scientific and Industrial Research Organisation (CSIRO) as a specific example. It contains a deep crawl of CSIRO's public-facing information, closely resembling the crawl behind a real-world search service provided by CSIRO. The search tasks are based on the activities of CSIRO Science Communicators, CSIRO employees who deal with public-facing information. Topics and judgments are tied to the Science Communicators in various ways, for example by involving them in the topic development process. The overall approach is to enhance the validity of the test collection as a model of enterprise search by tying it to real-world examples.
Publisher
Association for Computing Machinery (ACM)
Subject
Hardware and Architecture, Management Information Systems
Cited by
12 articles.
1. What Matters in a Measure? A Perspective from Large-Scale Search Evaluation;Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval;2024-07-10
2. Content-based recommendation for Academic Expert finding;Proceedings of the 5th Spanish Conference on Information Retrieval;2018-06-26
3. PMSC-UGR: A Test Collection for Expert Recommendation Based on PubMed and Scopus;Advances in Artificial Intelligence;2018
4. On Information-Theoretic Document-Person Associations for Expert Search in Academia;Proceedings of the 39th International ACM SIGIR conference on Research and Development in Information Retrieval;2016-07-07
5. The LExR Collection for Expertise Retrieval in Academia;Proceedings of the 39th International ACM SIGIR conference on Research and Development in Information Retrieval;2016-07-07