Authors:
Ting Pang Wei, Rajagopal Prabha, Wang Mengjia, Zhang Shuxiang, Devi Ravana Sri
Abstract
Both the relevance assessment cost and the reliability of an information retrieval (IR) evaluation are highly correlated with the number of topics used. Producing correspondingly large relevance judgments requires many assessors, which incurs high cost and time, so using a large number of topics in a retrieval experiment is neither practical nor economical. This work proposes an approach to identify the most effective topics for evaluating IR systems with regard to topic difficulty. The proposed approach can identify which topics, and what topic set size, are reliable for evaluating system effectiveness. Easy topics appeared to be the most suitable for effectively evaluating IR systems.
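The idea of ranking systems on a reduced set of easy topics can be sketched as follows. This is a hypothetical illustration, not the paper's method: it assumes a system-by-topic score matrix (e.g. average precision per topic) and uses the mean score across systems as a simple proxy for topic difficulty (higher mean = easier topic); the paper's actual difficulty measure and selection procedure are not specified here.

```python
# Sketch: select "easy" topics and compare the system ranking they produce
# against the ranking from the full topic set. Score matrix is hypothetical:
# rows = systems, columns = topics (e.g. average precision values).

def topic_difficulty(scores):
    """Mean effectiveness per topic across all systems (higher = easier)."""
    n_systems = len(scores)
    n_topics = len(scores[0])
    return [sum(row[t] for row in scores) / n_systems for t in range(n_topics)]

def easiest_topics(scores, k):
    """Indices of the k easiest topics, by descending mean score."""
    diff = topic_difficulty(scores)
    return sorted(range(len(diff)), key=lambda t: diff[t], reverse=True)[:k]

def system_ranking(scores, topics):
    """Rank systems (best first) by mean score over the chosen topics."""
    means = [sum(row[t] for t in topics) / len(topics) for row in scores]
    return sorted(range(len(scores)), key=lambda s: means[s], reverse=True)

# Toy data: 3 systems x 4 topics.
scores = [
    [0.9, 0.2, 0.8, 0.1],  # system 0
    [0.7, 0.1, 0.6, 0.3],  # system 1
    [0.5, 0.4, 0.9, 0.2],  # system 2
]
easy = easiest_topics(scores, 2)
full_rank = system_ranking(scores, range(4))
sub_rank = system_ranking(scores, easy)
```

A reduced topic set would then be judged reliable if `sub_rank` correlates strongly with `full_rank` (e.g. via Kendall's tau over many samples), which is the usual way topic-set-reduction studies assess evaluation fidelity.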
Subject
General Physics and Astronomy
Cited by
2 articles.