Affiliation:
1. School of Information and Library Science, University of North Carolina at Chapel Hill, Chapel Hill, NC
2. School of Information and Library Science, University of North Carolina at Chapel Hill, Chapel Hill, NC
Abstract
We report on a crowdsourced study that investigated how two factors influence the way people formulate information requests. Our first factor, medium, considers whether the request is produced using text or voice. Our second factor, target, considers whether the request is intended for a search engine or a human intermediary (i.e., someone who will search on the user's behalf). In particular, we study how these two factors influence the way people formulate requests in situations where the information need has a specific type of extra-topical dimension (i.e., a type of constraint that is independent of the information need's topic). We focus on six extra-topical dimensions: (1) domain knowledge, (2) viewpoint, (3) experiential, (4) venue location, (5) source location, and (6) temporal. The extra-topical dimension was manipulated by giving participants carefully constructed search tasks. We analyzed a large number of information requests produced by study participants and address three research questions about the effects of our two factors (medium and target) on: (RQ1) participants' perceptions of their own information requests, (RQ2) the characteristics of their information requests (e.g., natural language structure, retrieval performance), and (RQ3) participants' strategies for requesting information when the search task has a specific type of extra-topical dimension. Our results show that both factors influenced participants' perceptions of their own information requests, the characteristics of their requests, and the strategies they adopted to request information matching the extra-topical dimension. These findings have implications for future research on methods that can harness (rather than ignore) extra-topical query terms to retrieve relevant information.
Funder
National Science Foundation
Publisher
Association for Computing Machinery (ACM)
Subject
Computer Science Applications; General Business, Management and Accounting; Information Systems
Cited by: 10 articles.