Abstract
Purpose
This paper seeks to overcome the mismatch between evaluation reports and the expectations of the target audience by identifying crisis management professionals' expectations.

Design/methodology/approach
An adapted stakeholder information analysis was used to survey the expectations of 84 crisis management professionals in the Netherlands. A general inductive analysis was applied to the qualitative data, from which five main themes emerged: purpose; object or focus; reasoning and (meta-)analysis; result or conclusion; and the overall design of the evaluation.

Findings
Currently, evaluation reports are seen merely as a way to share experience and support thinking about how to avoid repeating mistakes. However, most respondents expected them to contribute to learning and support improvement. Reports should provide actionable feedback on what could be done differently or better, and indicate how this can be achieved. Respondents emphasised the need to focus on the human factor and not to neglect the context. The wide variety of views underlined that it is difficult to create one evaluation product that meets all expectations.

Research limitations/implications
Although some major themes clearly emerged from the data, it is unclear how they relate to each other or what their relative importance is. In addition, no distinction is made between evaluations of real events and evaluations of simulations.

Practical implications
Users should be encouraged to provide input into the evaluation process by clarifying their needs and how they use evaluation reports.

Originality/value
This research is the first attempt to identify user expectations regarding what constitutes an effective evaluation.
Subject
Management Science and Operations Research, Safety Research
Cited by
1 article.