Affiliation:
1. Toronto Research Centre, Defence Research and Development Canada, Toronto, Ontario, Canada
2. Department of Psychology, York University, Toronto, Ontario, Canada
Abstract
Experts are expected to make well‐calibrated judgments within their field, yet a voluminous literature demonstrates miscalibration in human judgment. Calibration training aimed at improving subsequent calibration performance offers a potential solution. We tested the effect of commercial calibration training on a group of 70 intelligence analysts by comparing the miscalibration and bias of their judgments before and after a commercial training course intended to improve calibration on interval estimation and binary choice tasks. Training significantly improved calibration and bias overall, but this effect was contingent on the task. For interval estimation, analysts were overconfident before training and became better calibrated after training. For the binary choice task, however, analysts were initially underconfident, and bias increased in this same direction post‐training. Improvement on the two tasks was also uncorrelated. Taken together, the results indicate that the training shifted analyst bias toward lower confidence rather than improving metacognitive monitoring ability.