Affiliation:
1. Foundation for Neural Networks, University of Nijmegen, 6525 EZ Nijmegen, The Netherlands
Abstract
The bias/variance decomposition of mean-squared error is well understood and relatively straightforward. In this note, a similar simple decomposition is derived, valid for any kind of error measure that, when using the appropriate probability model, can be derived from a Kullback-Leibler divergence or log-likelihood.
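The abstract does not spell out the form of either decomposition. The sketch below is a minimal numerical illustration, not taken from the paper: the mean-squared-error identity is the standard one, and the Kullback-Leibler part assumes, for illustration only, a likelihood-based analogue in which the averaged model is the normalized geometric mean of the fitted distributions.

import numpy as np

rng = np.random.default_rng(0)

# --- Mean-squared error: E[(y - f)^2] = (y - E[f])^2 + Var(f) ------------
y = 1.5                                            # fixed target
f = rng.normal(loc=1.0, scale=0.3, size=5_000)     # estimates over "data sets"
mse      = np.mean((y - f) ** 2)
bias_sq  = (y - f.mean()) ** 2
variance = f.var()
print(f"MSE {mse:.4f} = bias^2 {bias_sq:.4f} + variance {variance:.4f}")

# --- Kullback-Leibler divergence (assumed illustrative form) -------------
def kl(p, q):
    # KL(p || q) for discrete distributions on the same support
    return np.sum(p * np.log(p / q))

t = np.array([0.2, 0.5, 0.3])                      # "true" distribution
# an ensemble of fitted distributions, one per hypothetical data set
logits = np.log(t) + rng.normal(scale=0.2, size=(5_000, 3))
P = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# averaged model: normalized geometric mean of the fitted distributions
g = np.exp(np.mean(np.log(P), axis=0))
p_bar = g / g.sum()

avg_err  = np.mean([kl(t, p) for p in P])          # E_D[ KL(t || p_D) ]
bias     = kl(t, p_bar)                            #      KL(t || p_bar)
var_term = np.mean([kl(p_bar, p) for p in P])      # E_D[ KL(p_bar || p_D) ]
print(f"avg KL {avg_err:.4f} = bias {bias:.4f} + variance {var_term:.4f}")

Both printed identities hold exactly up to floating-point error; the choice of the normalized geometric mean as the "average" model is what makes the KL variance term non-negative in this sketch.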
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
63 articles.