Affiliations:
1. Stuttgart Center for Simulation Science, Cluster of Excellence EXC 2075, University of Stuttgart, 70569 Stuttgart, Germany
2. Hydrology and Atmospheric Sciences, The University of Arizona, Tucson, AZ 85721, USA
3. Institute of Water and River Basin Management, Karlsruhe Institute of Technology (KIT), 76131 Karlsruhe, Germany
Abstract
Using information-theoretic quantities in practical applications with continuous data is often hindered by the fact that probability density functions need to be estimated in higher dimensions, which can become unreliable or even computationally infeasible. To make these useful quantities more accessible, alternative approaches such as binned frequencies using histograms and k-nearest neighbors (k-NN) have been proposed. However, a systematic comparison of the applicability of these methods has been lacking. We fill this gap by comparing kernel density estimation (KDE) with these two alternatives in carefully designed synthetic test cases. Specifically, we estimate the information-theoretic quantities entropy, Kullback–Leibler divergence, and mutual information from sample data. As a reference, the results are compared to closed-form solutions or numerical integrals. We generate samples from distributions of various shapes in dimensions ranging from one to ten, and we evaluate the estimators’ performance as a function of sample size, distribution characteristics, and chosen hyperparameters. We further compare the required computation time and specific implementation challenges. Notably, k-NN estimation tends to outperform the other methods in terms of algorithmic implementation, computational efficiency, and estimation accuracy, especially when sufficient data are available. This study provides valuable insights into the strengths and limitations of the different estimation methods for information-theoretic quantities. It also highlights the significance of considering the characteristics of the data as well as the targeted information-theoretic quantity when selecting an appropriate estimation technique. These findings will assist scientists and practitioners in choosing the most suitable method for their specific application and available data. We have collected the compared estimation methods in a ready-to-use open-source Python 3 toolbox and thereby hope to promote the use of information-theoretic quantities by researchers and practitioners to evaluate the information in data and models in various disciplines.
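For illustration, the following minimal Python 3 sketch shows one standard k-NN approach of the kind compared in the study, the Kozachenko–Leonenko entropy estimator, applied to samples from a multivariate standard normal distribution and checked against the closed-form entropy. It uses only NumPy and SciPy; the function and variable names are illustrative assumptions and do not reflect the toolbox's actual API.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=4):
    """Kozachenko-Leonenko k-NN entropy estimate (in nats)."""
    n, d = samples.shape
    tree = cKDTree(samples)
    # Distance to the k-th nearest neighbor; query with k+1 because
    # each point's nearest neighbor (distance 0) is the point itself.
    eps, _ = tree.query(samples, k=k + 1)
    eps = eps[:, -1]
    # Log-volume of the d-dimensional unit ball (Euclidean norm).
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

rng = np.random.default_rng(42)
d = 3
samples = rng.standard_normal((10_000, d))
# Closed-form reference: H(N(0, I_d)) = (d/2) * ln(2*pi*e)
h_true = 0.5 * d * np.log(2 * np.pi * np.e)
print(f"k-NN estimate: {knn_entropy(samples):.3f} nats, analytical: {h_true:.3f} nats")

Here the analytical reference value is 1.5·ln(2πe) ≈ 4.257 nats, and the k-NN estimate typically lands close to it at this sample size; the paper's systematic comparison quantifies such errors across sample sizes, dimensions, and distribution shapes.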
Funder
Deutsche Forschungsgemeinschaft