Affiliation:
1. Biomolecular Dynamics, Institute of Physics, University of Freiburg, 79104 Freiburg, Germany
Abstract
While the linear Pearson correlation coefficient represents a well-established normalized measure to quantify the interrelation of two stochastic variables X and Y, it fails for multidimensional variables, such as Cartesian coordinates. Avoiding any assumption about the underlying data, the mutual information I(X, Y) does account for multidimensional correlations. However, unlike the normalized Pearson correlation, it has no upper bound (I ∈ [0, ∞)), i.e., it is not clear whether, say, I = 0.4 corresponds to a low or a high correlation. Moreover, the mutual information (MI) involves the estimation of high-dimensional probability densities (e.g., six-dimensional for Cartesian coordinates), which requires a k-nearest-neighbor algorithm, such as the estimator by Kraskov et al. [Phys. Rev. E 69, 066138 (2004)]. As existing methods to normalize the MI cannot be used in connection with this estimator, a new approach is presented, which uses an entropy estimation method that is invariant under variable transformations. The algorithm is numerically efficient and does not require more effort than the calculation of the (un-normalized) MI. After validating the method by applying it to various toy models, the normalized MI between the Cα-coordinates of T4 lysozyme is considered and compared to a correlation analysis of inter-residue contacts.
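As background for the k-nearest-neighbor estimator referenced above, a minimal sketch of the Kraskov et al. MI estimator (their algorithm 1), written with NumPy/SciPy, might look as follows. The function name ksg_mutual_information and the choice k = 5 are illustrative assumptions; this sketch computes only the un-normalized MI and does not implement the normalization proposed in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=5):
    """Kraskov-Stoegbauer-Grassberger (algorithm 1) estimate of I(X, Y) in nats.

    x : (N, d_x) array of samples, y : (N, d_y) array of samples,
    k : number of nearest neighbors in the joint space.
    """
    n = len(x)
    xy = np.hstack((x, y))

    # Distance to the k-th nearest neighbor in the joint space (max-norm);
    # column 0 of the query result is the point itself at distance zero.
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # Count neighbors strictly closer than eps in each marginal space
    # (subtract 1 to exclude the point itself).
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])

    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Quick consistency check on a bivariate Gaussian, where the exact MI is
# -0.5 * ln(1 - rho^2); the estimate should be close to ~0.51 nats for rho = 0.8.
rng = np.random.default_rng(1)
rho = 0.8
data = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
print(ksg_mutual_information(data[:, :1], data[:, 1:], k=5))
print(-0.5 * np.log(1.0 - rho**2))
```

The Gaussian check also illustrates the abstract's point: the raw estimate (here roughly 0.5 nats) carries no intrinsic scale, which is what motivates a normalized variant.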
Funder
Deutsche Forschungsgemeinschaft