Affiliation:
1. School of Computer Science, University of Birmingham, Edgbaston, B15 2TT Birmingham, UK
Abstract
Singular covariance matrices are frequently encountered in both machine learning and optimization problems, most commonly due to the high dimensionality of the data and insufficient sample sizes. Among the many methods of regularization, here we focus on a relatively recent random matrix-theoretic approach, the idea of which is to create well-conditioned approximations of a singular covariance matrix and its inverse by taking the expectation of its random projections. We are interested in the error of a Monte Carlo implementation of this approach, which in practice allows subsequent parallel processing in low dimensions. We find that [Formula: see text] random projections, where [Formula: see text] is the size of the original matrix, are sufficient for the Monte Carlo error to become negligible, in the sense of the expected spectral norm difference, for both covariance and inverse covariance approximation; in the latter case this holds under mild assumptions.
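To make the Monte Carlo construction concrete, the following is a minimal sketch, assuming the inverse-covariance estimator takes the common random-projection form E_R[R^T (R S R^T)^{-1} R] for a wide random matrix R and omitting scaling constants; the function and parameter names are illustrative and not taken from the paper.

import numpy as np

def mc_regularized_inverse(S, d, num_projections, seed=None):
    # Monte Carlo average of R^T (R S R^T)^{-1} R over random Gaussian
    # projections R of shape (d, n) with d < n. Each projected matrix
    # R S R^T is invertible with probability 1 even when S is singular,
    # so the average is a well-conditioned proxy for the inverse of S
    # (up to a scaling constant, omitted in this sketch).
    rng = np.random.default_rng(seed)
    n = S.shape[0]
    acc = np.zeros((n, n))
    for _ in range(num_projections):
        R = rng.standard_normal((d, n))                 # random projection
        acc += R.T @ np.linalg.solve(R @ S @ R.T, R)    # R^T (R S R^T)^{-1} R
    return acc / num_projections

# Rank-deficient sample covariance: more dimensions than samples.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))            # 20 samples in 50 dimensions
S = np.cov(X, rowvar=False)                  # 50 x 50, rank at most 19
S_inv_approx = mc_regularized_inverse(S, d=10, num_projections=200, seed=1)
print(np.linalg.cond(S_inv_approx))          # finite, unlike cond(S)

Each term of the average only involves d x d linear algebra on an independently drawn projection, which is what makes the parallel low-dimensional processing mentioned in the abstract possible.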
Publisher
World Scientific Pub Co Pte Ltd
Subject
Applied Mathematics, Analysis