Abstract
Wyner’s common information is a measure that quantifies the commonality between two random variables. Building on it, we introduce a novel two-step procedure for constructing features from data, referred to as Common Information Components Analysis (CICA). The first step can be interpreted as extracting Wyner’s common information. The second step is a form of back-projection of the common information onto the original variables, yielding the extracted features. A free parameter γ controls the complexity of the extracted features. We establish that, in the case of Gaussian statistics, CICA reduces precisely to Canonical Correlation Analysis (CCA), with γ determining the number of CCA components that are extracted. In this sense, we establish a novel rigorous connection between information measures and CCA, and CICA is a strict generalization of the latter. CICA is shown to have several desirable properties, including a natural extension to more than two data sets.
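The Gaussian-case claim in the abstract is that CICA reduces to classical CCA, with the parameter γ selecting the number of components. As a minimal illustrative sketch (not the authors' implementation), the CCA baseline can be computed via an SVD of the whitened cross-covariance; here the hypothetical parameter `k` simply stands in for the number of extracted components:

```python
import numpy as np

def cca_components(X, Y, k):
    """Classical CCA via SVD of the whitened cross-covariance.

    Illustrative sketch of the CCA baseline that CICA is stated to
    reduce to in the Gaussian case; k (number of components) is a
    stand-in for the role played by the parameter gamma.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / (n - 1)
    Cyy = Y.T @ Y / (n - 1)
    Cxy = X.T @ Y / (n - 1)

    # Inverse matrix square root (assumes full-rank covariances).
    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(K)
    A = inv_sqrt(Cxx) @ U[:, :k]      # projection directions for X
    B = inv_sqrt(Cyy) @ Vt.T[:, :k]   # projection directions for Y
    return A, B, s[:k]                # s[:k] are canonical correlations

# Synthetic Gaussian data sharing one latent component Z.
rng = np.random.default_rng(0)
Z = rng.normal(size=(2000, 1))
X = np.hstack([Z, rng.normal(size=(2000, 2))]) @ rng.normal(size=(3, 3))
Y = np.hstack([Z, rng.normal(size=(2000, 2))]) @ rng.normal(size=(3, 3))
A, B, corrs = cca_components(X, Y, k=1)
# The top canonical correlation is driven by the shared latent Z.
```

Since Z is an exact linear function of both X and Y here, the top canonical correlation is close to 1, which is the kind of shared structure a common-information extraction targets.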
Funder
Swiss National Science Foundation
Subject
General Physics and Astronomy
References: 33 articles.
1. Relations Between Two Sets of Variates
2. On Wyner’s Common Information in the Gaussian Case; Sula; arXiv; 2019
Cited by 3 articles.
1. The Wyner Variational Autoencoder for Unsupervised Multi-Layer Wireless Fingerprinting; GLOBECOM 2023 - 2023 IEEE Global Communications Conference; 2023-12-04
2. Efficient Alternating Minimization Solvers for Wyner Multi-View Unsupervised Learning; 2023 IEEE International Symposium on Information Theory (ISIT); 2023-06-25
3. Lower Bound on Relaxed Wyner’s Common Information; 2021 IEEE International Symposium on Information Theory (ISIT); 2021-07-12