Authors:
Paul Lyonel Hagemann, Johannes Hertrich, Gabriele Steidl
Abstract
Normalizing flows, diffusion normalizing flows, and variational autoencoders are powerful generative models. This Element provides a unified framework that handles these approaches via Markov chains. The authors consider stochastic normalizing flows as a pair of Markov chains satisfying certain properties, and show how many state-of-the-art models for data generation fit into this framework. Indeed, numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chain point of view enables the coupling of deterministic layers, such as invertible neural networks, with stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders, and diffusion normalizing flows, in a mathematically sound way. The authors' framework establishes a useful mathematical tool for combining the various approaches.
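As a rough illustration of the coupling described above, the following minimal Python sketch alternates a deterministic invertible (affine) layer with a stochastic (unadjusted Langevin) layer, pushing samples from a unimodal Gaussian toward a toy bimodal target. The toy target, the layer names, and all parameters are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def affine_layer(x, scale, shift):
    """Deterministic invertible layer: x -> scale * x + shift (toy stand-in
    for an invertible neural network layer)."""
    return scale * x + shift

def grad_log_target(x):
    """Score (gradient of log-density) of an assumed toy bimodal target:
    an equal-weight mixture of two 1D Gaussians centered at -2 and +2."""
    a = np.exp(-0.5 * (x + 2.0) ** 2)
    b = np.exp(-0.5 * (x - 2.0) ** 2)
    return (-(x + 2.0) * a - (x - 2.0) * b) / (a + b)

def langevin_layer(x, step, rng, n_steps=10):
    """Stochastic layer: a few unadjusted Langevin steps toward the target."""
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * grad_log_target(x) + np.sqrt(2.0 * step) * noise
    return x

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)            # unimodal latent samples
x = affine_layer(x, scale=1.5, shift=0.0)  # deterministic invertible layer
x = langevin_layer(x, step=0.05, rng=rng)  # stochastic layer
print("mode means:", x[x < 0].mean(), x[x > 0].mean())
```

Even this two-layer chain produces samples concentrated near both modes, which is the qualitative point of the abstract: stochastic layers let the chain reach multimodal targets that a purely deterministic flow from a unimodal base struggles with.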
Publisher
Cambridge University Press
Cited by 7 articles.