Abstract
Learning neural networks from only a small amount of available data is an important ongoing research topic with tremendous potential for applications. In this paper, we introduce a powerful regularizer for the variational modeling of inverse problems in imaging. Our regularizer, called patch normalizing flow regularizer (patchNR), involves a normalizing flow learned on small patches of very few images. In particular, the training is independent of the considered inverse problem, so that the same regularizer can be applied to different forward operators acting on the same class of images. By investigating the distribution of patches versus that of the whole image class, we prove that our model is indeed a maximum a posteriori approach. Numerical examples for low-dose and limited-angle computed tomography (CT) as well as superresolution of material images demonstrate that our method provides very high-quality results. The training set consists of just six images for CT and one image for superresolution. Finally, we combine our patchNR with ideas from internal learning to perform superresolution of natural images directly from the low-resolution observation, without knowledge of any high-resolution image.
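To make the construction concrete, the following is a minimal sketch of a patch-based normalizing-flow regularizer used inside a variational (MAP) reconstruction, written in PyTorch. The tiny coupling flow, the average-pooling forward operator, the patch size, and all hyperparameters are illustrative assumptions, not the paper's patchNR architecture or CT/superresolution operators; only the overall structure (train a flow on patches of a few reference images, then minimize data fidelity plus the negative patch log-likelihood) follows the description in the abstract.

    # Minimal sketch (not the authors' released code) of a patch-based
    # normalizing-flow regularizer inside a variational (MAP) reconstruction.
    # The coupling flow, the average-pooling forward operator, the patch size
    # and all hyperparameters are illustrative stand-ins.
    import math
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    PATCH = 6                 # small patch side length (illustrative)
    D = PATCH * PATCH         # dimension of a flattened patch


    class Coupling(nn.Module):
        """One affine coupling layer acting on flattened patch vectors."""
        def __init__(self, dim, flip):
            super().__init__()
            self.flip, self.half = flip, dim // 2
            self.net = nn.Sequential(
                nn.Linear(self.half, 64), nn.ReLU(),
                nn.Linear(64, 2 * (dim - self.half)),
            )

        def forward(self, x):
            x1, x2 = x[:, :self.half], x[:, self.half:]
            if self.flip:
                x1, x2 = x2, x1
            s, t = self.net(x1).chunk(2, dim=1)
            s = torch.tanh(s)                       # keep the scales bounded
            y2 = x2 * torch.exp(s) + t
            out = torch.cat([y2, x1], 1) if self.flip else torch.cat([x1, y2], 1)
            return out, s.sum(dim=1)                # transformed patch, log-determinant


    class PatchFlow(nn.Module):
        """Toy normalizing flow providing log-densities of flattened patches."""
        def __init__(self, dim=D, n_layers=4):
            super().__init__()
            self.dim = dim
            self.layers = nn.ModuleList([Coupling(dim, i % 2 == 1) for i in range(n_layers)])

        def log_prob(self, x):
            logdet = x.new_zeros(x.shape[0])
            for layer in self.layers:
                x, ld = layer(x)
                logdet = logdet + ld
            base = -0.5 * (x ** 2).sum(dim=1) - 0.5 * self.dim * math.log(2 * math.pi)
            return base + logdet                    # change-of-variables formula


    def extract_patches(img, stride):
        """Flatten all PATCH x PATCH patches of a (B, 1, H, W) image."""
        return F.unfold(img, PATCH, stride=stride).transpose(1, 2).reshape(-1, D)


    def train_flow(flow, reference_images, steps=2000, lr=1e-3, batch=4096):
        """Fit the flow to patches of a handful of reference images by NLL minimization."""
        patches = torch.cat([extract_patches(img, stride=1) for img in reference_images])
        opt = torch.optim.Adam(flow.parameters(), lr=lr)
        for _ in range(steps):
            idx = torch.randint(0, patches.shape[0], (batch,))
            loss = -flow.log_prob(patches[idx]).mean()
            opt.zero_grad(); loss.backward(); opt.step()


    def patchnr_reconstruct(y, flow, lam=0.1, steps=300, scale=4):
        """MAP-style reconstruction: argmin_x ||A x - y||^2 + lam * patch regularizer.

        Here A is 'scale'-fold average pooling, a stand-in for a generic forward operator.
        """
        x = F.interpolate(y, scale_factor=scale, mode="bilinear", align_corners=False)
        x = x.clone().requires_grad_(True)
        opt = torch.optim.Adam([x], lr=1e-2)
        for _ in range(steps):
            data_fit = ((F.avg_pool2d(x, scale) - y) ** 2).sum()
            reg = -flow.log_prob(extract_patches(x, stride=2)).mean()
            loss = data_fit + lam * reg
            opt.zero_grad(); loss.backward(); opt.step()
        return x.detach()

Under these assumptions, a call such as train_flow(flow, [ref_img]) followed by x_hat = patchnr_reconstruct(y, flow) reflects the two stages described in the abstract: the flow is trained once on patches of a few images, independently of the forward operator, and the same trained regularizer is then used for reconstruction.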
Funder
Klaus Tschira Stiftung
Deutsche Forschungsgemeinschaft
Berlin Mathematics Research Center MATH+
Subject
Applied Mathematics, Computer Science Applications, Mathematical Physics, Signal Processing, Theoretical Computer Science
Cited by
11 articles.