Abstract
We develop an algorithm for jointly estimating the posterior and the noise parameters in Bayesian inverse problems, motivated by indirect measurements in nanometrology applications with a mixed noise model. We propose to solve the problem with an expectation maximization (EM) algorithm. Based on the current noise parameters, the E-step learns a conditional normalizing flow that approximates the posterior. In the M-step, we propose to find the noise parameter updates by an inner EM algorithm, which admits analytical formulas. We compare training of the conditional normalizing flow with the forward and the reverse Kullback–Leibler divergence, and show that our model is able to incorporate information from many measurements, unlike previous approaches.
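The alternation described in the abstract can be sketched on a linear-Gaussian toy problem. This is a hedged illustration, not the paper's method: the forward operator `A`, the standard normal prior, and the purely additive Gaussian noise model are assumptions made for the sketch; the role of the E-step (a conditional normalizing flow in the paper) is played here by the closed-form linear-Gaussian posterior, and the paper's inner EM for the mixed noise model collapses to a single analytical variance update.

```python
import numpy as np

rng = np.random.default_rng(0)

A = 1.0          # hypothetical linear forward operator: y = A*x + noise
PRIOR_VAR = 1.0  # assumed standard normal prior on x

def e_step(y, sigma2, n_samples=500):
    """Draw posterior samples x ~ p(x | y) for each measurement, given the
    current noise variance.  In the paper this step trains a conditional
    normalizing flow; for the linear-Gaussian toy model the posterior is
    available in closed form."""
    post_var = 1.0 / (1.0 / PRIOR_VAR + A**2 / sigma2)
    post_mean = post_var * A * y / sigma2
    return rng.normal(post_mean, np.sqrt(post_var), size=(n_samples, y.size))

def m_step(y, x_samples):
    """Analytical noise-parameter update: the posterior expectation of the
    squared residual, pooled over all measurements (this pooling is what
    lets the model combine information from many measurements)."""
    return np.mean((y[None, :] - A * x_samples) ** 2)

# Simulate many measurements that share one unknown noise variance.
sigma2_true = 4.0
x_true = rng.normal(0.0, np.sqrt(PRIOR_VAR), size=2000)
y = A * x_true + rng.normal(0.0, np.sqrt(sigma2_true), size=2000)

sigma2 = 1.0  # deliberately wrong initial guess
for _ in range(30):
    x_samples = e_step(y, sigma2)   # approximate posterior given noise params
    sigma2 = m_step(y, x_samples)   # analytical noise-parameter update
```

Because the M-step pools residuals over all 2000 measurements, the estimate of the shared noise variance concentrates near its true value, which a single-measurement scheme could not achieve.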
Funder
Deutsche Forschungsgemeinschaft
European Metrology Programme for Innovation and Research
Engineering and Physical Sciences Research Council