Abstract
A mathematical model is proposed for denoising two-dimensional X-ray patterns. The method relies on a generalized diffusion equation whose diffusion coefficient depends on the image gradients. The numerical solution of the diffusion equation provides an efficient reduction of pattern noise, as measured by the computed peak signal-to-noise ratio (PSNR). The use of experimental data with different inherent levels of noise allows us to show the success of the method even in the experimentally relevant case in which patterns are corrupted by Poissonian noise. The corresponding MATLAB code for the numerical method is made available.
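The scheme described above can be illustrated with a short sketch. The MATLAB fragment below is a minimal, illustrative implementation of gradient-dependent (Perona-Malik-type) diffusion solved with an explicit finite-difference scheme; the function name pm_denoise, the parameters niter, kappa and dt, and the exponential form of the diffusivity are assumptions made here for illustration and do not reproduce the authors' released code.

function I = pm_denoise(I, niter, kappa, dt)
% Minimal sketch of gradient-dependent diffusion denoising (assumed form).
% niter : number of iterations, kappa : edge threshold,
% dt    : time step (dt <= 0.25 for stability of the explicit scheme).
    I = double(I);
    for t = 1:niter
        % Finite differences toward the four nearest neighbours
        dN = [I(1,:);     I(1:end-1,:)] - I;
        dS = [I(2:end,:); I(end,:)]     - I;
        dW = [I(:,1),     I(:,1:end-1)] - I;
        dE = [I(:,2:end), I(:,end)]     - I;
        % Gradient-dependent diffusivity: close to 1 in flat regions,
        % small across strong edges (assumed exponential form)
        cN = exp(-(dN/kappa).^2);
        cS = exp(-(dS/kappa).^2);
        cW = exp(-(dW/kappa).^2);
        cE = exp(-(dE/kappa).^2);
        % Explicit Euler update of the diffusion equation
        I = I + dt*(cN.*dN + cS.*dS + cW.*dW + cE.*dE);
    end
end

A denoised pattern obtained with, for example, pm_denoise(noisy, 50, 30, 0.2) can then be compared with a reference image through the peak signal-to-noise ratio, PSNR = 10*log10(max(ref(:))^2 / mean((ref(:) - den(:)).^2)); the parameter values here are illustrative.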
Subject
General Materials Science