Abstract
Objective. Self-supervised learning methods have been successfully applied to low-dose computed tomography (LDCT) denoising, with the advantage of not requiring labeled data. Conventional self-supervised methods operate only in the image domain, ignoring valuable priors in the sinogram domain. Recently proposed dual-domain methods address this limitation but suffer from blurring artifacts in the reconstructed image due to the inhomogeneous distribution of noise levels in low-dose sinograms.
Approach. To tackle this challenge, this paper proposes SDBDNet, an end-to-end dual-domain self-supervised method for LDCT denoising. With the network designed around the properties of inhomogeneous noise in low-dose sinograms and the principle of moderate sinogram-domain denoising, SDBDNet achieves effective denoising in both domains without introducing blurring artifacts. Specifically, we split the sinogram into two subsets based on the positions of detector cells to generate paired training data with high similarity and independent noise. These sub-sinograms are then restored to their original size using 1D interpolation and learning-based correction. To achieve adaptive and moderate smoothing in the sinogram domain, we integrate DropBlock, a structured regularization technique for convolutional feature maps, into SDBDNet, and apply a weighted average between the denoised sinograms and their noisy counterparts, leading to a well-balanced dual-domain approach.
Main results. Numerical experiments show that our method outperforms popular non-learning and self-supervised learning methods, demonstrating its effectiveness and superior performance.
Significance. While introducing a novel high-performance dual-domain self-supervised LDCT denoising method, this paper also emphasizes and verifies the importance of appropriate sinogram-domain denoising in dual-domain methods, which might inspire future work.
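To make the data-preparation and blending steps described above concrete, the following is a minimal NumPy sketch of splitting a sinogram by detector-cell position, restoring the sub-sinograms with 1D interpolation, and taking a weighted average with the noisy input. The function names and the weight alpha are illustrative assumptions; the learning-based correction and the DropBlock-regularized network from the paper are omitted.

```python
import numpy as np

def split_sinogram(sino):
    """Split a sinogram (views x detector cells) into two sub-sinograms
    taken from even- and odd-indexed detector cells."""
    return sino[:, 0::2], sino[:, 1::2]

def restore_1d(sub_sino, n_cells):
    """Restore a sub-sinogram to the full detector width with 1D linear
    interpolation along the detector axis (per projection view)."""
    src = np.linspace(0.0, 1.0, sub_sino.shape[1])
    dst = np.linspace(0.0, 1.0, n_cells)
    return np.stack([np.interp(dst, src, row) for row in sub_sino])

def blend(denoised_sino, noisy_sino, alpha=0.7):
    """Weighted average of a denoised sinogram and its noisy counterpart,
    giving moderate rather than aggressive sinogram-domain smoothing."""
    return alpha * denoised_sino + (1.0 - alpha) * noisy_sino

# Example: 360 views x 512 detector cells of synthetic noisy data.
sino = np.random.rand(360, 512).astype(np.float32)
s_even, s_odd = split_sinogram(sino)
s_even_full = restore_1d(s_even, sino.shape[1])  # paired training input
s_odd_full = restore_1d(s_odd, sino.shape[1])    # paired training target
moderate = blend(s_even_full, sino)              # stand-in for a denoised output
```

In this sketch the two interpolated sub-sinograms serve as a noisy training pair with highly similar content but independent noise realizations, and the blending step illustrates how the final sinogram-domain output can be kept close to the measured data.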
Funder
Beijing Natural Science Foundation
China Scholarship Council
National Natural Science Foundation of China