Authors:
Mao Yanyan, Chen Chao, Wang Zhenjie, Cheng Dapeng, You Panlu, Huang Xingdan, Zhang Baosheng, Zhao Feng
Abstract
Recently, attention has been drawn toward brain imaging technology in the medical field, in which MRI plays a vital role in the clinical diagnosis and lesion analysis of brain diseases. Different sequences of MR images provide more comprehensive information and help doctors make accurate clinical diagnoses. However, acquiring multiple sequences is particularly costly. Moreover, many image-to-image synthesis methods in the medical field are based on supervised learning and require labeled datasets, which are often difficult to obtain. Therefore, we propose an unsupervised generative adversarial network with adaptive normalization (AN-GAN) for synthesizing T2-weighted MR images from rapidly scanned diffusion-weighted imaging (DWI) MR images. In contrast to existing methods, deep semantic information is extracted from the high-frequency information of the original sequence images and added, as a modality mask vector, to the feature maps of the deconvolution layers. This image fusion operation yields better feature maps and guides the training of the GAN. Furthermore, to prevent the semantic information from being washed out by common normalization layers, we introduce AN, a conditional normalization layer that modulates the activations using the fused feature map. Experimental results show that our method synthesizes T2 images with better perceptual quality and finer detail than other state-of-the-art methods.
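The AN layer described above resembles spatially-adaptive conditional normalization (in the spirit of SPADE): a parameter-free normalization whose per-pixel scale and shift are predicted from the fused feature map rather than learned as fixed affine parameters. The PyTorch sketch below shows one plausible form under that assumption; the class name AdaptiveNorm, the two-channel condition (e.g., a high-frequency map stacked with a modality mask), and all hyperparameters are illustrative and do not come from the paper's implementation.

```python
# Minimal sketch of a SPADE-style adaptive normalization (AN) layer.
# Assumption: the fused (high-frequency + modality-mask) feature map
# conditions a parameter-free BatchNorm via predicted scale and shift.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveNorm(nn.Module):
    def __init__(self, num_features, cond_channels, hidden=128):
        super().__init__()
        # Parameter-free normalization: gamma/beta come from the condition.
        self.norm = nn.BatchNorm2d(num_features, affine=False)
        self.shared = nn.Sequential(
            nn.Conv2d(cond_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.gamma = nn.Conv2d(hidden, num_features, kernel_size=3, padding=1)
        self.beta = nn.Conv2d(hidden, num_features, kernel_size=3, padding=1)

    def forward(self, x, fused_features):
        # Resize the fused feature map to the activation's spatial size,
        # then predict a per-pixel scale and shift for the normalized input.
        cond = F.interpolate(fused_features, size=x.shape[2:], mode="nearest")
        cond = self.shared(cond)
        return self.norm(x) * (1 + self.gamma(cond)) + self.beta(cond)

# Usage: modulate a 64-channel decoder activation with a hypothetical
# 2-channel condition (high-frequency map + modality mask).
x = torch.randn(1, 64, 32, 32)
cond = torch.randn(1, 2, 256, 256)
out = AdaptiveNorm(64, cond_channels=2)(x, cond)
print(out.shape)  # torch.Size([1, 64, 32, 32])
```

Making the modulation spatially varying (convolutional gamma and beta rather than scalars) is what lets the layer reinject semantic detail at every decoder resolution, which is consistent with the abstract's claim that plain normalization layers erode that information.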
Cited by: 1 article.