Affiliation:
1. Hiroshima University Graduate School of Science, Department of Biosphere Science
2. Hiroshima University
Abstract
Purpose
To propose a style transfer model for multi-contrast magnetic resonance imaging (MRI) images based on a cycle-consistent generative adversarial network (CycleGAN), and to evaluate the quality of the synthesized images and the prognosis prediction performance for glioblastoma (GBM) patients using radiomics features extracted from them.
Methods
Style transfer models from T1-weighted MRI images (T1w) to T2-weighted MRI images (T2w), and from T2w to T1w, were constructed with CycleGAN using the BraTS dataset and validated on The Cancer Genome Atlas Glioblastoma Multiforme (TCGA-GBM) dataset. Radiomics features were extracted from both real and synthesized images and transformed into rad-scores by least absolute shrinkage and selection operator (LASSO)-Cox regression. Prognostic performance was evaluated with the Kaplan-Meier method.
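To illustrate the final evaluation step, the Kaplan-Meier product-limit estimator can be sketched in plain NumPy. This is a minimal illustrative implementation, not the study's actual analysis code (which would typically use a statistics package); the function name and toy data are hypothetical.

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times:  follow-up time for each patient
    events: 1 if the event (death) was observed, 0 if censored
    Returns (unique event times, survival probability at each).
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    t_uniq = np.unique(times[events == 1])   # distinct observed event times
    surv, s = [], 1.0
    for t in t_uniq:
        n_at_risk = np.sum(times >= t)                    # still under observation at t
        n_events = np.sum((times == t) & (events == 1))   # deaths exactly at t
        s *= 1.0 - n_events / n_at_risk                   # product-limit step
        surv.append(s)
    return t_uniq, np.array(surv)

# toy example: 5 patients, censoring marked with event = 0
t, s = kaplan_meier([5, 8, 12, 20, 20], [1, 1, 0, 1, 0])
```

Comparing two such curves (e.g., good vs. poor rad-score groups) with a log-rank test yields the p-values reported in the Results.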
Results
Between the real and synthesized MRI images, the mutual information (MI), root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and structural similarity index (SSIM) were 0.991 ± 2.10, 2.79 ± 0.16, 40.16 ± 0.38, and 0.995 ± 2.11 for T2w, and 0.992 ± 2.63, 2.49 ± 6.89, 40.51 ± 0.22, and 0.993 ± 3.40 for T1w, respectively. Survival time differed significantly between the good and poor prognosis groups for both real and synthesized T2w (p < 0.05), whereas no significant difference was found between the groups for either real or synthesized T1w. In addition, there was no significant difference between real and synthesized T2w within either prognosis group, and the same held for real versus synthesized T1w.
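The image quality metrics above follow standard definitions; a NumPy-only sketch is shown below. Note the hedges: SSIM here is the single-window (global) form rather than the usual sliding-window variant, and MI is computed from a joint histogram in bits, so values may differ slightly from library implementations such as scikit-image's.

```python
import numpy as np

def rmse(a, b):
    """Root mean square error between two images."""
    return np.sqrt(np.mean((a - b) ** 2))

def psnr(a, b, data_range=255.0):
    """Peak signal-to-noise ratio in dB (assumes 8-bit range by default)."""
    return 20.0 * np.log10(data_range / rmse(a, b))

def mutual_information(a, b, bins=64):
    """MI in bits, estimated from the joint intensity histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                              # avoid log(0)
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

def ssim_global(a, b, data_range=255.0):
    """Single-window (global) SSIM; windowed SSIM averages this over patches."""
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    va, vb = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
           ((mu_a ** 2 + mu_b ** 2 + c1) * (va + vb + c2))
```

For a real image and its perfect copy, RMSE is 0, SSIM is 1, and MI equals the image's own entropy, which is why values near these limits indicate faithful synthesis.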
Conclusions
The synthesized images were found to be usable for prognosis prediction. The proposed prognostic model using CycleGAN could reduce the cost and time of image scanning, facilitating patient outcome prediction with multi-contrast images.
Publisher
Research Square Platform LLC