Abstract
Background
Cine Displacement Encoding with Stimulated Echoes (DENSE) facilitates the quantification of myocardial deformation by encoding tissue displacements in the cardiovascular magnetic resonance (CMR) image phase, from which myocardial strain can be estimated with high accuracy and reproducibility. Current methods for analyzing DENSE images still rely heavily on user input, making the process time-consuming and subject to inter-observer variability. Because purely spatial networks often fail on DENSE images owing to their contrast properties, the present study sought to develop a spatio-temporal deep learning model for segmentation of the left-ventricular (LV) myocardium.
Methods
2D + time nnU-Net-based models were trained to segment the LV myocardium from DENSE magnitude data in short- and long-axis images. A dataset of 360 short-axis and 124 long-axis slices, drawn from a combination of healthy subjects and patients with various conditions (hypertrophic and dilated cardiomyopathy, myocardial infarction, myocarditis), was used to train the networks. Segmentation performance was evaluated against ground-truth manual labels, and a strain analysis using conventional methods was performed to assess agreement of strain values with manual segmentation. Additional validation was performed on an externally acquired dataset to compare inter- and intra-scanner reproducibility against conventional methods.
Results
Spatio-temporal models gave consistent segmentation performance throughout the cine sequence, while 2D architectures often failed to segment end-diastolic frames due to the limited blood-to-myocardium contrast. Our models achieved a Dice score of 0.83 ± 0.05 and a Hausdorff distance of 4.0 ± 1.1 mm for short-axis segmentation, and 0.82 ± 0.03 and 7.9 ± 3.9 mm, respectively, for long-axis segmentation. Strain measurements obtained from automatically estimated myocardial contours showed good to excellent agreement with manual pipelines and remained within the limits of inter-user variability estimated in previous studies.
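The two segmentation metrics reported above are standard overlap and boundary measures. As a minimal sketch (not the study's actual evaluation code), the Dice score and symmetric Hausdorff distance between two binary masks can be computed as follows; the function names, the toy square masks, and the isotropic `spacing_mm` parameter are illustrative assumptions:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_score(a, b):
    """Dice overlap: 2|A∩B| / (|A| + |B|) for binary masks a, b."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff_mm(a, b, spacing_mm=1.0):
    """Symmetric Hausdorff distance between the foreground voxels of two
    binary masks, scaled by an (assumed isotropic) pixel spacing in mm."""
    pa = np.argwhere(a) * spacing_mm
    pb = np.argwhere(b) * spacing_mm
    # directed_hausdorff returns (distance, index_a, index_b); take the
    # maximum of both directions for the symmetric distance.
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

# Toy example: two overlapping square "myocardium" masks, one voxel apart.
m1 = np.zeros((16, 16), dtype=bool); m1[4:12, 4:12] = True
m2 = np.zeros((16, 16), dtype=bool); m2[5:13, 5:13] = True
print(dice_score(m1, m2))   # 2*49 / (64 + 64) = 0.765625
print(hausdorff_mm(m1, m2)) # sqrt(2), the diagonal one-voxel shift
```

Note that in practice Hausdorff distances for segmentation are usually computed on contour points rather than all foreground voxels, and with anisotropic voxel spacing taken from the image header; the all-voxel, isotropic version above is a simplification.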
Conclusion
Spatio-temporal deep learning shows increased robustness for the segmentation of cine DENSE images. It provides excellent agreement with manual segmentation for strain extraction. Deep learning will facilitate the analysis of DENSE data, bringing it one step closer to clinical routine.
Funder
EPSRC Centre for Doctoral Training in Medical Imaging
Siemens Healthineers
British Heart Foundation
National Institute for Health and Care Research
Centre For Medical Engineering, King’s College London
Publisher
Springer Science and Business Media LLC
Subject
Cardiology and Cardiovascular Medicine,Radiology, Nuclear Medicine and imaging,Radiological and Ultrasound Technology
Cited by 10 articles.