Abstract
Purpose
To develop an automated, machine learning-based classification system that distinguishes clinically unaffected eyes of patients with keratoconus from eyes of a normal control population, based on a combination of Scheimpflug camera images and ultra-high-resolution optical coherence tomography (UHR-OCT) imaging data.
Methods
A total of 121 eyes from 121 participants were classified by 2 cornea experts into 3 groups: normal (50 eyes), keratoconus (38 eyes), or subclinical keratoconus (33 eyes). All eyes were imaged with a Scheimpflug camera and UHR-OCT, and corneal morphological features were extracted from the imaging data. A neural network was trained on these features to distinguish eyes with subclinical keratoconus from normal eyes. Fisher's score was used to rank the discriminative power of each feature, and receiver operating characteristic (ROC) curves were calculated to obtain the areas under the ROC curves (AUCs).
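The following is a minimal sketch of the type of pipeline described above, not the authors' code. It assumes the extracted Scheimpflug and UHR-OCT features are available as a tabular matrix X (one row per eye) with binary labels y (0 = normal, 1 = subclinical keratoconus), uses the standard two-class Fisher score for feature ranking, and approximates the neural network with scikit-learn's MLPClassifier; the synthetic data and feature count are placeholders.

```python
# Hedged sketch of the analysis pipeline: Fisher-score feature ranking,
# a small neural network classifier, and ROC AUC evaluation.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

def fisher_score(X, y):
    """Two-class Fisher score per feature: (mu1 - mu0)^2 / (var1 + var0)."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X1.mean(axis=0) - X0.mean(axis=0)) ** 2
    den = X1.var(axis=0) + X0.var(axis=0)
    return num / np.maximum(den, 1e-12)

# Synthetic placeholder data: 50 normal eyes vs. 33 subclinical eyes,
# each described by 20 hypothetical Scheimpflug + UHR-OCT features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 20)),
               rng.normal(0.4, 1.0, (33, 20))])
y = np.array([0] * 50 + [1] * 33)

# Rank features by discriminative power (highest Fisher score first).
ranking = np.argsort(fisher_score(X, y))[::-1]
print("Top 5 features by Fisher score:", ranking[:5])

# Train a small neural network on the combined feature set and report AUC.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)
scaler = StandardScaler().fit(X_tr)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)
probs = clf.predict_proba(scaler.transform(X_te))[:, 1]
print("ROC AUC:", roc_auc_score(y_te, probs))
```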
Results
The classification model combining all features from the Scheimpflug camera and UHR-OCT markedly improved the discriminative power for distinguishing eyes with subclinical keratoconus from normal eyes (AUC = 0.93). The within-individual variation of the corneal epithelial thickness profile extracted from UHR-OCT imaging ranked highest in differentiating eyes with subclinical keratoconus from normal eyes.
Conclusion
The automated machine learning classification system based on the combination of Scheimpflug camera data and UHR-OCT imaging data showed excellent performance in discriminating eyes with subclinical keratoconus from normal eyes. The epithelial features extracted from the OCT images were the most valuable in the discrimination. This classification system has the potential to improve the detection of subclinical keratoconus and the efficiency of keratoconus screening.
Publisher
Springer Science and Business Media LLC
Cited by
44 articles.