Abstract
The human tongue has long been regarded in medicine as a window into a patient's health. The present study introduced a novel approach to inferring patient age, gender, and weight from tongue images using pretrained deep convolutional neural networks (CNNs). Our results demonstrated that deep CNN models (e.g., ResNeXt) trained on dorsal tongue images produced excellent results for age prediction, with a Pearson correlation coefficient of 0.71 and a mean absolute error (MAE) of 8.5 years. We also obtained excellent gender classification performance, with a mean accuracy of 80% and an AUC (area under the receiver operating characteristic curve) of 88%. The ResNeXt model also achieved moderate accuracy for weight prediction, with a Pearson correlation coefficient of 0.39 and an MAE of 9.06 kg. These findings support our hypothesis that the human tongue contains crucial information about a patient. This study demonstrated the feasibility of using pretrained deep CNNs together with a large tongue image dataset to develop computational models that predict patient medical conditions, enabling noninvasive, convenient, and inexpensive patient health monitoring and diagnosis.
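To make the general approach concrete, the sketch below shows one way a pretrained ResNeXt backbone could be fine-tuned for age regression from tongue images in PyTorch. The dataset file (`tongue_ages.csv`), its column names, and the hyperparameters are illustrative assumptions, not the exact configuration used in this study.

```python
# Minimal sketch: fine-tuning a pretrained ResNeXt for age regression from
# tongue images. File names, CSV columns, and hyperparameters are assumptions
# for illustration only.
import pandas as pd
import torch
import torch.nn as nn
from PIL import Image
from torch.utils.data import Dataset, DataLoader
from torchvision import models, transforms

class TongueAgeDataset(Dataset):
    """Loads (dorsal tongue image, age) pairs listed in a CSV file."""
    def __init__(self, csv_path, transform):
        self.records = pd.read_csv(csv_path)   # assumed columns: image_path, age
        self.transform = transform

    def __len__(self):
        return len(self.records)

    def __getitem__(self, idx):
        row = self.records.iloc[idx]
        image = Image.open(row["image_path"]).convert("RGB")
        age = torch.tensor([row["age"]], dtype=torch.float32)
        return self.transform(image), age

# Standard ImageNet preprocessing so inputs match the pretrained weights.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Pretrained ResNeXt-50 backbone; replace the ImageNet classifier with a
# single-output regression head that predicts age in years.
model = models.resnext50_32x4d(weights=models.ResNeXt50_32X4D_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

loader = DataLoader(TongueAgeDataset("tongue_ages.csv", preprocess),
                    batch_size=32, shuffle=True)
criterion = nn.L1Loss()   # directly optimizes mean absolute error
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):
    for images, ages in loader:
        images, ages = images.to(device), ages.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), ages)
        loss.backward()
        optimizer.step()
```

Gender classification and weight regression would follow the same pattern, swapping the regression head and loss (e.g., a two-class output with cross-entropy for gender); the evaluation metrics reported above (Pearson correlation, MAE, accuracy, AUC) are computed on held-out data.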
Funder
Key Research and Development Program of Zhejiang Province