Affiliations:
1. HautAI OÜ, Tallinn, Estonia
2. James L. Winkle College of Pharmacy, University of Cincinnati, Cincinnati, Ohio, USA
Abstract
Predicting a person's chronological age (CA) from visible skin features using artificial intelligence (AI) is now commonplace. Often, convolutional neural network (CNN) models are built using images of the face as biometric data. However, hands also hold telltale signs of a person's age. To determine the utility of hand images alone for predicting CA, we developed two deep CNNs based on (1) dorsal hand images (H) and (2) frontal face images (F). Subjects (n = 1454) were Indian women, aged 20–80 years, from three geographic cohorts (Mumbai, New Delhi and Bangalore) and with a broad variation in skin tones. Images were randomised: 70% of F and 70% of H were used to train the CNNs, and the remaining 30% of each set was retained for validation. Validation showed mean absolute errors for predicting CA of 4.1 years (F) and 4.7 years (H). In both cases, the correlation between predicted and actual age was statistically significant (r(F) = 0.93, r(H) = 0.90). The CNNs for F and H were validated for both dark and light skin tones. Finally, by blurring or accentuating visible features in specific regions of the hand and face, we identified the features that contributed most to the CNN predictions. For the face, the inner eye corners and the area around the mouth were most important for age prediction; for the hands, knuckle texture was the key driver. Collectively, for AI estimates of CA, CNNs based solely on hand images are a viable alternative to, and comparable with, CNNs based on facial images.
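The abstract outlines the evaluation protocol (randomised 70/30 train/validation split, mean absolute error, Pearson correlation) and a region-blurring analysis of feature importance, but does not publish the framework, architecture, or code. The sketch below is a minimal illustration of that protocol under assumed choices: TensorFlow/Keras, a small generic CNN regressor, and hypothetical inputs `images` and `ages`; none of these names or settings come from the paper.

```python
# Minimal sketch of the protocol described in the abstract: train a CNN age
# regressor on a random 70% of the images, validate on the remaining 30%,
# and score with mean absolute error (MAE) and Pearson's r. The framework,
# architecture, and data loading are assumptions for illustration only.
import numpy as np
import tensorflow as tf
from scipy.ndimage import gaussian_filter
from scipy.stats import pearsonr
from sklearn.model_selection import train_test_split


def build_cnn(input_shape=(224, 224, 3)):
    """A small generic CNN regressor; the study's actual architecture is not specified."""
    return tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),  # predicted chronological age in years
    ])


def evaluate_age_cnn(images, ages):
    """Randomised 70/30 split; report MAE and Pearson r on the held-out 30%."""
    x_train, x_val, y_train, y_val = train_test_split(
        images, ages, test_size=0.30, random_state=0
    )
    model = build_cnn(input_shape=images.shape[1:])
    model.compile(optimizer="adam", loss="mae")
    model.fit(x_train, y_train, epochs=20, batch_size=32, verbose=0)

    y_pred = model.predict(x_val, verbose=0).ravel()
    mae = float(np.mean(np.abs(y_pred - y_val)))
    r, _ = pearsonr(y_pred, y_val)
    return model, mae, r


def region_importance(model, image, box, sigma=5):
    """Blur one rectangular region (box = (y0, y1, x0, x1)) and measure the
    shift in predicted age, in the spirit of the region-blurring analysis
    described in the abstract. A larger shift suggests a more important region."""
    y0, y1, x0, x1 = box
    blurred = image.copy()
    blurred[y0:y1, x0:x1] = gaussian_filter(
        image[y0:y1, x0:x1], sigma=(sigma, sigma, 0)
    )
    base = model.predict(image[None], verbose=0)[0, 0]
    perturbed = model.predict(blurred[None], verbose=0)[0, 0]
    return abs(perturbed - base)


# Hypothetical usage: `images` is an array of face (F) or dorsal-hand (H)
# photographs and `ages` the corresponding chronological ages.
# model, mae, r = evaluate_age_cnn(images, ages)
# knuckle_score = region_importance(model, images[0], box=(100, 160, 60, 200))
```

The same routine would be run separately on the F and H image sets to obtain the per-modality MAE and correlation values reported in the abstract; the example region box is illustrative, not a coordinate used in the study.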