Affiliation:
1. School of Mathematics, Statistics and Computer Science, University of KwaZulu-Natal, Westville, Durban 4000, South Africa
Abstract
Age and gender prediction of unfiltered faces classifies unconstrained real-world facial images into predefined age groups and gender classes. Significant improvements have been made in this research area owing to its usefulness in intelligent real-world applications. However, traditional methods evaluated on unfiltered benchmarks have proven unable to handle the large degree of variation in such unconstrained images. More recently, methods based on Convolutional Neural Networks (CNNs) have been used extensively for this classification task because of their excellent performance in facial analysis. In this work, we propose a novel end-to-end CNN approach that achieves robust age group and gender classification of unfiltered real-world faces. The two-level CNN architecture comprises feature extraction and classification: the feature extraction stage extracts features corresponding to age and gender, while the classification stage assigns each face image to the correct age group and gender. In particular, we address the large variation in unfiltered real-world faces with a robust image preprocessing algorithm that prepares and processes the faces before they are fed into the CNN model. Technically, our network is pretrained on the IMDb-WIKI dataset with noisy labels, then fine-tuned on MORPH-II, and finally on the training set of the original OIU-Adience dataset. When analyzed for classification accuracy on the OIU-Adience benchmark, the experimental results show that our model achieves state-of-the-art performance in both age group and gender classification, improving over the best previously reported results by 16.6% (exact accuracy) and 3.2% (one-off accuracy) for age group classification, and by 3.0% (exact accuracy) for gender classification.
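The abstract describes a two-level design: a shared feature-extraction stage followed by classification into age groups and gender classes. Below is a minimal PyTorch sketch of how such a pipeline might be laid out; the layer widths, head sizes, input resolution, and the choice of 8 age groups and 2 gender classes (as in OIU-Adience) are illustrative assumptions, not the authors' reported architecture.

```python
# Sketch of a two-stage CNN for age-group and gender classification.
# Stage 1 extracts shared convolutional features from a preprocessed
# (aligned, cropped) face; stage 2 classifies into age group and gender.
# All layer sizes are hypothetical and for illustration only.
import torch
import torch.nn as nn

class AgeGenderCNN(nn.Module):
    def __init__(self, num_age_groups: int = 8, num_genders: int = 2):
        super().__init__()
        # Stage 1: feature extraction shared by both tasks.
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(128, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        # Stage 2: separate classification heads for age group and gender.
        self.age_head = nn.Linear(256 * 4 * 4, num_age_groups)
        self.gender_head = nn.Linear(256 * 4 * 4, num_genders)

    def forward(self, x: torch.Tensor):
        feats = self.features(x).flatten(1)
        return self.age_head(feats), self.gender_head(feats)

# A batch of preprocessed face crops, e.g. 224x224 RGB images.
model = AgeGenderCNN()
age_logits, gender_logits = model(torch.randn(4, 3, 224, 224))
```

In practice, the staged training the abstract describes (pretraining on IMDb-WIKI, then fine-tuning on MORPH-II and OIU-Adience) would reuse this backbone while the heads are retrained on each target label set.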
Subject
General Environmental Science, General Biochemistry, Genetics and Molecular Biology, General Medicine
Cited by
41 articles.