Affiliation:
1. School of Geosciences, Yangtze University, Wuhan, Hubei, China
2. State Key Laboratory of Geo-Information Engineering, Xi’an, Shaanxi, China
Abstract
Research in Facial Expression Recognition (FER) aims to build models that are both robust and highly discriminative. In this paper, we propose a new scheme for FER systems based on a convolutional neural network. Part of the regular convolution operations are replaced by depthwise separable convolutions to reduce the number of parameters and the computational workload, and a self-adaptive joint loss function is adopted to improve classification performance. In addition, we balance our training set through data augmentation, and we preprocess the input images with illumination normalization, face detection, and other methods, effectively maximizing the expression recognition rate. Experiments validating our methods are conducted on the TensorFlow platform with the Fer2013 dataset. We analyze the experimental results before and after training-set balancing and network-model modification, and we compare our results with those of other researchers. The results show that our method is effective at increasing the expression recognition rate under the same experimental conditions. A further experiment on our own expression dataset, relevant to driving safety, yields similar results.
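The parameter saving claimed for depthwise separable convolution can be illustrated with a simple count: a standard k×k convolution over C_in input and C_out output channels uses k·k·C_in·C_out weights, whereas a depthwise separable convolution uses k·k·C_in (depthwise) plus C_in·C_out (1×1 pointwise) weights. A minimal sketch, where the layer sizes are illustrative assumptions and not the paper's actual architecture:

```python
def conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution (bias terms omitted)."""
    return k * k * c_in * c_out

def separable_conv_params(k, c_in, c_out):
    """Depthwise k x k filter per input channel, then a 1 x 1 pointwise conv."""
    depthwise = k * k * c_in
    pointwise = c_in * c_out
    return depthwise + pointwise

# Hypothetical layer: 3 x 3 kernel, 64 input channels, 128 output channels.
regular = conv_params(3, 64, 128)              # 73728 weights
separable = separable_conv_params(3, 64, 128)  # 8768 weights
print(regular, separable, round(regular / separable, 1))  # 73728 8768 8.4
```

For this layer shape the separable form needs roughly 8× fewer weights, which is the source of the reduced computational workload the abstract refers to.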
Subject
Artificial Intelligence, Computer Vision and Pattern Recognition, Theoretical Computer Science
Cited by: 6 articles.