Affiliation:
1. Shenyang Ligong University, Shenyang 110168, China
2. Key Laboratory of Information Network and Information Countermeasure Technology of Liaoning Province, Shenyang Ligong University, Shenyang 110168, China
Abstract
Current deep learning-based facial expression recognition has mainly focused on the six basic human emotions and relies on large-scale, well-annotated data. For complex emotion recognition, such a large amount of data is not easy to obtain, and high-quality annotation is even more difficult. Therefore, in this paper, we regard complex emotion recognition via facial expressions as a few-shot learning problem and introduce a metric-based few-shot model named self-cure relation networks (SCRNet), which is robust to label noise and able to classify facial images of new emotion classes from only a few examples of each. Specifically, SCRNet learns a distance metric based on deep features extracted by convolutional neural networks and predicts a query image's emotion category by computing relation scores between the query image and the few examples of each new class. To tackle the label noise problem, SCRNet corrects the labels of noisy data via class prototypes stored in an external memory during the meta-training phase. Experiments on public datasets as well as on synthetic noise datasets demonstrate the effectiveness of our method.
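The abstract describes the method only at a high level. Below is a minimal sketch of the two ideas it names: computing relation scores between a query image and the support examples of each class, and correcting noisy labels when another class's prototype matches a sample far better than its given label. The module names (`EmbeddingNet`, `RelationModule`), layer sizes, and the correction threshold are illustrative assumptions, not the paper's actual architecture.

```python
# Sketch of a relation-network-style few-shot classifier with
# prototype-based label correction. All architectural details here
# (channel counts, threshold) are assumptions for illustration.
import torch
import torch.nn as nn

class EmbeddingNet(nn.Module):
    """Small CNN mapping images to feature maps (hypothetical sizes)."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
        )

    def forward(self, x):
        return self.conv(x)

class RelationModule(nn.Module):
    """Scores a concatenated (query, class) feature pair in [0, 1]."""
    def __init__(self, feat_dim=64, hidden=8):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(2 * feat_dim, feat_dim, 3, padding=1),
            nn.BatchNorm2d(feat_dim), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, pair):
        return self.fc(self.conv(pair).flatten(1))

def relation_scores(embed, relate, support, support_labels, query, n_way):
    """Relation score of each query against each class's mean support feature."""
    s_feat = embed(support)  # (n_way * k_shot, C, H, W)
    q_feat = embed(query)    # (n_query, C, H, W)
    # Class prototype: mean of that class's support features.
    protos = torch.stack([s_feat[support_labels == c].mean(0)
                          for c in range(n_way)])  # (n_way, C, H, W)
    n_query = q_feat.size(0)
    # Pair every query with every prototype, then score the pairs.
    q_ext = q_feat.unsqueeze(1).expand(-1, n_way, -1, -1, -1)
    p_ext = protos.unsqueeze(0).expand(n_query, -1, -1, -1, -1)
    pairs = torch.cat([q_ext, p_ext], dim=2).flatten(0, 1)
    return relate(pairs).view(n_query, n_way)

def correct_labels(scores, labels, threshold=0.6):
    """Relabel a sample when some class outscores its given label by a
    margin (sketch of the prototype-based label-correction idea)."""
    best_score, best_class = scores.max(dim=1)
    given_score = scores.gather(1, labels.unsqueeze(1)).squeeze(1)
    noisy = best_score - given_score > threshold
    return torch.where(noisy, best_class, labels)
```

In this sketch the prototype doubles as the per-class representative for both scoring and correction; the paper's external-memory mechanism for storing prototypes across meta-training episodes is not modeled here.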
Funder
Shenyang Ligong University
Subject
General Mathematics, General Medicine, General Neuroscience, General Computer Science
Cited by: 6 articles.