Py-Feat: Python Facial Expression Analysis Toolbox
Published: 2023-08-08
Issue: 4
Volume: 4
Pages: 781-796
ISSN: 2662-2041
Container title: Affective Science
Language: en
Short container title: Affec Sci
Authors:
Cheong Jin Hyun, Jolly Eshin, Xie Tiankang, Byrne Sophie, Kenney Matthew, Chang Luke J.
Abstract
Studying facial expressions is a notoriously difficult endeavor. Recent advances in the field of affective computing have yielded impressive progress in automatically detecting facial expressions from pictures and videos. However, much of this work has yet to be widely disseminated in social science domains such as psychology. Current state-of-the-art models require considerable domain expertise that is not traditionally incorporated into social science training programs. Furthermore, there is a notable absence of user-friendly and open-source software that provides a comprehensive set of tools and functions to support facial expression research. In this paper, we introduce Py-Feat, an open-source Python toolbox that provides support for detecting, preprocessing, analyzing, and visualizing facial expression data. Py-Feat makes it easy for domain experts to disseminate and benchmark computer vision models, and for end users to quickly process, analyze, and visualize facial expression data. We hope this platform will facilitate increased use of facial expression data in human behavior research.
Funders:
National Institute of Mental Health, National Science Foundation
Publisher:
Springer Science and Business Media LLC
Subjects:
General Earth and Planetary Sciences, General Environmental Science
Cited by: 4 articles