Authors:
Sanjay S Tippannavar, Yashwanth S D, Puneeth K M, Madhu Sudan M P, Chandrashekar Murthy B N, Eshwari A Madappa
Abstract
It is difficult to tell what a person is feeling simply by glancing at their face or observing their behaviour. The ability to decipher nonverbal cues from body language and facial expressions is a basic human faculty, essential for social and everyday communication. People communicate with one another through voice, gestures, and emotions, so there is strong demand across industries for systems that can recognize emotions automatically. In artificial intelligence, a computer that can recognize and interpret human emotions can engage with people far more naturally. A number of methods have been proposed for evaluating human emotion. Traditional techniques rely on visual and auditory cues, including speech, body language, and facial expressions, to model human emotional reactions. Characterizing emotional states through physiological responses has garnered increasing attention in recent times. Rapid advances in technology should make it possible for sophisticated and perceptive human-computer interaction (HCI) systems to take the emotional states of users into account during interactions, promoting empathy between humans and machines. Intelligent HCI applications, including virtual reality, video games, and educational systems, require the ability to recognize emotions. In the medical domain, the emotions people recognize in one another can indicate specific functional disorders, such as severe depression. The primary goal of this review is to examine methods for identifying emotions using five distinct approaches, to rank and explain the best methods along with their benefits, and to provide commentary. In an effort to improve human-computer interaction, this article is intended as a resource for academics and students researching emotion detection.
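To make the visual-cue approaches mentioned in the abstract concrete, the sketch below shows one possible facial-expression classifier: a small convolutional network that maps preprocessed 48x48 grayscale face crops to seven basic emotion labels (the label set used by datasets such as FER-2013). This is a minimal, hypothetical illustration, not a method from the paper; the architecture, class names, input size, and hyperparameters are all assumptions.

# Illustrative sketch only: a small CNN for facial-expression-based emotion
# recognition. Architecture and label set are assumptions, not the methods
# surveyed in this review.
import torch
import torch.nn as nn

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN()
    face = torch.randn(1, 1, 48, 48)               # one preprocessed face crop
    probs = torch.softmax(model(face), dim=1)
    print(EMOTIONS[int(probs.argmax())])           # predicted emotion label

In practice such a classifier would be preceded by a face-detection and cropping step and trained on a labelled facial-expression dataset before its predictions carry any meaning; speech-based and physiological approaches would use entirely different front-ends.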
Publisher
Inventive Research Organization
Cited by
1 article.