Abstract
Purpose
One of the contributions of artificial intelligence (AI) to modern technology is emotion recognition, which is mostly based on facial expression and modification of the inference engine. Facial recognition schemes are typically built to understand user expressions on online business and marketing web pages, but they have limited ability to recognise elusive expressions. Basic emotions are expressed when interacting and socialising with other people online. Understanding user expressions, especially subtle ones, is often a tedious task. An emotion recognition system can reduce the complexity of understanding users' subconscious thoughts and reasoning through their pupil changes.
Design/methodology/approach
This paper demonstrates the use of a personal computer (PC) webcam to read in eye movement data, including pupil changes, as part of distinct user attributes. A custom eye movement algorithm (CEMA) captures user activity and records the data, which serves as input to an inference engine (an artificial neural network (ANN)) that predicts the user's emotional response, conveyed as emoticons on the web page.
Findings
The performance error results show that the ANN is highly adaptable to user behaviour prediction and can be used for the system's modification paradigm.
Research limitations/implications
One drawback of the analytical tool is its inability, in some cases, to place some of the emoticons within the boundaries of the visual field; this limitation is to be tackled in subsequent runs with standard techniques.
Originality/value
The originality of the proposed model lies in its ability to predict basic user emotional responses from changes in pupil size relative to average recorded baseline boundaries, and to convey the emoticons chronologically with the gaze points.
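The pipeline described above (pupil diameters normalised against recorded baseline boundaries, then fed to an ANN that emits an emoticon label) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the emotion labels, feature choices, and network weights are all hypothetical placeholders, and the weights here are untrained.

```python
import numpy as np

# Hypothetical emotion labels; the paper's actual emoticon set is not specified here.
EMOTICONS = ["neutral", "interested", "stressed"]

def pupil_features(pupil_diameters, baseline_low, baseline_high):
    """Normalise a window of pupil diameters (mm) against the user's
    average recorded baseline boundaries, as the abstract describes."""
    d = np.asarray(pupil_diameters, dtype=float)
    rel = (d - baseline_low) / (baseline_high - baseline_low)  # 0..1 inside band
    # Simple summary features: mean level, variability, and range of dilation.
    return np.array([rel.mean(), rel.std(), rel.max() - rel.min()])

def predict_emotion(features, W1, b1, W2, b2):
    """One-hidden-layer feed-forward ANN: features -> emoticon label."""
    h = np.tanh(features @ W1 + b1)
    logits = h @ W2 + b2
    return EMOTICONS[int(np.argmax(logits))]

# Untrained placeholder weights (fixed seed for reproducibility); a real system
# would train these on labelled pupil-response data.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)), np.zeros(3)

feats = pupil_features([3.1, 3.3, 3.2, 3.4], baseline_low=3.0, baseline_high=4.0)
label = predict_emotion(feats, W1, b1, W2, b2)
print(label)  # some label from EMOTICONS; arbitrary here since weights are untrained
```

In a full system the predicted label would be rendered as an emoticon at the current gaze point, time-stamped so the emoticons appear chronologically along the gaze path.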