Automatic Neonatal Alertness State Classification Based on Facial Expression Recognition
Published:2022-03-20
Issue:2
Volume:26
Page:188-195
ISSN:1883-8014
Container-title:Journal of Advanced Computational Intelligence and Intelligent Informatics
Language:en
Short-container-title:JACIII
Author:
Morita Kento, Shirai Nobu C., Shinkoda Harumi, Matsumoto Asami, Noguchi Yukari, Shiramizu Masako, Wakabayashi Tetsushi
Abstract
Premature babies are admitted to the neonatal intensive care unit (NICU) for several weeks and are generally placed under close medical supervision. The NICU environment is considered to have an adverse influence on the formation of the neonatal sleep-wake cycle, known as the circadian rhythm, because patient monitoring and treatment equipment emit light and noise throughout the day. To improve the neonatal environment, researchers have investigated the effects of light and noise on neonates. Some methods and devices exist to measure neonatal alertness, but they place an additional burden on neonatal patients or nurses. Therefore, this study proposes an automatic, non-contact neonatal alertness state classification method using video images. The proposed method consists of a face region of interest (ROI) location normalization method, feature extraction methods based on the histogram of oriented gradients (HOG) and gradient features, and a neonatal alertness state classification method using machine learning. Comparison experiments using 14 video images of 7 neonatal subjects showed that the weighted support vector machine (w-SVM) using the HOG feature and averaging merge achieved the highest classification performance (micro-F1 of 0.732). In clinical practice, body movement is evaluated primarily to classify waking states. Additional 4-class classification experiments, in which the waking states were combined into a single class, suggest that the proposed facial-expression-based classification is suitable for the detailed classification of sleeping states.
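As a rough illustration of the pipeline described in the abstract, the sketch below combines HOG feature extraction on a normalized face ROI, an averaging merge of per-frame descriptors, and a class-weighted SVM evaluated with micro-F1. The ROI size, HOG parameters, and the use of scikit-image/scikit-learn are assumptions made for illustration and are not the authors' actual implementation; in particular, class_weight="balanced" merely stands in for the paper's weighted SVM (w-SVM).

```python
# Minimal sketch of a HOG + weighted-SVM alertness classifier.
# All parameter values below are illustrative assumptions.
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import SVC
from sklearn.metrics import f1_score

ROI_SIZE = (64, 64)  # assumed size of the location-normalized face ROI

def extract_hog(face_roi):
    """Resize a grayscale face ROI to a fixed size and compute its HOG descriptor."""
    roi = resize(face_roi, ROI_SIZE, anti_aliasing=True)
    return hog(roi, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))

def average_merge(frame_descriptors):
    """'Averaging merge': average per-frame HOG descriptors over a video segment."""
    return np.mean(frame_descriptors, axis=0)

def train_and_evaluate(X_train, y_train, X_test, y_test):
    """Train a class-weighted SVM and report micro-F1 on the test set."""
    clf = SVC(kernel="linear", class_weight="balanced")  # stand-in for w-SVM
    clf.fit(X_train, y_train)
    return f1_score(y_test, clf.predict(X_test), average="micro")
```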
Funder
Japan Society for the Promotion of Science
Publisher
Fuji Technology Press Ltd.
Subject
Artificial Intelligence,Computer Vision and Pattern Recognition,Human-Computer Interaction