Affiliation:
1. The University of Alabama at Birmingham
Abstract
Current clinical tools for assessing neonatal pain, including pain scales such as the Neonatal Infant Pain Scale (NIPS) and the Neonatal Pain, Agitation, and Sedation Scale (N-PASS), rely heavily on nurses' subjective observation and analysis. Emerging deep learning approaches seek to fully automate this assessment, but they face challenges including the need for massive training data and computational resources, as well as potential public mistrust. Our study prioritizes facial information for pain detection, as facial muscles exhibit distinct patterns during pain events. This single-camera approach avoids challenges associated with multimodal methods, such as data synchronization, larger training datasets, deployment complexity, and high computational cost. We propose a deep learning-based neonatal pain detection framework that can alert a neonatal pain management team when a pain event occurs. The framework consists of two main components: a transfer learning-based end-to-end pain detection neural network and a manual assessment branch. The proposed neural network requires much less data to train and can determine whether a neonate is in a pain state from facial information alone. The manual assessment branch specifically handles the borderline/hard cases where the pain detection network is less confident. Integrating machine detection with manual evaluation increases the recall of true pain events, reduces manual evaluation effort, and increases public trust in such applications. Experimental results show that our neural network surpasses state-of-the-art algorithms by at least 25% in accuracy on the MNPAD dataset, with overall framework accuracy reaching 82.35% when the manual assessment branch is integrated.
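The routing idea described in the abstract — automatic decisions when the network is confident, deferral to manual assessment on borderline cases — can be sketched as follows. This is a hypothetical illustration only: the function name, the probability-band thresholds, and the decision labels are assumptions for exposition, not values or interfaces from the paper.

```python
def route_prediction(pain_prob: float, low: float = 0.35, high: float = 0.65) -> str:
    """Route one frame's predicted pain probability to a handling decision.

    pain_prob: the network's softmax probability for the "pain" class.
    low/high:  illustrative confidence thresholds; predictions falling
               between them are treated as borderline/hard cases.
    """
    if pain_prob >= high:
        return "alert_pain_team"    # confident pain event: trigger the alert
    if pain_prob <= low:
        return "no_pain"            # confident negative: no action needed
    return "manual_assessment"      # borderline: defer to the nursing team

# Example routing decisions
print(route_prediction(0.92))  # alert_pain_team
print(route_prediction(0.10))  # no_pain
print(route_prediction(0.50))  # manual_assessment
```

Under this scheme, the manual branch only sees the low-confidence slice of predictions, which is how the framework can raise recall on true pain events while keeping the manual workload well below full-time observation.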
Publisher
Research Square Platform LLC