Author:
Curti Nico, Merli Yuri, Zengarini Corrado, Starace Michela, Rapparini Luca, Marcelli Emanuela, Carlini Gianluca, Buschi Daniele, Castellani Gastone C., Piraccini Bianca Maria, Bianchi Tommaso, Giampieri Enrico
Abstract
Many automated approaches have been proposed in the literature to quantify clinically relevant wound features through image processing, aiming to remove human subjectivity and accelerate clinical practice. In this work we present a fully automated image processing pipeline leveraging deep learning and a large wound segmentation dataset to perform wound detection and the subsequent prediction of the Photographic Wound Assessment Tool (PWAT) score, automating the clinical judgement of adequate wound healing. Starting from images acquired by smartphone cameras, a series of textural and morphological features are extracted from the wound areas, aiming to mimic the typical clinical considerations for wound assessment. The extracted features can be easily interpreted by the clinician and allow a quantitative estimation of the PWAT scores. The features extracted from the regions of interest detected by our pre-trained neural network model correctly predict the PWAT scores with a Spearman's correlation coefficient of 0.85 on a set of unseen images. The obtained results agree with the current state of the art and provide a benchmark for future artificial intelligence applications in this research field.
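The abstract describes a pipeline that maps interpretable textural and morphological features of a segmented wound region to PWAT scores and evaluates the prediction with Spearman's correlation. The sketch below illustrates that idea only; it is not the authors' code. The feature choices (GLCM statistics, region properties), the regressor, and the synthetic stand-in data are all assumptions made for illustration.

```python
# Hypothetical sketch: given a grayscale wound crop and its binary segmentation
# mask, compute a few textural/morphological descriptors, fit a regressor that
# maps them to PWAT-like scores, and score a hold-out set with Spearman's rho.
import numpy as np
from scipy.stats import spearmanr
from skimage.feature import graycomatrix, graycoprops
from skimage.measure import label, regionprops
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split


def wound_features(gray_crop, mask):
    """Textural (GLCM) + morphological (region) features of a wound ROI."""
    # Grey-level co-occurrence matrix on an 8-bit quantisation of the crop.
    glcm = graycomatrix((gray_crop * 255).astype(np.uint8),
                        distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
    # Morphology of the largest connected component of the mask.
    region = max(regionprops(label(mask.astype(int))), key=lambda r: r.area)
    return np.array([contrast, homogeneity, region.area,
                     region.eccentricity, region.solidity])


# Synthetic stand-in data: random crops/masks and random PWAT-like targets
# (the PWAT total score ranges from 0 to 32).
rng = np.random.default_rng(0)
X = np.stack([wound_features(rng.random((64, 64)),
                             rng.random((64, 64)) > 0.5) for _ in range(200)])
y = rng.uniform(0, 32, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor().fit(X_tr, y_tr)
rho, _ = spearmanr(y_te, model.predict(X_te))
print(f"Spearman's rho on the hold-out set: {rho:.2f}")
```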
Funder
Alma Mater Studiorum - Università di Bologna
Publisher
Springer Science and Business Media LLC