Extra-abdominal trocar and instrument detection for enhanced surgical workflow understanding
Published: 2024-07-15
ISSN: 1861-6429
Container-title: International Journal of Computer Assisted Radiology and Surgery
Language: en
Short-container-title: Int J CARS
Author: Jurosch Franziska, Wagner Lars, Jell Alissa, Islertas Esra, Wilhelm Dirk, Berlet Maximilian
Abstract
Purpose
Video-based intra-abdominal instrument tracking for laparoscopic surgery is a common research area. However, tracking is only possible for instruments that are actually visible in the laparoscopic image. By using extra-abdominal cameras to detect trocars and classify their occupancy state, additional information about instrument location can be obtained, in particular whether an instrument is still inside the abdomen. This can enhance laparoscopic workflow understanding and enrich existing intra-abdominal solutions.
Methods
A data set of four laparoscopic surgeries was recorded with two time-synchronized extra-abdominal 2D cameras. The preprocessed and annotated data were used to train a deep learning-based architecture consisting of a trocar detection model, a centroid tracker, and a temporal model that provides the occupancy state of all trocars throughout the surgery.
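The abstract describes the pipeline only at this level of detail. As a rough illustration, the sketch below wires the three described stages together, assuming the detector emits per-frame (centroid, occupancy) pairs. Every name here (CentroidTracker, Track, max_distance) is a hypothetical placeholder, and a simple majority vote over recent frames stands in for the authors' temporal model, whose design the abstract does not specify.

```python
# Minimal sketch of the described pipeline: per-frame trocar detection,
# centroid-based tracking, and a temporal model for the occupancy state.
# All names and the majority-vote temporal model are illustrative
# assumptions, not the authors' implementation.
from collections import deque
from dataclasses import dataclass, field
import math


@dataclass
class Track:
    track_id: int
    centroid: tuple  # (x, y) of the last matched detection
    history: deque = field(default_factory=lambda: deque(maxlen=15))

    def occupancy(self) -> bool:
        # Assumed temporal model: a majority vote over the last N frames
        # smooths out per-frame misclassifications of the occupancy state.
        return sum(self.history) > len(self.history) / 2


class CentroidTracker:
    """Associates detections across frames by nearest centroid distance."""

    def __init__(self, max_distance: float = 50.0):
        self.max_distance = max_distance
        self.tracks: dict[int, Track] = {}
        self.next_id = 0

    def update(self, detections: list) -> dict:
        # detections: list of ((x, y), occupied_flag) pairs for one frame,
        # e.g. a trocar detector's output plus a per-detection occupancy score.
        unmatched = list(detections)
        for track in self.tracks.values():
            if not unmatched:
                break
            # Greedy nearest-neighbour matching on centroid distance.
            best = min(unmatched, key=lambda d: math.dist(track.centroid, d[0]))
            if math.dist(track.centroid, best[0]) <= self.max_distance:
                track.centroid = best[0]
                track.history.append(best[1])
                unmatched.remove(best)
        for centroid, occupied in unmatched:  # start new tracks
            t = Track(self.next_id, centroid)
            t.history.append(occupied)
            self.tracks[self.next_id] = t
            self.next_id += 1
        return self.tracks


if __name__ == "__main__":
    tracker = CentroidTracker()
    # Two synthetic frames: one trocar drifting slightly, occupancy flipping.
    for frame in [[((100, 200), True)], [((104, 198), False)]]:
        tracks = tracker.update(frame)
    for tid, tr in tracks.items():
        print(f"trocar {tid}: occupied={tr.occupancy()}")
```

Greedy nearest-centroid matching is plausible here because trocars are essentially static between frames; a production implementation would additionally need to gate new-track creation and handle occlusions.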
Results
The trocar detection model achieves an F1 score of $$95.06 \pm 0.88\%$$. The prediction of the occupancy state yields an F1 score of $$89.29 \pm 5.29\%$$, providing a first step towards enhanced surgical workflow understanding.
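For context, the F1 score is the harmonic mean of precision and recall, and the ± term reports the spread across evaluation folds. A minimal sketch of that computation follows; the per-fold counts are invented placeholders, not the paper's data.

```python
# F1 = 2 * precision * recall / (precision + recall); the per-fold
# (tp, fp, fn) counts below are placeholders, not the paper's results.
from statistics import mean, stdev

def f1(tp: int, fp: int, fn: int) -> float:
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical per-fold counts, e.g. from a leave-one-surgery-out split.
folds = [(95, 4, 6), (92, 7, 8), (97, 3, 4), (90, 10, 9)]
scores = [f1(*f) * 100 for f in folds]
print(f"F1 = {mean(scores):.2f} ± {stdev(scores):.2f} %")
```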
Conclusion
The current method shows promising results for the extra-abdominal tracking of trocars and their occupancy state. Future work includes enlarging the data set and incorporating intra-abdominal imaging to enable an accurate assignment of instruments to trocars.
Funder
Bundesministerium für Bildung und Forschung; Bayerisches Staatsministerium für Wirtschaft, Landesentwicklung und Energie
Publisher
Springer Science and Business Media LLC