Abstract
Purpose
In robotic-assisted minimally invasive surgery, surgeons often use intra-operative ultrasound to visualise endophytic structures and localise resection margins. This must be performed by a highly skilled surgeon. Automating this subtask may reduce the cognitive load for the surgeon and improve patient outcomes.
Methods
We demonstrate vision-based shape sensing of the pneumatically attachable flexible (PAF) rail using colour-dependent image segmentation. The shape-sensing framework is evaluated on known curves ranging from $$r = 30$$ to $$r = 110$$ mm, replicating curvatures found in a human kidney. The sensed shape is then used to inform path planning of a collaborative robot arm paired with an intra-operative ultrasound probe. We execute 15 autonomous ultrasound scans of a tumour-embedded kidney phantom, retrieving viable ultrasound images, and perform seven freehand ultrasound scans for comparison.
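A minimal sketch of the colour-dependent segmentation step is given below. The exact pipeline, rail colour, and thresholds are not specified in this abstract, so the OpenCV-based HSV thresholding, the example bounds, and the centreline extraction are illustrative assumptions only.

```python
import cv2
import numpy as np

def extract_rail_centreline(frame_bgr, hsv_lower=(35, 80, 80), hsv_upper=(85, 255, 255)):
    """Segment a coloured PAF rail by HSV thresholding and return centreline pixels.

    The HSV bounds are placeholders for a green rail; the actual rail colour and
    thresholds would need to be calibrated to the endoscopic camera used.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lower, np.uint8), np.array(hsv_upper, np.uint8))

    # Remove small speckles before extracting the shape.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

    # Crude centreline: mean row index of the mask pixels in each image column.
    centreline = []
    for col in range(mask.shape[1]):
        rows = np.flatnonzero(mask[:, col])
        if rows.size:
            centreline.append((col, rows.mean()))
    return np.array(centreline)  # (N, 2) array of (u, v) pixel coordinates
```

In a pipeline of this kind, the recovered centreline, once reconstructed in 3D, could be sampled into waypoints for the collaborative robot arm to follow with the ultrasound probe.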
Results
The vision-based sensor is shown to have sensing accuracy comparable to that of FBGS-based systems. We find the RMSE of the vision-based shape sensing of the PAF rail, compared with ground truth, to be $$0.4975 \pm 0.4169$$ mm. The ultrasound images acquired by the robot and by the human were evaluated by two independent clinicians. The median score across all criteria for both readers was ‘3—good’ for the human-acquired images and ‘4—very good’ for the robot-acquired images.
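For reference, a per-curve RMSE of this kind can be computed from a point-wise comparison of the sensed shape with the ground-truth curve. The sketch below assumes the two point sets are already registered to a common frame with one-to-one correspondence, and that the reported value is a mean ± standard deviation over repeated curves; the registration and correspondence procedure actually used is not described in the abstract.

```python
import numpy as np

def shape_rmse(sensed_points, ground_truth_points):
    """Root-mean-square error (in mm) between corresponding sensed and ground-truth points."""
    sensed = np.asarray(sensed_points, dtype=float)
    truth = np.asarray(ground_truth_points, dtype=float)
    errors = np.linalg.norm(sensed - truth, axis=1)  # per-point Euclidean error
    return float(np.sqrt(np.mean(errors ** 2)))
```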
Conclusion
We have proposed a framework for autonomous intra-operative ultrasound scanning that uses vision-based shape sensing to inform path planning. The ultrasound images were evaluated by clinicians for sharpness of the image, clarity of the visible structures, and contrast between solid and fluid areas. The clinicians rated the robot-acquired images as superior to the human-acquired images on all metrics. Future work will translate the framework to a da Vinci surgical robot.
Funder
Wellcome / EPSRC Centre for Interventional and Surgical Sciences
Engineering and Physical Sciences Research Council
Royal Academy of Engineering Chair in Emerging Technologies Scheme
Publisher
Springer Science and Business Media LLC