Authors:
Hayashi Yuichiro, Fujii Masazumi, Kajita Yasukazu, Wakabayashi Toshihiko, Mori Kensaku
Abstract
In this paper, we introduce a new concept of surgical navigation that exchanges information interactively between the real and virtual spaces, namely, updating preoperative images using the positional information of surgical tools. Although organs are deformed by operative procedures during surgery, surgical navigation systems usually do not change the reference images acquired before surgery. It is therefore useful to generate deformed reference images as the surgery progresses. We developed a skull base surgery navigation system that updates the preoperative images during surgery. To estimate the resected regions, the proposed system uses the positional information of surgical tools tracked by a surgical navigation system. The system reflects bone removal on the preoperative images by changing their voxel values according to the tracked tool positions. The updated reference images are generated by visualizing the updated preoperative images with a volume rendering method. We evaluated the proposed system on a skull phantom created from CT images with a 3D printer. The experimental results showed that the system updated the reference images in real time during surgical tasks, including the bone removal process. The accuracy of the proposed method was about 1 mm. This capability is useful for surgeons drilling into complex bone structures such as the skull base.
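The abstract describes reflecting bone removal by overwriting voxel values around the tracked tool tip. As a minimal sketch of that idea (not the authors' implementation; the function name, the spherical burr model, and the air Hounsfield value are illustrative assumptions), one could carve a sphere of voxels around the reported tip position in the preoperative CT volume:

```python
import numpy as np

def carve_drill_sphere(volume, spacing, tip_mm, radius_mm, air_hu=-1000):
    """Set voxels within radius_mm of the tool tip to an air-like
    intensity, marking them as removed bone.

    volume    : 3D numpy array of CT intensities (z, y, x order)
    spacing   : voxel size in mm along (z, y, x)
    tip_mm    : tracked tool-tip position in mm, in volume coordinates
    radius_mm : radius of the drill burr in mm
    Returns the number of voxels updated.
    """
    # voxel center coordinates in mm along each axis
    z = np.arange(volume.shape[0]) * spacing[0]
    y = np.arange(volume.shape[1]) * spacing[1]
    x = np.arange(volume.shape[2]) * spacing[2]
    zz, yy, xx = np.meshgrid(z, y, x, indexing="ij")
    # squared distance of every voxel center from the tool tip
    dist2 = (zz - tip_mm[0])**2 + (yy - tip_mm[1])**2 + (xx - tip_mm[2])**2
    mask = dist2 <= radius_mm**2
    volume[mask] = air_hu  # overwrite resected voxels in place
    return int(mask.sum())

# Example: carve a 2 mm burr into a 20^3 phantom volume with 1 mm voxels
vol = np.zeros((20, 20, 20), dtype=np.int16)
n = carve_drill_sphere(vol, (1.0, 1.0, 1.0), (10.0, 10.0, 10.0), 2.0)
```

After each tracker update, the modified volume would simply be re-displayed with the navigation system's volume renderer, so the reference view follows the drilling in real time.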
Publisher
NumFOCUS - Insight Software Consortium (ITK)