Affiliation:
1. Sharif University of Technology; Tehran University of Medical Science, Tehran, Iran
2. Sharif University of Technology, Tehran, Iran
Abstract
Assistant robots are widely used in laparoscopic surgery to facilitate camera holding and manipulation. A variety of hands-free operator interfaces have been implemented for user control of such robots, including voice commands, foot pedals, and eye and head motion tracking systems. This paper proposes a novel user control interface, based on processing of the laparoscopic images, that enables the robot to adjust the view of the laparoscopic camera automatically without disturbing the surgeon’s concentration. An effective marker-free detection method was investigated to track the instrument position in the laparoscopic images in real time so that the robot could center the instrument tip in the camera view. Of the several methods considered, a color space analysis based on a quantitative comparison of the color contexts of the background and instrument pixels was found to provide the best results. The color contexts were represented by mean vectors and covariance matrices and analyzed using the Mahalanobis distance measure in RGB color space. Tests on laparoscopic images under controlled conditions, i.e., sufficient light and low noise, showed 86 percent correct detection at a processing rate of 3.7 frames per second on a conventional PC. Further work to improve the algorithm is ongoing.
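The color-context classification described in the abstract can be illustrated with a minimal sketch: fit a mean vector and covariance matrix to sample instrument pixels in RGB space, then label image pixels whose squared Mahalanobis distance to that model falls below a threshold. This is an assumed reconstruction from the abstract's description, not the authors' implementation; the function names, the threshold value, and the use of NumPy are illustrative choices.

```python
import numpy as np

def fit_color_model(pixels):
    """Fit a mean vector and inverse covariance matrix to
    RGB pixel samples (an N x 3 array of instrument colors)."""
    mean = pixels.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(pixels, rowvar=False))
    return mean, inv_cov

def mahalanobis_mask(image, mean, inv_cov, threshold=3.0):
    """Return a boolean mask of pixels whose Mahalanobis distance
    to the instrument color model is below the threshold."""
    diff = image.reshape(-1, 3).astype(float) - mean
    # squared Mahalanobis distance per pixel: d^T * C^-1 * d
    d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
    return (d2 < threshold ** 2).reshape(image.shape[:2])

# Hypothetical usage: train on sampled instrument pixels, then
# classify a frame; the centroid of the mask would give the tip
# position for centering the camera view.
rng = np.random.default_rng(0)
samples = rng.normal(50.0, 5.0, size=(200, 3))   # gray-ish instrument
mean, inv_cov = fit_color_model(samples)
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 0] = [50, 50, 50]     # instrument-like pixel
frame[1, 1] = [200, 30, 30]    # tissue-like pixel
mask = mahalanobis_mask(frame, mean, inv_cov)
```

In practice the same statistics would also be fitted to background (tissue) pixels, and each pixel assigned to the nearer class, which is closer to the "quantitative comparison" the abstract describes.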
Cited by
3 articles.