Abstract
Fecal trait examinations are critical in the clinical diagnosis of digestive diseases, as they can effectively reveal many aspects of digestive health. An automatic feces detection and trait recognition system based on a visual sensor could greatly alleviate the burden on medical inspectors and avoid sanitation problems such as infection. Unfortunately, the scarcity of digital medical images acquired with camera sensors, owing to patient privacy, has obstructed the development of automated fecal examination. Moreover, the computing power of an automatic fecal diagnosis machine or a mobile computer-aided diagnosis device is not always sufficient to run a deep network. Thus, a lightweight practical framework is proposed, which consists of three stages: illumination normalization, feces detection, and trait recognition. Illumination normalization effectively suppresses the illumination variations that degrade recognition accuracy. Because neither the shape nor the location of a feces object is fixed, shape-based and location-based object detection methods do not work well in this task; this variability also makes it difficult to label images for training convolutional neural networks (CNNs) for detection. Instead, the feces object is accurately detected with a well-designed threshold-based segmentation scheme applied to a selected color component, which reduces background disturbance and requires neither training nor labeling. Finally, the preprocessed images are categorized into five classes with a lightweight shallow CNN, which is suitable for feces trait examination in real hospital environments. Experimental results on our collected dataset demonstrate that the framework yields a satisfactory accuracy of 98.4% while requiring low computational complexity and little storage.
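The training-free detection stage described above can be illustrated with a minimal sketch. This is only an assumption-laden reconstruction, not the paper's actual implementation: it selects a saturation-like color component (the abstract does not state which component is used) and applies Otsu's global thresholding, a common threshold-based segmentation method, implemented here in plain NumPy.

```python
import numpy as np

def otsu_threshold(channel: np.ndarray) -> int:
    """Return Otsu's threshold for an 8-bit single-channel image.

    Chooses the gray level that maximizes between-class variance.
    """
    hist = np.bincount(channel.ravel(), minlength=256).astype(float)
    total = channel.size
    cum_count = np.cumsum(hist)                      # pixels <= t
    cum_mean = np.cumsum(hist * np.arange(256))      # intensity mass <= t
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 = cum_count[t] / total                    # background weight
        w1 = 1.0 - w0                                # foreground weight
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = cum_mean[t] / cum_count[t]
        mu1 = (cum_mean[-1] - cum_mean[t]) / (total - cum_count[t])
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def segment_object(rgb: np.ndarray) -> np.ndarray:
    """Segment a colorful object from a drab background.

    Uses a saturation-like component (max-min over channels, normalized)
    as the selected color component -- an illustrative choice, not
    necessarily the component used in the paper.
    """
    f = rgb.astype(float)
    mx, mn = f.max(axis=2), f.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    chan = (sat * 255).astype(np.uint8)
    return chan > otsu_threshold(chan)               # boolean object mask
```

On a synthetic frame with a saturated object over a gray background, the mask isolates the object region without any training or labeled data, which is the property the abstract emphasizes.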
Funder
National Natural Science Foundation of China
Subject
Electrical and Electronic Engineering, Biochemistry, Instrumentation, Atomic and Molecular Physics, and Optics, Analytical Chemistry
Cited by
27 articles.