Abstract
Background/Aim: Vesicoureteral reflux (VUR) is a condition in which urine flows in reverse, from the bladder back into the ureters and occasionally into the kidneys, making it a significant cause of urinary tract infections. Conventionally, the severity of VUR is evaluated by imaging with voiding cystourethrography (VCUG). However, there is an unresolved debate regarding the precise timing and type of surgery required, making it crucial to classify VUR grades uniformly and accurately. The primary purpose of this study is to leverage machine learning, specifically a convolutional neural network (CNN), to identify and classify VUR in VCUG images. The goal is to reduce classification discrepancies between observers and to create an accessible tool for healthcare practitioners.
Methods: We utilized a dataset of 59 VCUG images with diagnosed VUR sourced from OpenI. These images were independently classified by two experienced urologists according to the International Reflux Classification System. TensorFlow, Keras, and Jupyter Notebook were used for data preparation, segmentation, and model building. The Inception V3 CNN was employed for transfer learning, and data augmentation was applied to improve the model's robustness.
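The abstract does not report implementation details, so the following is a minimal sketch of the described approach: transfer learning with a frozen Inception V3 backbone in Keras, with on-the-fly augmentation. The directory path, image size, head architecture, and hyperparameters are illustrative assumptions, not the authors' actual configuration.

```python
# Sketch of Inception V3 transfer learning with augmentation (Keras).
# Paths, layer sizes, and hyperparameters are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.preprocessing.image import ImageDataGenerator

NUM_GRADES = 5  # International Reflux Classification: grades I-V

# Data augmentation to improve robustness on a small dataset
gen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=15,
    width_shift_range=0.1,
    height_shift_range=0.1,
    zoom_range=0.1,
    validation_split=0.2,
)

train_data = gen.flow_from_directory(
    "vcug_images/",           # hypothetical path: one subfolder per VUR grade
    target_size=(299, 299),   # Inception V3's native input size
    class_mode="categorical",
    subset="training",
)
val_data = gen.flow_from_directory(
    "vcug_images/",
    target_size=(299, 299),
    class_mode="categorical",
    subset="validation",
    shuffle=False,            # fixed order so labels align with predictions later
)

# Frozen ImageNet backbone with a new classification head
base = InceptionV3(weights="imagenet", include_top=False, input_shape=(299, 299, 3))
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_GRADES, activation="softmax"),
])

model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
history = model.fit(train_data, validation_data=val_data, epochs=6)
```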
Results: After six training epochs, the deep-learning model attained 100% training accuracy and 95% validation accuracy, and it categorized VUR grades in accordance with the international classification system. Loss and accuracy were tracked with Matplotlib, and a Python-based statistical analysis assessed the model's performance using the F1-score.
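Continuing from the training sketch above, the evaluation step might look as follows. The abstract says only that loss and accuracy were tracked with Matplotlib and the F1-score was computed in Python; the use of scikit-learn's f1_score here is an assumption.

```python
# Sketch of curve tracking and F1 evaluation; reuses model, history, and
# val_data from the training sketch above. scikit-learn is an assumed dependency.
import matplotlib.pyplot as plt
import numpy as np
from sklearn.metrics import f1_score

# Plot the loss and accuracy curves recorded by Keras during training
plt.plot(history.history["accuracy"], label="training accuracy")
plt.plot(history.history["val_accuracy"], label="validation accuracy")
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["val_loss"], label="validation loss")
plt.xlabel("Epoch")
plt.legend()
plt.show()

# Per-grade and overall F1 on the validation set
val_data.reset()                              # start from the first batch
probs = model.predict(val_data)
y_pred = np.argmax(probs, axis=1)
y_true = val_data.classes                     # valid because shuffle=False above
print(f1_score(y_true, y_pred, average=None))        # one F1 per VUR grade
print(f1_score(y_true, y_pred, average="weighted"))  # overall weighted F1
```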
Conclusion: The study's model effectively categorized VCUG images by vesicoureteral reflux grade, which has significant implications for treatment decisions. Applying this artificial intelligence model may help reduce interobserver bias, and it could offer an objective basis for surgical planning and the assessment of treatment outcomes.