ABUS tumor segmentation via decouple contrastive knowledge distillation
Published: 2023-12-26
Volume: 69, Issue: 1, Page: 015019
ISSN: 0031-9155
Container title: Physics in Medicine & Biology
Short container title: Phys. Med. Biol.
Authors: Pan Pan, Li Yanfeng, Chen Houjin, Sun Jia, Li Xiaoling, Cheng Lin
Abstract
Objective. In recent years, deep learning-based methods have become the mainstream approach to medical image segmentation. Accurate segmentation of tumors in automated breast ultrasound (ABUS) images plays an essential role in computer-aided diagnosis, but existing deep learning models typically require large numbers of computations and parameters. Approach. To address this problem, we propose a novel knowledge distillation method for ABUS tumor segmentation. Tumor and non-tumor regions from different cases tend to have similar representations in the feature space. Based on this observation, we decouple features into positive (tumor) and negative (non-tumor) pairs and design a decoupled contrastive learning method, in which a contrastive loss forces the student network to mimic the tumor and non-tumor features of the teacher network. In addition, we design a ranking loss based on distance ranking in the feature space to address the problem of hard-negative mining in medical image segmentation. Main results. The effectiveness of our knowledge distillation method is evaluated on a private ABUS dataset and a public hippocampus dataset. The experimental results demonstrate that our proposed method achieves state-of-the-art performance in ABUS tumor segmentation. Notably, after distilling knowledge from the teacher network (3D U-Net), the Dice similarity coefficient (DSC) of the student network (small 3D U-Net) improves by 7%. Moreover, the DSC of the student network (3D HR-Net) reaches 0.780, very close to that of the teacher network, while the two student networks have only 6.8% and 12.1% of the parameters of 3D U-Net, respectively. Significance. This research introduces a novel knowledge distillation method for ABUS tumor segmentation that significantly reduces computational demands while achieving state-of-the-art performance. The method promises enhanced accuracy and feasibility for computer-aided diagnosis in diverse imaging scenarios.
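The core idea in the abstract, decoupling voxel features into tumor (positive) and non-tumor (negative) groups and applying a contrastive loss so the student mimics the teacher's features per group, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the prototype pooling, the InfoNCE-style formulation, the function name, and the temperature `tau` are all assumptions for illustration, and the paper's ranking loss is omitted.

```python
import numpy as np

def decoupled_contrastive_kd_loss(student_feat, teacher_feat, mask, tau=0.1):
    """Hypothetical sketch of decoupled contrastive distillation.

    student_feat, teacher_feat: (C, N) arrays of features for N voxels.
    mask: (N,) binary array, 1 = tumor voxel, 0 = non-tumor voxel.
    """
    def prototypes(feat):
        # Decouple: pool features into tumor and non-tumor prototypes.
        pos = feat[:, mask == 1].mean(axis=1)
        neg = feat[:, mask == 0].mean(axis=1)
        return pos / np.linalg.norm(pos), neg / np.linalg.norm(neg)

    s_pos, s_neg = prototypes(student_feat)
    t_pos, t_neg = prototypes(teacher_feat)

    def info_nce(anchor, positive, negative):
        # Pull the student prototype toward the matching teacher prototype,
        # push it away from the opposite-class teacher prototype.
        sims = np.array([anchor @ positive, anchor @ negative]) / tau
        sims -= sims.max()                      # numerical stability
        p = np.exp(sims) / np.exp(sims).sum()
        return -np.log(p[0])                    # cross-entropy toward the positive

    # Decoupled: tumor and non-tumor pairs contribute separate loss terms.
    return info_nce(s_pos, t_pos, t_neg) + info_nce(s_neg, t_neg, t_pos)
```

In practice the features would come from intermediate layers of the 3D teacher and student networks, and this term would be added to the ordinary segmentation loss during student training.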
Funder
National Natural Science Foundation of China
Beijing Natural Science Foundation
Subject
Radiology, Nuclear Medicine and Imaging; Radiological and Ultrasound Technology