Abstract
Optical coherence tomography (OCT) is a promising non-invasive imaging technique with many biomedical applications. In this paper, a deep neural network is proposed for enhancing the spatial resolution of OCT en face images. Different from previous reports, the proposed network can recover high-resolution en face images from low-resolution en face images at arbitrary imaging depth. This imaging-depth-adaptive resolution enhancement is achieved through an external attention mechanism, which takes advantage of the morphological similarity between the arbitrary-depth and full-depth en face images. Firstly, deep feature maps are extracted by a feature extraction network from the arbitrary-depth and full-depth en face images. Secondly, the morphological similarity between the deep feature maps is extracted by the external attention network and used to emphasize the features strongly correlated to the vessel structures. Finally, the super-resolution (SR) image is recovered from the enhanced feature map through an up-sampling network. The proposed network is tested on a clinical skin OCT dataset and an open-access retinal OCT dataset. The results show that the proposed external attention mechanism can suppress invalid features and enhance significant features in our tasks. For all tests, the proposed SR network outperformed traditional image interpolation (e.g., the bicubic method) and state-of-the-art image super-resolution networks, including the enhanced deep super-resolution network, the residual channel attention network, and the second-order attention network. The proposed method may improve the quantitative clinical assessment of micro-vascular diseases, which is currently limited by the resolution of OCT imaging devices.
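The following is a minimal PyTorch sketch of the three-stage pipeline described above: feature extraction from the arbitrary-depth and full-depth en face images, an external-attention-style fusion that reweights the arbitrary-depth features using cues from the full-depth features, and a pixel-shuffle up-sampling head. All module names, channel widths, and the gating form of the attention are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn


class FeatureExtractor(nn.Module):
    """Small stack of residual conv blocks mapping an en face image to deep features."""
    def __init__(self, channels=64, num_blocks=4):
        super().__init__()
        self.head = nn.Conv2d(1, channels, 3, padding=1)
        self.blocks = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1),
            )
            for _ in range(num_blocks)
        ])

    def forward(self, x):
        x = self.head(x)
        for block in self.blocks:
            x = x + block(x)          # residual connection
        return x


class ExternalAttentionFusion(nn.Module):
    """Reweights arbitrary-depth features with a gate computed from the
    concatenated arbitrary-depth and full-depth feature maps (assumed form)."""
    def __init__(self, channels=64):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, feat_depth, feat_full):
        attn = self.gate(torch.cat([feat_depth, feat_full], dim=1))
        return feat_depth * attn      # emphasize vessel-correlated features


class SRNetwork(nn.Module):
    """Full sketch: two extractors, attention fusion, sub-pixel up-sampling."""
    def __init__(self, channels=64, scale=2):
        super().__init__()
        self.extract_depth = FeatureExtractor(channels)
        self.extract_full = FeatureExtractor(channels)
        self.fusion = ExternalAttentionFusion(channels)
        self.upsample = nn.Sequential(
            nn.Conv2d(channels, channels * scale ** 2, 3, padding=1),
            nn.PixelShuffle(scale),   # sub-pixel up-sampling
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, lr_depth, lr_full):
        feat = self.fusion(self.extract_depth(lr_depth),
                           self.extract_full(lr_full))
        return self.upsample(feat)


if __name__ == "__main__":
    net = SRNetwork(scale=2)
    lr_depth = torch.randn(1, 1, 64, 64)   # arbitrary-depth LR en face image
    lr_full = torch.randn(1, 1, 64, 64)    # full-depth LR en face image
    print(net(lr_depth, lr_full).shape)    # torch.Size([1, 1, 128, 128])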
Funder
National Natural Science Foundation of China
CAMS Innovation Fund for Medical Sciences
Guangdong Basic and Applied Basic Research Foundation
Subject
Radiology, Nuclear Medicine and Imaging; Radiological and Ultrasound Technology
Cited by
1 article.