Affiliation:
1. İSKENDERUN TEKNİK ÜNİVERSİTESİ
Abstract
Increasing the representational power of convolutional neural networks, a popular deep learning model, has recently become an active research topic. Channel attention is a common strategy in this regard: the inter-channel relationship is exploited by a module placed after the convolution operation, and several successful channel attention modules have been proposed in this context. In this article, a performance analysis of three popular channel attention structures, Squeeze-and-Excitation Networks (SENet), Efficient Channel Attention Networks (ECA-Net), and the Convolutional Block Attention Module (CBAM), is carried out on five different image datasets for the classification task. According to the obtained results, SENet is the most successful channel attention module, surpassing the performance of the others in the majority of the experiments. In the experiments with the ResNet18 and ResNet34 base models, the SENet module achieved the highest performance on three of the five datasets. With the ResNet50 baseline, SENet was the channel attention module with the highest accuracy values on all datasets.
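For reference, the sketch below shows a minimal SE-style channel attention block in PyTorch, i.e. a module attached after a convolution stage that reweights feature maps using the inter-channel relationship. It is an illustrative sketch only, not the authors' implementation; the class name, the reduction ratio of 16, and the usage line are assumptions.

```python
import torch
import torch.nn as nn

class SqueezeExcitation(nn.Module):
    """Minimal SE-style channel attention block (illustrative sketch)."""
    def __init__(self, channels: int, reduction: int = 16):  # reduction ratio assumed
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)      # squeeze: global average pooling per channel
        self.fc = nn.Sequential(                 # excitation: two-layer bottleneck MLP
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)              # (B, C) channel descriptors
        w = self.fc(w).view(b, c, 1, 1)          # per-channel weights in [0, 1]
        return x * w                             # rescale feature maps channel-wise

# Hypothetical usage after a convolution stage, e.g. inside a ResNet block:
# features = torch.randn(8, 256, 14, 14)
# out = SqueezeExcitation(256)(features)        # same shape as the input, channels reweighted
```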
Publisher
Cukurova Universitesi Muhendislik-Mimarlik Fakultesi Dergisi