Attention-Aware Patch-Based CNN for Blind 360-Degree Image Quality Assessment
Authors:
Sendjasni Abderrezzaq 1, Larabi Mohamed-Chaker 1
Affiliation:
1. CNRS, Université de Poitiers, XLIM, UMR 7252, 86073 Poitiers, France
Abstract
This paper introduces an attention-aware patch-based deep-learning model for blind 360-degree image quality assessment (360-IQA). The model employs spatial attention mechanisms to focus on spatially significant features, together with short skip connections to align them, and a long skip connection that allows features from the earliest layers to be reused at the final stage. Patches are sampled directly on the sphere so that they correspond to the viewports displayed to the user through head-mounted displays. The sampling accounts for patch relevance by considering (i) the user's exploration behavior and (ii) a latitude-based selection. An adaptive strategy improves the pooling of local patch qualities into a global image quality score: an outlier-rejection step based on the standard deviation of the predicted scores enforces agreement among patches, and saliency is used to weight the remaining scores according to their visual significance. Experiments on available 360-IQA databases show that our model outperforms the state of the art in terms of accuracy and generalization ability, whether compared with general deep-learning-based models, multichannel models, or models based on natural scene statistics. Moreover, its computational complexity is significantly lower than that of multichannel models. Finally, an extensive ablation study provides insights into the efficacy of each component of the proposed model.
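The adaptive pooling step described in the abstract can be illustrated with a short sketch. The snippet below is a hypothetical, minimal interpretation of that strategy, assuming a simple k-standard-deviation rejection rule and normalized saliency weights; the function name, the threshold k, and the normalization scheme are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the adaptive patch-to-image quality pooling:
# (i) reject patch scores that deviate too far from the mean (agreement
# check via standard deviation) and (ii) weight the remaining scores by
# a per-patch saliency value. The threshold k and the saliency
# normalization below are assumptions for illustration only.
import numpy as np

def pool_patch_scores(scores, saliency, k=1.0):
    """Pool local patch quality scores into a single image-level score.

    scores   : predicted quality score per patch
    saliency : visual-saliency weight per patch
    k        : number of standard deviations tolerated around the mean
    """
    scores = np.asarray(scores, dtype=float)
    saliency = np.asarray(saliency, dtype=float)

    # (i) Outlier rejection: keep patches whose score lies within
    #     k standard deviations of the mean score.
    mu, sigma = scores.mean(), scores.std()
    keep = np.abs(scores - mu) <= k * sigma
    if not keep.any():  # degenerate case: fall back to all patches
        keep = np.ones_like(scores, dtype=bool)

    # (ii) Saliency weighting: weight the retained scores by their
    #      visual significance before averaging.
    w = saliency[keep]
    w = w / w.sum() if w.sum() > 0 else np.full(keep.sum(), 1.0 / keep.sum())
    return float(np.dot(scores[keep], w))

# Example usage with dummy patch scores and saliency weights.
print(pool_patch_scores(scores=[3.8, 4.1, 1.2, 4.0],
                        saliency=[0.4, 0.3, 0.1, 0.2]))
```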
Funder
Nouvelle Aquitaine research council
Subject
Electrical and Electronic Engineering; Biochemistry; Instrumentation; Atomic and Molecular Physics, and Optics; Analytical Chemistry