Artistic Style Recognition: Combining Deep and Shallow Neural Networks for Painting Classification
-
Published: 2023-11-07
Issue: 22
Volume: 11
Page: 4564
-
ISSN: 2227-7390
-
Container-title: Mathematics
-
Language: en
-
Short-container-title: Mathematics
Author:
Imran Saqib 1, Naqvi Rizwan Ali 2, Sajid Muhammad 3, Malik Tauqeer Safdar 4, Ullah Saif 3, Moqurrab Syed Atif 5, Yon Dong Keon 6
Affiliation:
1. Department of Computer Science, Muhammad Nawaz Sharif University of Agriculture, Multan 66000, Pakistan
2. Department of Intelligent Mechatronics Engineering, Sejong University, Seoul 05006, Republic of Korea
3. Department of Computer Science, Air University Islamabad, Multan Campus, Multan 60001, Pakistan
4. Department of Information Technology, Bahauddin Zakariya University, Multan 60800, Pakistan
5. School of Computing, Gachon University, Seongnam 13120, Republic of Korea
6. Center for Digital Health, Medical Science Research Institute, Kyung Hee University Medical Center, Kyung Hee University College of Medicine, Seoul 02447, Republic of Korea
Abstract
This study’s main goal is to create a useful software application for finding and classifying fine art images in museums and art galleries. As art collections are digitized, there is a growing need for tools that can quickly analyze and organize them by artistic style. The proposed technique consists of two phases that together increase the accuracy of style classification. In the first phase, the input image is split into five sub-patches, and a deep convolutional neural network (DCNN) trained specifically for this task classifies each patch individually. The second phase is a decision-making module built on a shallow neural network, trained on the probability vectors produced by the first-phase classifier; it combines the results of the five patches to infer the final style of the input image. A key advantage of this approach is that the second phase operates on probability vectors rather than images and is trained separately from the first phase, which helps compensate for errors made in the first phase and improves the accuracy of the final classification. To evaluate the proposed method, six pre-trained CNN models, namely AlexNet, VGG-16, VGG-19, GoogLeNet, ResNet-50, and InceptionV3, were employed as first-phase classifiers, with a shallow neural network as the second-phase classifier. Experiments were conducted on four representative art datasets: the Australian Native Art dataset, the WikiArt dataset, ILSVRC, and Pandora 18k. The findings show that the proposed strategy clearly surpasses existing methods in style-classification accuracy and precision. Overall, the study contributes to efficient software systems for analyzing and categorizing fine art images, making them more accessible to the general public through digital platforms. Using pre-trained models alone, we attained an accuracy of 90.7%; with fine-tuning and transfer learning, the model reached a higher accuracy of 96.5%.
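The two-phase pipeline described above can be illustrated with a minimal PyTorch/torchvision sketch. The patch layout (four corners plus a centre crop), the ResNet-50 backbone, the fusion-layer sizes, and the class count are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch of the two-phase style classifier described in the abstract.
# Patch layout, backbone choice, and layer sizes are assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_STYLES = 18  # assumed class count (e.g., Pandora 18k)

def five_patches(img):
    """Split a CHW tensor into four corner patches plus a centre patch."""
    _, h, w = img.shape
    ph, pw = h // 2, w // 2
    corners = [img[:, :ph, :pw], img[:, :ph, pw:],
               img[:, ph:, :pw], img[:, ph:, pw:]]
    centre = img[:, h // 4: h // 4 + ph, w // 4: w // 4 + pw]
    return corners + [centre]

# Phase 1: a pre-trained CNN (here ResNet-50) with its head replaced
# by a style-classification layer, to be fine-tuned on art data.
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_STYLES)

# Phase 2: a shallow network trained on the concatenated probability
# vectors of the five patches, producing the final style decision.
fusion = nn.Sequential(
    nn.Linear(5 * NUM_STYLES, 64),
    nn.ReLU(),
    nn.Linear(64, NUM_STYLES),
)

resize = transforms.Resize((224, 224), antialias=True)

def classify(img):
    """img: float tensor of shape (3, H, W); returns a predicted style index."""
    backbone.eval()
    fusion.eval()
    with torch.no_grad():
        probs = [torch.softmax(backbone(resize(p).unsqueeze(0)), dim=1)
                 for p in five_patches(img)]          # five (1, NUM_STYLES) vectors
        fused = fusion(torch.cat(probs, dim=1))       # (1, 5*NUM_STYLES) -> (1, NUM_STYLES)
        return fused.argmax(dim=1).item()
```

In this sketch the fusion network is trained separately, using the frozen phase-one probability vectors as its inputs, which mirrors the abstract's point that the second phase can correct errors made by the first.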
Funder
Korea Health Industry Development Institute; National Research Foundation
Subject
General Mathematics, Engineering (miscellaneous), Computer Science (miscellaneous)
References: 48 articles.
Cited by: 2 articles.