An Improved Adam’s Algorithm for Stomach Image Classification
Published: 2024-06-21
Issue: 7
Volume: 17
Page: 272
ISSN: 1999-4893
Container-title: Algorithms
Language: en
Short-container-title: Algorithms
Authors:
Sun Haijing 1, Yu Hao 2, Shao Yichuan 1, Wang Jiantao 2, Xing Lei 3, Zhang Le 1, Zhao Qian 4
Affiliations:
1. School of Intelligent Science and Engineering, Shenyang University, Shenyang 110044, China
2. School of Information Engineering, Shenyang University, Shenyang 110044, China
3. School of Chemistry and Chemical Engineering, University of Surrey, Guildford GU2 7XH, UK
4. School of Science, Shenyang University of Technology, Shenyang 110044, China
Abstract
Current stomach disease detection and diagnosis are challenged by the complexity and high dimensionality of the data and require effective deep learning algorithms to improve diagnostic accuracy. To address this challenge, this paper proposes an improved strategy based on the Adam algorithm, which aims to mitigate local optima, overfitting, and slow convergence through a controlled restart strategy and a joint gradient-norm clipping technique. The improved algorithm is abbreviated as CG-Adam. The controlled restart strategy periodically checks the step count and triggers a restart operation once it reaches a preset restart period, after which the optimization process resumes; this helps the algorithm escape local optima and maintain convergence stability. Meanwhile, joint gradient-norm clipping combines element-wise gradient clipping with norm clipping, which avoids the gradient explosion and gradient vanishing problems and helps accelerate convergence by restricting the gradient and its norm to a suitable range. To verify the effectiveness of CG-Adam, experiments are carried out on the MNIST, CIFAR10, and Stomach datasets, comparing it with the Adam algorithm as well as other currently popular optimization algorithms. The results show that the improved algorithm achieves accuracies of 98.59%, 70.7%, and 73.2% on the MNIST, CIFAR10, and Stomach datasets, respectively, surpassing the Adam algorithm. These results not only confirm that CG-Adam accelerates model convergence and improves generalization performance but also demonstrate its broad potential and practical value in the field of medical image recognition.
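As a concrete illustration of the two mechanisms described above, the following is a minimal Python sketch of a single CG-Adam update step, assuming element-wise value clipping followed by global norm clipping, and a restart that resets the step counter and both moment estimates once a preset period is reached. The function name cg_adam_step and the hyperparameter values (clip_value, clip_norm, restart_period) are illustrative assumptions, not taken from the paper.

import numpy as np

def cg_adam_step(param, grad, m, v, t,
                 lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8,
                 clip_value=1.0, clip_norm=5.0, restart_period=1000):
    """One CG-Adam update as sketched from the abstract (illustrative only).

    clip_value, clip_norm, and restart_period are assumed hyperparameters,
    not values reported in the paper.
    """
    # Controlled restart: once the step counter reaches the preset restart
    # period, reset the counter and both moment estimates, then resume.
    if t >= restart_period:
        t, m, v = 0, np.zeros_like(m), np.zeros_like(v)
    t += 1

    # Joint clipping: element-wise value clipping followed by global norm
    # clipping, restricting the gradient and its norm to a suitable range.
    grad = np.clip(grad, -clip_value, clip_value)
    norm = np.linalg.norm(grad)
    if norm > clip_norm:
        grad = grad * (clip_norm / norm)

    # Standard Adam moment estimates with bias correction.
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)

    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v, t

# Usage: minimize f(x) = ||x - 3||^2 from a random start.
x = np.random.randn(5)
m, v, t = np.zeros_like(x), np.zeros_like(x), 0
for _ in range(2000):
    grad = 2.0 * (x - 3.0)
    x, m, v, t = cg_adam_step(x, grad, m, v, t, lr=0.05)
print(x)  # converges toward [3, 3, 3, 3, 3]

In a deep learning framework such as PyTorch, the same idea would typically be packaged as a custom torch.optim.Optimizer subclass that clips gradients before the moment updates and zeroes its state buffers at each restart.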