An Accelerated Double-Proximal Gradient Algorithm for DC Programming
Published: 2023-11-04
ISSN: 0217-5959
Container-title: Asia-Pacific Journal of Operational Research
Short-container-title: Asia Pac. J. Oper. Res.
Language: en
Author:
Li Gaoxi (1),
Yi Ying (1),
Huang Yingquan (2)
Affiliation:
1. School of Mathematics and Statistics, Chongqing Technology and Business University, Chongqing 400067, P. R. China, ligaoxicn@126.com
2. Chongqing Key Laboratory of Social Economy and Applied Statistics, Chongqing 400067, P. R. China
Abstract
The double-proximal gradient algorithm (DPGA) is a recent variant of the classical difference-of-convex algorithm (DCA) for solving difference-of-convex (DC) optimization problems. In this paper, we propose an accelerated double-proximal gradient algorithm (ADPGA) for DC programming in which the objective function consists of three convex modules, only one of which is smooth. We establish convergence of the sequence generated by our algorithm when the objective function satisfies the Kurdyka–Łojasiewicz (KŁ) property, and we show that its convergence rate is no weaker than that of DPGA. In numerical experiments on an image processing model, ADPGA reduces the number of iterations by 43.57% and the running time by 43.47% on average compared with DPGA.
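For orientation only, a minimal sketch of the problem class and of a generic extrapolated proximal-gradient-type DC step is given below. The splitting g + φ − h, the extrapolation parameter β_k, and the step size γ are illustrative assumptions; this sketch does not reproduce the paper's exact ADPGA update rules.
% Illustrative sketch (assumed splitting): a DC objective with three convex
% modules, where only \varphi is smooth. Requires amsmath.
\[
  \min_{x \in \mathbb{R}^{n}} \; F(x) \;=\; g(x) + \varphi(x) - h(x),
\]
% g, h proper, convex, lower semicontinuous; \varphi convex with an
% L-Lipschitz gradient. A generic accelerated (extrapolated) step then reads:
\begin{align*}
  z^{k}   &= x^{k} + \beta_{k}\,\bigl(x^{k} - x^{k-1}\bigr)
            && \text{(extrapolation, } \beta_{k} \text{ assumed)} \\
  y^{k}   &\in \partial h\bigl(z^{k}\bigr)
            && \text{(subgradient of the concave part)} \\
  x^{k+1} &= \operatorname{prox}_{\gamma g}\!\Bigl(z^{k} - \gamma\bigl(\nabla \varphi(z^{k}) - y^{k}\bigr)\Bigr),
            \qquad 0 < \gamma \le 1/L .
            && \text{(proximal gradient step)}
\end{align*}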
Funder
National Natural Science Foundation of China
National Center for Applied Mathematics
Natural Science Foundation Project of Chongqing, Chongqing Science and Technology Commission
Team Building Project for Graduate Tutors in Chongqing
Publisher
World Scientific Pub Co Pte Ltd
Subject
Management Science and Operations Research