Abstract
In this paper we propose a heuristic to improve the performance of CS-DFN, a recently proposed derivative-free method for nonsmooth optimization. The heuristic is based on a clustering-type technique to compute an estimate of Clarke's generalized gradient of the objective function, obtained by calculating the (approximate) directional derivative along a certain set of directions. A search direction is then computed by applying a nonsmooth Newton-type approach; as the numerical experiments show, this direction is a good descent direction for the objective function. We report numerical results and a comparison with the original CS-DFN method on a set of well-known test problems to demonstrate the utility of the proposed improvement.
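To make the two ingredients mentioned in the abstract concrete (directional-derivative sampling and a gradient estimate built from it), the following minimal Python sketch is offered as an illustration only, not the paper's CS-DFN heuristic: it assumes a forward finite-difference approximation of the directional derivative and a simple least-squares fit for the gradient estimate, and the test function, direction set, and step size `eps` are hypothetical placeholders.

```python
import numpy as np

def approx_directional_derivative(f, x, d, eps=1e-6):
    """Forward-difference estimate of the directional derivative f'(x; d)."""
    return (f(x + eps * d) - f(x)) / eps

def gradient_estimate(f, x, directions, eps=1e-6):
    """Least-squares estimate g of a (generalized) gradient from
    directional-derivative samples: solve D g ~= delta, where each row
    of D is a sampling direction and delta holds the corresponding estimates."""
    D = np.asarray(directions)                     # shape (k, n)
    delta = np.array([approx_directional_derivative(f, x, d, eps) for d in D])
    g, *_ = np.linalg.lstsq(D, delta, rcond=None)  # least-squares solve
    return g

if __name__ == "__main__":
    # Illustrative nonsmooth test function (not from the paper).
    f = lambda x: np.max(np.abs(x))
    x = np.array([1.0, -0.5])
    # Coordinate directions and their negatives as a simple direction set.
    dirs = np.vstack([np.eye(2), -np.eye(2)])
    print(gradient_estimate(f, x, dirs))
```

In the paper, such a gradient estimate would feed a nonsmooth Newton-type step to produce the search direction; that part is not sketched here.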
Funder
Università degli Studi di Roma La Sapienza
Publisher
Springer Science and Business Media LLC
Subject
Control and Optimization; Business, Management and Accounting (miscellaneous)
Cited by
1 article.