Abstract
Approximately sixty years ago, two seminal findings, the cutting-plane and the subgradient methods, radically changed the landscape of mathematical programming. They provided, for the first time, a practical means to optimize real functions of several variables characterized by kinks, namely discontinuities in their derivatives. Convex functions, for which a superb body of theoretical research was growing in parallel, naturally became the main field of application. The aim of this paper is to give a concise survey of the key ideas underlying the successive development of the area, which took the name of numerical nonsmooth optimization. The focus is, in particular, on the research mainstreams generated under the impulse of the two initial discoveries.
Publisher
Springer Science and Business Media LLC
Subject
Management Science and Operations Research, General Decision Sciences
References
149 articles.
Cited by
2 articles.