Authors
Bennet Gebken, Sebastian Peitz
Abstract
We present an efficient descent method for unconstrained, locally Lipschitz multiobjective optimization problems. The method is realized by combining a theoretical result regarding the computation of descent directions for nonsmooth multiobjective optimization problems with a practical method to approximate the subdifferentials of the objective functions. We show convergence to points which satisfy a necessary condition for Pareto optimality. Using a set of test problems, we compare our method with the multiobjective proximal bundle method by Mäkelä. The results indicate that our method is competitive while being easier to implement. Although the number of objective function evaluations is larger, the overall number of subgradient evaluations is smaller. Our method can be combined with a subdivision algorithm to compute entire Pareto sets of nonsmooth problems. Finally, we demonstrate how our method can be used for solving sparse optimization problems, which are present in many real-life applications.
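To illustrate the core idea, the following is a minimal sketch of how a common descent direction for a nonsmooth bi-objective problem can be obtained from approximate subdifferentials: gradients of both objectives are sampled in a small ball around the current point (a crude stand-in for the ε-subdifferential approximation described in the paper), and the direction is taken as the negative of the minimal-norm element of their convex hull, computed via a small quadratic program over the unit simplex. The objectives f1, f2, the sampling scheme, and the SLSQP solver are illustrative assumptions, not the authors' exact algorithm.

```python
# Hedged sketch: common descent direction for a nonsmooth bi-objective problem.
# NOT the authors' implementation; objectives, sampling, and QP solver are assumptions.
import numpy as np
from scipy.optimize import minimize


def f1(x):  # nonsmooth objective 1 (illustrative only)
    return abs(x[0]) + x[1] ** 2


def f2(x):  # nonsmooth objective 2 (illustrative only)
    return (x[0] - 1.0) ** 2 + abs(x[1])


def grad_f1(x):  # gradient where f1 is differentiable (almost everywhere)
    return np.array([np.sign(x[0]), 2.0 * x[1]])


def grad_f2(x):
    return np.array([2.0 * (x[0] - 1.0), np.sign(x[1])])


def sampled_subgradients(x, eps=1e-2, n_samples=10, rng=None):
    """Crude approximation of the eps-subdifferentials of both objectives:
    gradients evaluated at random points inside an eps-ball around x."""
    rng = np.random.default_rng(0) if rng is None else rng
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(n_samples, x.size))
    return np.vstack([grad_f1(p) for p in pts] + [grad_f2(p) for p in pts])


def common_descent_direction(G):
    """Minimal-norm element v of conv{rows of G}, found by a small QP over the
    unit simplex; -v is a common descent direction and ||v|| a criticality measure."""
    m = G.shape[0]
    lam0 = np.full(m, 1.0 / m)
    res = minimize(
        lambda lam: 0.5 * np.dot(lam @ G, lam @ G),
        lam0,
        bounds=[(0.0, 1.0)] * m,
        constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
        method="SLSQP",
    )
    v = res.x @ G  # minimal-norm convex combination of the sampled subgradients
    return -v, np.linalg.norm(v)


if __name__ == "__main__":
    x = np.array([0.3, -0.2])
    G = sampled_subgradients(x)
    d, crit = common_descent_direction(G)
    print("point:", x, "direction:", d, "criticality measure:", crit)
    # For a small step along d, neither objective should increase.
    t = 1e-2
    print("f1:", f1(x), "->", f1(x + t * d))
    print("f2:", f2(x), "->", f2(x + t * d))
```

In this sketch, a small ||v|| indicates the point is approximately Pareto critical; otherwise, -v would be combined with a step-size rule (e.g., an Armijo-type line search on all objectives) inside a descent loop.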
Publisher
Springer Science and Business Media LLC
Subject
Applied Mathematics, Management Science and Operations Research, Control and Optimization
Cited by
14 articles.