Author:
Lewis, Adrian S.; Overton, Michael L.
Abstract
Optimization problems involving eigenvalues arise in many different mathematical disciplines. This article is divided into two parts. Part I gives a historical account of the development of the field. We discuss various applications that have been especially influential, from structural analysis to combinatorial optimization, and we survey algorithmic developments, including the recent advance of interior-point methods for a specific problem class: semidefinite programming. In Part II we primarily address optimization of convex functions of eigenvalues of symmetric matrices subject to linear constraints. We derive a fairly complete mathematical theory, some of it classical and some of it new. Using the elegant language of conjugate duality theory, we highlight the parallels between the analysis of invariant matrix norms and weakly invariant convex matrix functions. We then restrict our attention further to linear and semidefinite programming, emphasizing the parallel duality theory and comparing primal-dual interior-point methods for the two problem classes. The final section presents some apparently new variational results about eigenvalues of nonsymmetric matrices, unifying known characterizations of the spectral abscissa (related to Lyapunov theory) and the spectral radius (as an infimum of matrix norms).
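For orientation, since the abstract emphasizes semidefinite programming and its duality theory, the following is a minimal sketch of the standard primal-dual SDP pair in a commonly used notation (this notation is assumed, not taken from the article): the data are symmetric matrices $C, A_1,\dots,A_m$ and a vector $b\in\mathbb{R}^m$, $\langle C,X\rangle=\mathrm{trace}(CX)$, and $X\succeq 0$ means $X$ is symmetric positive semidefinite.

\begin{align*}
\text{(P)}\quad & \min_{X}\;\langle C,X\rangle
  && \text{s.t. } \langle A_i,X\rangle = b_i,\; i=1,\dots,m,\quad X\succeq 0,\\
\text{(D)}\quad & \max_{y,S}\; b^{\mathsf T}y
  && \text{s.t. } \sum_{i=1}^{m} y_i A_i + S = C,\quad S\succeq 0.
\end{align*}

Linear programming is the special case in which all the matrices are diagonal, which is why the duality theories of the two problem classes run in parallel. The characterization of the spectral radius mentioned at the end of the abstract is the classical fact that, for any square matrix $A$,

\[
\rho(A) \;=\; \inf\bigl\{\, \|A\| \;:\; \|\cdot\| \text{ an induced matrix norm} \,\bigr\},
\]

the infimum being approached by norms adapted to $A$ but in general not attained.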
Publisher
Cambridge University Press (CUP)
Subject
General Mathematics, Numerical Analysis
Cited by
174 articles.