Abstract
Computing more than one eigenvalue for (large sparse) one-parameter polynomial and general nonlinear eigenproblems, as well as for multiparameter linear and nonlinear eigenproblems, is a much harder task than for standard eigenvalue problems. We present simple but efficient selection methods based on divided differences to do this. Selection means that the approximate eigenpair is picked from candidate pairs that satisfy a suitable criterion. The goal of this procedure is to steer the process away from already detected pairs. In contrast to locking techniques, it is not necessary to keep converged eigenvectors in the search space, so that the entire search space may be devoted to new information. The selection techniques are applicable to many types of matrix eigenvalue problems; standard deflation is feasible only for linear one-parameter problems. The methods are easy to understand and implement. Although the use of divided differences is well known in the context of nonlinear eigenproblems, the proposed selection techniques are new for one-parameter problems. For multiparameter problems, we improve on and generalize our previous work. We also show how to use divided differences in the framework of homogeneous coordinates, which may be appropriate for generalized eigenvalue problems with infinite eigenvalues. While the approaches are valuable alternatives for one-parameter nonlinear eigenproblems, they seem the only option for multiparameter problems.
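To give a concrete flavor of the kind of divided-difference selection criterion the abstract describes, the following is a minimal Python sketch, not the authors' implementation. It assumes a nonlinear eigenproblem T(λ)x = 0 and uses the property that x*T[λ, θ]u, with T[λ, θ] the matrix divided difference, vanishes for two distinct exact eigenpairs; candidates for which this quantity is large relative to a tolerance are skipped as likely copies of already detected pairs. The function names, the threshold eta, and the toy quadratic problem are illustrative assumptions only; the precise criterion and its analysis are given in the paper.

```python
import numpy as np

def divided_difference(T, mu, nu, h=1e-7):
    """Matrix divided difference T[mu, nu] = (T(mu) - T(nu)) / (mu - nu);
    for mu very close to nu, fall back to a central-difference estimate of T'(mu)."""
    if abs(mu - nu) < 1e-12:
        return (T(mu + h) - T(mu - h)) / (2 * h)
    return (T(mu) - T(nu)) / (mu - nu)

def select_new_pair(T, candidates, detected, eta=1e-2):
    """Return the first candidate Ritz pair (theta, u) that does not resemble
    any already detected eigenpair (lam, x).  For two distinct exact eigenpairs
    the quantity x^* T[lam, theta] u is zero, whereas for a near-copy of a
    detected pair it is roughly x^* T'(lam) x, which is generically nonzero,
    so candidates with a large value are rejected (illustrative criterion)."""
    for theta, u in candidates:            # assumed sorted by residual norm
        u = u / np.linalg.norm(u)
        if all(abs(x.conj() @ divided_difference(T, lam, theta) @ u) <= eta
               for lam, x in detected):
            return theta, u
    return None                            # nothing admissible: expand the search space

# Toy usage on a quadratic eigenvalue problem T(lam) = lam^2*M + lam*C + K
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 6
    M, C, K = np.eye(n), rng.standard_normal((n, n)), rng.standard_normal((n, n))
    T = lambda lam: lam**2 * M + lam * C + K
    # 'candidates' and 'detected' would come from an outer subspace method
    candidates = [(0.3 + 0.1j, rng.standard_normal(n)) for _ in range(3)]
    detected = []
    print(select_new_pair(T, candidates, detected))
```

Note how this mirrors the abstract's point about locking: converged eigenvectors are only consulted inside the selection test and never need to occupy the search space itself.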
Funder
Nederlandse Organisatie voor Wetenschappelijk Onderzoek
Javna Agencija za Raziskovalno Dejavnost RS
Publisher
Springer Science and Business Media LLC
Subject
Computational Mathematics, Algebra and Number Theory
Cited by
5 articles.