Abstract
Forsythe formulated a conjecture about the asymptotic behavior of the restarted conjugate gradient method in 1968. We translate several of his results into modern terms, and pose an analogous version of the conjecture (originally formulated only for symmetric positive definite matrices) for symmetric and nonsymmetric matrices. Our version of the conjecture uses a two-sided or cross iteration with the given matrix and its transpose, which is based on the projection process used in the Arnoldi (or for symmetric matrices the Lanczos) algorithm. We prove several new results about the limiting behavior of this iteration, but the conjecture remains largely open. We hope that our paper motivates further research that eventually leads to a proof of the conjecture.
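The paper's precise cross iteration is defined in the full text; as a hedged, illustrative sketch of the setting of Forsythe's original conjecture only, the following Python snippet runs restarted conjugate gradients with a fixed restart length on a random symmetric positive definite test matrix and prints the inner products of consecutive normalized error directions, whose conjectured asymptotic behavior is an alternation between two limit vectors. All concrete choices (problem size, restart length, spectrum, number of cycles) are assumptions made for illustration and are not taken from the paper.

```python
# Illustrative sketch (not the paper's cross iteration): restarted CG on an SPD
# system, observing the normalized error directions after each restart cycle.
import numpy as np

def cg_cycle(A, b, x0, s):
    """Run s steps of (unrestarted) conjugate gradients for A x = b from x0."""
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    for _ in range(s):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x

rng = np.random.default_rng(0)
n, s = 8, 3                                       # illustrative size and restart length
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 100.0, n)) @ Q.T  # SPD test matrix
x_exact = rng.standard_normal(n)
b = A @ x_exact

x = np.zeros(n)
prev = None
for k in range(12):                               # restart cycles
    x = cg_cycle(A, b, x, s)
    e = x_exact - x
    d = e / np.linalg.norm(e)                     # normalized error direction
    if prev is not None:
        # Inner product of consecutive directions; the conjectured asymptotic
        # behavior is an alternation between two fixed limit vectors.
        print(k, float(prev @ d))
    prev = d
```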
Funder
Grantová Agentura České Republiky
Technische Universität Berlin
Publisher
Springer Science and Business Media LLC
Subject
Applied Mathematics, Computational Mathematics, Computer Networks and Communications, Software