Affiliation:
1. University of Haifa
2. Technion-Israel Institute of Technology
Abstract
Given a linear system Ax = b, one can construct a related "normal equations" system AA^T y = b, x = A^T y. Björck and Elfving have shown that the SSOR algorithm, applied to the normal equations, can be accelerated by the conjugate gradient algorithm (CG). The resulting algorithm, called CGMN, is error-reducing and, in theory, always converges, even when the equation system is inconsistent and/or nonsquare. SSOR on the normal equations is equivalent to the Kaczmarz algorithm (KACZ), with a fixed relaxation parameter, run in a double (forward and backward) sweep on the original equations. CGMN was tested on nine well-known large, sparse linear systems obtained by central-difference discretization of elliptic convection-diffusion partial differential equations (PDEs). Eight of the PDEs were strongly convection-dominated, and these are known to produce very stiff systems with large off-diagonal elements. CGMN was compared with some of the foremost state-of-the-art Krylov subspace methods: restarted GMRES, Bi-CGSTAB, and CGS, each tested both with and without various preconditioners. CGMN converged in all cases, while none of the other algorithm/preconditioner combinations achieved this level of robustness. Furthermore, as the grid was refined, the number of iterations increased only gradually. On the eight convection-dominated cases, the initial convergence rate of CGMN was better than that of all the other combinations of algorithms and preconditioners, and the residual decreased monotonically. The CGNR algorithm was also tested; it was as robust as CGMN, but slower.
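As a point of reference, the following is a minimal sketch of the building block described above: one fixed-relaxation Kaczmarz sweep run forward and then backward over the rows, which the abstract identifies with one SSOR iteration on the normal equations. It assumes NumPy; the function name, the small test system, and the default omega = 1.0 are illustrative choices, not taken from the paper.

```python
import numpy as np

def kaczmarz_double_sweep(A, b, x, omega=1.0):
    """One forward-and-backward Kaczmarz (KACZ) sweep with a fixed
    relaxation parameter omega, applied to the rows of Ax = b."""
    m = A.shape[0]
    row_norms = np.einsum('ij,ij->i', A, A)          # squared 2-norms of the rows
    order = list(range(m)) + list(range(m - 1, -1, -1))  # forward, then backward
    for i in order:
        if row_norms[i] == 0.0:
            continue
        # Relaxed projection of x onto the hyperplane <a_i, x> = b_i
        x = x + omega * (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Illustrative use: repeated double sweeps approach a solution of Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([5.0, 5.0])
x = np.zeros(2)
for _ in range(50):
    x = kaczmarz_double_sweep(A, b, x)
print(x)  # close to the solution [1, 2]
```

In CGMN, CG acceleration is applied on top of this double sweep; the sketch shows only the unaccelerated KACZ iteration.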
Publisher
Association for Computing Machinery (ACM)
Subject
Applied Mathematics, Software
Cited by
19 articles.