An efficient deflation technique for the communication-avoiding conjugate gradient. (English) Zbl 1312.65040
Summary: By fusing \(s\) loop iterations, communication-avoiding formulations of Krylov subspace methods can asymptotically reduce sequential and parallel communication costs by a factor of \(O(s)\). Although a number of communication-avoiding Krylov methods have been developed, there remains a serious lack of communication-avoiding preconditioners to accompany them. This has stimulated active research into which preconditioners can be made compatible with communication-avoiding Krylov methods and into developing communication-avoiding methods that incorporate them. In this paper we demonstrate, for the first time, that deflation preconditioning can be applied in communication-avoiding formulations of Lanczos-based Krylov methods such as the conjugate gradient method while maintaining an \(O(s)\) reduction in communication costs. We derive a deflated version of a communication-avoiding conjugate gradient method, which is mathematically equivalent to the deflated conjugate gradient method of Y. Saad et al. [SIAM J. Sci. Comput. 21, No. 5, 1909–1926 (2000; Zbl 0955.65021)]. Numerical experiments on a model problem demonstrate that the communication-avoiding formulations converge at rates comparable to the classical formulations, even for large values of \(s\). Performance modeling illustrates that \(O(s)\) speedups are possible when performance is communication bound. These results motivate deflation as a promising preconditioner for communication-avoiding Krylov subspace methods in practice.
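To make the underlying technique concrete, the following is a minimal sketch of the classical deflated CG recurrence of Saad et al. that the paper's communication-avoiding variant reproduces. All names (`deflated_cg`, the deflation basis `W`, the tolerance) are illustrative assumptions, not identifiers from the paper; the sketch uses dense NumPy arrays rather than a sparse or \(s\)-step formulation.

```python
import numpy as np

def deflated_cg(A, b, W, x0=None, tol=1e-10, maxiter=500):
    """Deflated CG sketch (after Saad et al., SIAM J. Sci. Comput. 21(5), 2000).

    A : SPD matrix (n x n); b : right-hand side; W : n x k deflation basis.
    The initial residual is projected so that W^T r0 = 0, and every search
    direction is kept A-orthogonal to range(W).
    """
    n = b.size
    x = np.zeros(n) if x0 is None else x0.copy()
    AW = A @ W                      # A-image of the deflation basis
    WtAW = W.T @ AW                 # small k x k Galerkin matrix
    # Project the initial guess so the residual is orthogonal to range(W)
    r = b - A @ x
    x = x + W @ np.linalg.solve(WtAW, W.T @ r)
    r = b - A @ x
    # First direction: A-orthogonalize r against range(W)
    p = r - W @ np.linalg.solve(WtAW, AW.T @ r)
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        beta = rs_new / rs
        # Extra W-term keeps the new direction A-orthogonal to range(W)
        p = r + beta * p - W @ np.linalg.solve(WtAW, AW.T @ r)
        rs = rs_new
    return x
```

The communication-avoiding formulation derived in the paper fuses \(s\) iterations of this recurrence so that the global reductions (the dot products and the small \(k \times k\) solves) are amortized over \(s\) steps; the sketch above shows only the classical, one-iteration-at-a-time version.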
MSC:
65F10 Iterative numerical methods for linear systems
65F08 Preconditioners for iterative methods
65F50 Computational methods for sparse matrices
65Y05 Parallel numerical computation
65Y20 Complexity and performance of numerical algorithms