Conjugate gradient squared method

In numerical linear algebra, the conjugate gradient squared method (CGS) is an iterative algorithm for solving systems of linear equations of the form A𝐱 = 𝐛, particularly in cases where computing the transpose A^T is impractical.[1] The CGS method was developed as an improvement to the biconjugate gradient method.[2][3][4]
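
Implementations of CGS are available in several numerical libraries. As a minimal usage sketch (not part of the cited sources), the following Python snippet calls SciPy's scipy.sparse.linalg.cgs on a small nonsymmetric system; the matrix and right-hand side are arbitrary values chosen only for illustration.

import numpy as np
from scipy.sparse.linalg import cgs

# A small nonsymmetric test system (values chosen only for illustration).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 2.0],
              [0.0, 1.0, 5.0]])
b = np.array([1.0, 2.0, 3.0])

# cgs returns the approximate solution and an info flag (0 means converged).
x, info = cgs(A, b)
print(x, info)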

Background

A system of linear equations A𝐱 = 𝐛 consists of a known matrix A and a known vector 𝐛. To solve the system is to find the value of the unknown vector 𝐱.[3][5] A direct method for solving a system of linear equations is to take the inverse of the matrix A, then calculate 𝐱 = A^(−1)𝐛. However, computing the inverse is computationally expensive. Hence, iterative methods are commonly used. Iterative methods begin with a guess 𝐱^(0), and on each iteration the guess is improved. Once the difference between successive guesses is sufficiently small, the method has converged to a solution.[6][7]
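
To make the distinction concrete, the sketch below (an illustration added here, not drawn from the cited sources) solves a small system directly and then with plain Jacobi iteration, a simple iterative method that stops once successive guesses agree to a chosen tolerance. The matrix, right-hand side, and tolerance are assumptions made for the example.

import numpy as np

# A small, diagonally dominant system (values chosen only so Jacobi converges).
A = np.array([[4.0, 1.0],
              [2.0, 5.0]])
b = np.array([1.0, 2.0])

# Direct approach: conceptually x = A^(-1) b (solve avoids forming the inverse).
x_direct = np.linalg.solve(A, b)

# Iterative approach: start from a guess and improve it on each iteration (Jacobi).
x = np.zeros_like(b)          # initial guess x^(0)
D = np.diag(A)                # diagonal of A
for k in range(100):
    x_new = (b - (A @ x - D * x)) / D       # improved guess
    if np.linalg.norm(x_new - x) < 1e-10:   # successive guesses close enough
        x = x_new
        break
    x = x_new

print(x_direct, x)  # the two approaches agree to the chosen tolerance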

As with the conjugate gradient method, biconjugate gradient method, and similar iterative methods for solving systems of linear equations, the CGS method can be used to find solutions to multi-variable optimisation problems, such as power-flow analysis, hyperparameter optimisation, and facial recognition.[8]

Algorithm

The algorithm is as follows; a short code sketch of the same steps is given after the list:[9]

  1. Choose an initial guess 𝐱^(0)
  2. Compute the residual 𝐫^(0) = 𝐛 − A𝐱^(0)
  3. Choose 𝐫̃ = 𝐫^(0)
  4. For i = 1, 2, 3, … do:
    1. ρ^(i−1) = 𝐫̃^T 𝐫^(i−1)
    2. If ρ^(i−1) = 0, the method fails.
    3. If i = 1:
      1. 𝐩^(1) = 𝐮^(1) = 𝐫^(0)
    4. Else:
      1. β^(i−1) = ρ^(i−1)/ρ^(i−2)
      2. 𝐮^(i) = 𝐫^(i−1) + β^(i−1)𝐪^(i−1)
      3. 𝐩^(i) = 𝐮^(i) + β^(i−1)(𝐪^(i−1) + β^(i−1)𝐩^(i−1))
    5. Solve M𝐩̂ = 𝐩^(i), where M is a pre-conditioner.
    6. 𝐯̂ = A𝐩̂
    7. α^(i) = ρ^(i−1)/(𝐫̃^T𝐯̂)
    8. 𝐪^(i) = 𝐮^(i) − α^(i)𝐯̂
    9. Solve M𝐮̂ = 𝐮^(i) + 𝐪^(i)
    10. 𝐱^(i) = 𝐱^(i−1) + α^(i)𝐮̂
    11. 𝐪̂ = A𝐮̂
    12. 𝐫^(i) = 𝐫^(i−1) − α^(i)𝐪̂
    13. Check for convergence: if there is convergence, end the loop and return the result
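
The loop above translates almost line for line into code. The following is a minimal NumPy sketch of the pre-conditioned iteration (written to follow the steps listed here, not taken verbatim from the cited references); the function name, the default identity pre-conditioner, and the relative-residual stopping test are assumptions made for illustration.

import numpy as np

def cgs_solve(A, b, x0=None, M_solve=None, tol=1e-8, max_iter=1000):
    # A: (n, n) system matrix (need not be symmetric); b: right-hand side.
    # M_solve: callable solving M z = y for the pre-conditioner M
    # (defaults to the identity, i.e. no pre-conditioning).
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float).copy()  # step 1
    if M_solve is None:
        M_solve = lambda y: y

    r = b - A @ x                 # step 2: initial residual
    r_tilde = r.copy()            # step 3: shadow residual
    rho_prev = 1.0
    p = q = u = np.zeros(n)

    for i in range(1, max_iter + 1):
        rho = r_tilde @ r                       # step 4.1
        if rho == 0.0:                          # step 4.2: breakdown
            raise RuntimeError("CGS breakdown: rho = 0")
        if i == 1:                              # step 4.3
            u = r.copy()
            p = u.copy()
        else:                                   # step 4.4
            beta = rho / rho_prev
            u = r + beta * q
            p = u + beta * (q + beta * p)
        p_hat = M_solve(p)                      # step 4.5
        v_hat = A @ p_hat                       # step 4.6
        alpha = rho / (r_tilde @ v_hat)         # step 4.7
        q = u - alpha * v_hat                   # step 4.8
        u_hat = M_solve(u + q)                  # step 4.9
        x = x + alpha * u_hat                   # step 4.10
        q_hat = A @ u_hat                       # step 4.11
        r = r - alpha * q_hat                   # step 4.12
        rho_prev = rho
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):  # step 4.13
            break
    return x

On well-conditioned problems this loop produces the same solution as an off-the-shelf CGS routine, at a cost of two matrix-vector products per iteration (with A𝐩̂ and A𝐮̂).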

References

  1. ↑ "Conjugate Gradient Squared Method". Wolfram Mathworld. https://mathworld.wolfram.com/ConjugateGradientSquaredMethod.html. 
  2. ↑ Mathworks. "cgs". https://au.mathworks.com/help/matlab/ref/cgs.html. 
  3. ↑ 3.0 3.1 Henk van der Vorst (2003). "Bi-Conjugate Gradients". Iterative Krylov Methods for Large Linear Systems. Cambridge University Press. ISBN 0-521-81828-1. 
  4. ↑ Peter Sonneveld (1989). "CGS, A Fast Lanczos-Type Solver for Nonsymmetric Linear systems". SIAM Journal on Scientific and Statistical Computing 10 (1): 36–52. doi:10.1137/0910004. ProQuest 921988114. https://www.proquest.com/docview/921988114. 
  5. ↑ "Linear equations", Matrix Analysis and Applied Linear Algebra, Philadelphia, PA: SIAM, pp. 1-40, doi:10.1137/1.9780898719512.ch1, http://www.matrixanalysis.com/Chapter1.pdf, retrieved 2023-12-18 
  6. ↑ "Iterative Methods for Linear Systems". Mathworks. https://au.mathworks.com/help/matlab/math/iterative-methods-for-linear-systems.html. 
  7. ↑ Jean Gallier. "Iterative Methods for Solving Linear Systems". UPenn. https://www.cis.upenn.edu/~cis5150/cis515-12-sl5.pdf. 
  8. ↑ "Conjugate gradient methods". Cornell University. https://optimization.cbe.cornell.edu/index.php?title=Conjugate_gradient_methods. 
  9. ↑ R. Barrett; M. Berry; T. F. Chan; J. Demmel; J. Donato; J. Dongarra; V. Eijkhout; R. Pozo et al. (1994). Templates for the Solution of Linear Systems: Building Blocks for Iterative Methods, 2nd Edition. SIAM. https://netlib.org/linalg/html_templates/Templates.html.