Least squares, linear

Let A be an (m,n) matrix with m ≥ n and b an (m,1) matrix. We want to consider the problem

$$Ax \cong b$$

where ≅ stands for the best approximate solution in the least squares sense, i.e. we want to minimize the Euclidean norm of the residual r = Ax - b:

$$\min_x \, \|r\|_2 = \min_x \, \|Ax - b\|_2 = \min_x \left( \sum_{i=1}^{m} r_i^2 \right)^{1/2}$$

Equivalently, we want to find the vector x for which Ax is the point closest to b in the column space of A.
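
As a small numerical sketch of this geometric picture (NumPy is assumed, and the matrix and right-hand side below are made up purely for illustration), the least-squares residual is orthogonal to the column space of A, so Ax is the projection of b onto that space:

```python
import numpy as np

# Made-up overdetermined system: m = 4 equations, n = 2 unknowns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.5, 4.0])

# Least-squares solution: minimizes the Euclidean norm of r = Ax - b.
x, *_ = np.linalg.lstsq(A, b, rcond=None)

r = A @ x - b
print("x       =", x)
print("||r||_2 =", np.linalg.norm(r))
# A^T r is numerically ~0: the residual is orthogonal to the columns of A,
# i.e. A x is the orthogonal projection of b onto the column space of A.
print("A^T r   =", A.T @ r)
```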

Among the different methods for solving this problem, we mention Normal Equations (sometimes ill-conditioned), QR Decomposition and, most generally, Singular Value Decomposition. For further reading, see e.g. Golub89, Branham90, Wong92, Press95.
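
The sketch below contrasts these three approaches on an assumed toy system (illustrative numbers only, using plain NumPy). For a well-conditioned A all three give essentially the same x; the normal equations are the cheapest but square the condition number of A, which is why they can be ill-conditioned:

```python
import numpy as np

# Illustrative overdetermined system (not taken from the text).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([2.1, 2.9, 4.2, 4.8])

# 1) Normal equations: solve (A^T A) x = A^T b.
#    Simple, but cond(A^T A) = cond(A)^2, hence sometimes ill-conditioned.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# 2) QR decomposition: A = Q R (Q with orthonormal columns, R upper
#    triangular), then solve R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# 3) Singular value decomposition: x = V diag(1/s_i) U^T b (pseudoinverse);
#    the most general method, and small singular values expose rank deficiency.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)

print(x_ne, x_qr, x_svd)   # all three agree for this well-conditioned A
```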

Example: Let us consider the problem of finding the closest point (vertex) to measurements on straight lines (e.g. trajectories emanating from a particle collision).

This problem can be described by Ax = b with

$$A = \begin{pmatrix} a_1 & b_1 \\ a_2 & b_2 \\ \vdots & \vdots \\ a_m & b_m \end{pmatrix}, \qquad b = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_m \end{pmatrix},$$

where each measured straight line is written as a_i x_1 + b_i x_2 = c_i, i.e. one row of A and one component of b per line.

This is clearly an inconsistent system of linear equations, with more equations than unknowns, a frequently occurring problem in experimental data analysis. The system is, however, not very inconsistent, and there is a point that lies "nearly" on all straight lines. The solution can be found with the linear least squares method, e.g. by QR decomposition for solving Ax = b:

$$A = QR, \qquad Rx = Q^{\mathsf{T}} b \quad\Longrightarrow\quad x = R^{-1} Q^{\mathsf{T}} b,$$

where Q has orthonormal columns and R is upper triangular, so that x is obtained by a single back-substitution.
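
Because the numerical values of the original example are not reproduced here, the following sketch uses three hypothetical measured lines (x1 = 1, x2 = 2, x1 + x2 = 3.2, chosen only so that the system is slightly inconsistent) and computes the vertex by QR decomposition as described above:

```python
import numpy as np

# Hypothetical measured lines a_i*x1 + b_i*x2 = c_i (illustrative values only):
#   x1       = 1.0
#   x2       = 2.0
#   x1 + x2  = 3.2
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.2])

# Vertex = least-squares solution via QR decomposition.
Q, R = np.linalg.qr(A)
x = np.linalg.solve(R, Q.T @ b)

print("vertex   =", x)          # ~ (1.067, 2.067), "nearly" on all three lines
print("residual =", A @ x - b)  # small, since the system is only slightly
                                # inconsistent
```

For these assumed lines the vertex comes out at about (1.07, 2.07), close to but not exactly on each of the three lines, which is exactly the behaviour described above.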