Least squares, linear
Let A be an (m, n) matrix with m >= n and b an (m, 1) matrix. We want to consider the problem

    Ax ≅ b,

where ≅ stands for the best approximate solution in the least squares sense, i.e. we want to minimize the Euclidean norm of the residual r = Ax - b. In other words, we want to find the vector x which is closest to b in the column space of A.
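As an aside not spelled out in the entry itself, the normal equations mentioned below follow from this minimization by setting the gradient of the squared residual norm to zero:

\[
\|r\|_2^2 = (Ax - b)^{\mathsf T}(Ax - b), \qquad
\nabla_x \|r\|_2^2 = 2A^{\mathsf T}A\,x - 2A^{\mathsf T}b = 0
\;\Longrightarrow\; A^{\mathsf T}A\,x = A^{\mathsf T}b .
\]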
Among the different methods to solve this problem, we mention Normal Equations (sometimes ill-conditioned), QR Decomposition, and, most generally, Singular Value Decomposition. For further reading see, e.g., Golub89, Branham90, Wong92, Press95.
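As an illustration (not part of the original entry), the following Python/NumPy sketch solves a small, hypothetical overdetermined system with all three approaches; on a well-conditioned problem the results agree, whereas the normal equations square the condition number of A and can lose accuracy.

```python
import numpy as np

# Hypothetical overdetermined system (m = 5 equations, n = 2 unknowns).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0],
              [1.0, 5.0]])
b = np.array([1.1, 1.9, 3.2, 3.9, 5.1])

# 1) Normal equations: solve (A^T A) x = A^T b.
#    Simple, but squaring the condition number can make it ill-conditioned.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# 2) QR decomposition: A = QR, then solve R x = Q^T b.
Q, R = np.linalg.qr(A)              # reduced QR: Q is (m, n), R is (n, n)
x_qr = np.linalg.solve(R, Q.T @ b)

# 3) Singular value decomposition (used internally by np.linalg.lstsq);
#    the most general approach, also handles rank-deficient A.
x_svd, residuals, rank, sing_vals = np.linalg.lstsq(A, b, rcond=None)

print(x_normal, x_qr, x_svd)        # all three agree on this problem
```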
Example: Let us consider the problem of finding the closest point (vertex) to measurements on straight lines (e.g. trajectories emanating from a particle collision).
This problem can be described by Ax = b, with the rows of A and the components of b built from the parameters of the measured straight lines.
This is clearly an inconsistent system of linear equations, with more equations than unknowns, a frequently occurring situation in experimental data analysis. The system is, however, not very inconsistent: there is a point that lies "nearly" on all straight lines. The solution can be found with the linear least squares method, e.g. by QR decomposition for solving Ax ≅ b, as sketched below.
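The coefficient matrix of the original example is not reproduced here, so the following sketch uses three hypothetical lines in normal form a_i x + b_i y = c_i that nearly, but not exactly, intersect in one point; each line contributes one row of A and one entry of b, and the vertex is the least-squares solution obtained via QR decomposition.

```python
import numpy as np

# Three hypothetical lines a_i*x + b_i*y = c_i that almost meet in one point
# (stand-ins for measured trajectories; not the matrix from the original text).
A = np.array([[ 1.0, -1.0],    # line 1:  x - y = 0.02
              [ 1.0,  1.0],    # line 2:  x + y = 2.01
              [ 0.0,  1.0]])   # line 3:      y = 0.98
b = np.array([0.02, 2.01, 0.98])

# Least-squares "vertex" via QR decomposition: A = QR, solve R x = Q^T b.
Q, R = np.linalg.qr(A)
vertex = np.linalg.solve(R, Q.T @ b)

residual = A @ vertex - b      # small but non-zero: the system is inconsistent
print("vertex:", vertex)
print("residual norm:", np.linalg.norm(residual))
```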