Linear least squares

Linear least squares (LLS) is the least squares approximation of linear functions to data. It is a set of formulations for solving statistical problems involved in linear regression, including variants for ordinary (unweighted), weighted, and generalized (correlated) residuals. Numerical methods for linear least squares include inverting the matrix of the normal equations and orthogonal decomposition methods.

Consider the linear equation

$$A x = b, \qquad (1)$$

where $A \in \mathbb{R}^{m \times n}$ and $b \in \mathbb{R}^{m}$ are given and $x \in \mathbb{R}^{n}$ is the variable to be computed. When $m > n$, it is generally the case that (1) has no solution. For example, there is no value of $x$ that satisfies

$$\begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{bmatrix} x = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix},$$

because the first two rows require that $x = (1, 1)$, but then the third row is not satisfied. Thus, for $m > n$, the goal of solving (1) exactly is typically replaced by finding the value of $x$ that minimizes some error. There are many ways the error can be defined, but one of the most common is $\|Ax - b\|^{2}$. This produces a minimization problem, called a least squares problem:

$$\hat{x} = \underset{x}{\operatorname{argmin}} \, \|Ax - b\|^{2}. \qquad (2)$$
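The inconsistency in the example above can be checked numerically. A minimal sketch using NumPy, with the matrix and right-hand side taken from the example:

```python
import numpy as np

# The 3x2 system from the example: the first two rows force x = (1, 1),
# but then row three reads 1 + 1 = 2, not 0.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

x = np.array([1.0, 1.0])      # the only candidate satisfying rows 1 and 2
residual = A @ x - b          # row-wise errors: third entry is nonzero
error = residual @ residual   # squared Euclidean norm ||Ax - b||^2
```

Since no choice of $x$ drives the residual to zero, minimizing `error` over all $x$ is the natural fallback, which is exactly the least squares problem (2).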

The solution to the least squares problem (2) is computed by solving the normal equation

$$A^{\mathsf{T}} A \hat{x} = A^{\mathsf{T}} b, \qquad (3)$$

where $A^{\mathsf{T}}$ denotes the transpose of $A$.
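As a sketch of this route, one can form the normal equation explicitly with NumPy and hand it to a dense solver; the example system from above is reused here:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Form and solve the normal equation A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
```

Note that forming $A^{\mathsf{T}} A$ explicitly squares the condition number of the problem, which is why the orthogonal decomposition methods mentioned in the introduction are usually preferred in practice.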

Continuing the example above, with

$$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{bmatrix} \quad \text{and} \quad b = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix},$$

we find

$$A^{\mathsf{T}} A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \quad \text{and} \quad A^{\mathsf{T}} b = \begin{bmatrix} 1 \\ 1 \end{bmatrix}.$$

Solving the normal equation (3) then gives $\hat{x} = (1/3, 1/3)$.
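The orthogonal decomposition methods mentioned in the introduction avoid forming $A^{\mathsf{T}} A$ altogether. A sketch using a reduced QR factorization, checked against NumPy's built-in least squares routine:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Orthogonal-decomposition route: A = QR with Q^T Q = I, so the
# normal equation reduces to the triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)               # reduced QR: Q is 3x2, R is 2x2
x_qr = np.linalg.solve(R, Q.T @ b)

# Library routine for comparison; it uses an SVD internally.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Both routes recover the same minimizer as the worked example, without ever squaring the condition number of $A$.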

The three main linear least squares formulations are:

- Ordinary least squares (OLS), in which all residuals are weighted equally;
- Weighted least squares (WLS), in which each residual receives its own weight;
- Generalized least squares (GLS), which allows for correlated residuals.
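To illustrate how the weighted variant changes the computation, a small sketch in which the normal equation becomes $A^{\mathsf{T}} W A \hat{x} = A^{\mathsf{T}} W b$ for a diagonal weight matrix $W$; the weight values here are hypothetical, chosen only to de-emphasize the third row of the running example:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Weighted least squares: minimize (Ax - b)^T W (Ax - b) with a
# diagonal W.  The (hypothetical) weights below trust row 3 least.
w = np.array([1.0, 1.0, 0.5])
W = np.diag(w)
x_wls = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

# With equal weights, WLS reduces to ordinary least squares.
x_ols = np.linalg.solve(A.T @ A, A.T @ b)
```

Down-weighting the inconsistent third row pulls the solution toward satisfying the first two rows, as expected.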

Other formulations include:

- Iteratively reweighted least squares, which solves a sequence of weighted problems;
- Total least squares, which allows for errors in $A$ as well as in $b$.
