Lecture 26 Least Squares Regression
❑ Sometimes a simplified version of a complicated function is required. One way to do this is to compute values of the function at a number of discrete points along the range of interest. A simpler function may then be derived to fit these values.
❑ There are two general approaches for curve fitting that are distinguished from each other on the basis of the amount of error associated with the data:
i. Regression
ii. Interpolation
Regression is used when the data exhibit a significant degree of error or scatter, whereas interpolation is used when the data are known precisely and the curve must pass exactly through each point.
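❑ The simplest regression example is fitting a straight line to a set of paired observations (x1, y1), (x2, y2), . . . , (xn, yn). The mathematical expression for the straight line is
y = a0 + a1x + e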
❑where a0 and a1 are coefficients representing the intercept and the slope, respectively, and e is
the error, or residual, between the model and the observations.
(Figure: data points and the fitted straight line, showing the approximated value at x = 5 given by the fitted curve.)
Thus, the error, or residual, is the discrepancy between the true value of y and the approximate
value, a0 + a1x, predicted by the linear equation.
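Rearranging the straight-line model to solve for the error gives
e = y − a0 − a1x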
❑ The target is to minimize the error. Hence, we need a criterion for judging how well a candidate line fits the data. Several criteria are possible (for example, minimizing the sum of the residuals or the sum of their absolute values), but the best of these is to minimize the sum of the squares of the residuals. This criterion has a number of advantages, including the fact that it yields a unique line for a given set of data. Because the sum of the squared errors is the quantity being minimized, the method is named Least Squares Regression.
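Written out, the quantity to be minimized is the sum of the squares of the residuals between the measured y values and those computed with the linear model:
Sr = Σ ei² = Σ (yi − a0 − a1xi)²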
To determine values for a0 and a1, Sr is differentiated with respect to each coefficient:
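∂Sr/∂a0 = −2 Σ (yi − a0 − a1xi)
∂Sr/∂a1 = −2 Σ [(yi − a0 − a1xi) xi]
Setting these derivatives equal to zero results in a minimum Sr:
0 = Σ yi − Σ a0 − Σ a1xi
0 = Σ xiyi − Σ a0xi − Σ a1xi²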
Note that we have simplified the summation symbols; unless otherwise indicated, all
summations are from i = 1 to n.
Now, realizing that Σ a0 = na0, we can express the equations as a set of two simultaneous linear equations with two unknowns (a0 and a1):
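n a0 + (Σ xi) a1 = Σ yi
(Σ xi) a0 + (Σ xi²) a1 = Σ xiyi
These are called the normal equations, and they can be solved simultaneously for a0 and a1.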
Next, we calculate the values of a0 and a1 using the regression formulae:
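a1 = (n Σ xiyi − Σ xi Σ yi) / (n Σ xi² − (Σ xi)²)
a0 = ȳ − a1x̄
where x̄ and ȳ are the means of the x and y values, respectively.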
The squared errors for each of the data points are illustrated in the following table.
❑ Please try out the MATLAB code for Least Squares Linear Regression.
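The lecture's MATLAB script is not reproduced here, but a minimal sketch of least squares linear regression, assuming the data are stored in vectors x and y (the sample values below are placeholders for illustration), might look like this:
% Least squares linear regression: fit y = a0 + a1*x to the data
x = [1 2 3 4 5 6 7];                 % x data (placeholder values)
y = [0.5 2.5 2.0 4.0 3.5 6.0 5.5];   % y data (placeholder values)
n = length(x);
sx = sum(x); sy = sum(y);
sxy = sum(x.*y); sx2 = sum(x.^2);
a1 = (n*sxy - sx*sy)/(n*sx2 - sx^2);  % slope
a0 = mean(y) - a1*mean(x);            % intercept
e = y - (a0 + a1*x);                  % residuals
Sr = sum(e.^2);                       % sum of squared errors
fprintf('a0 = %.4f, a1 = %.4f, Sr = %.4f\n', a0, a1, Sr)
plot(x, y, 'o', x, a0 + a1*x, '-')    % data points and fitted line
xlabel('x'), ylabel('y'), title('Least squares linear fit')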
In the next lesson, we will learn about Polynomial Regression and Multiple Linear Regression.