10-Numerical Methods-LS Regression-Part1
School of Engineering
Department of Civil Engineering
(0951301): Numerical Methods
Dr. Ramia Al-Ajarmeh
Curve Fitting
Least-Squares Regression
Part 1
Least-Squares Regression
Refer to the Textbook, Chapter 17
• Linear Regression
• Polynomial Regression
• Multiple Linear Regression
• Non-linear Regression
Linear Least-Squares Regression
The simplest example of a least-squares approximation is fitting a
straight line to a set of paired observations: (x1, y1), (x2, y2), ..., (xn, yn).
e = y – (a0 + a1x)
ytrue = yapproximate + e
Note:
ytrue ≡ actual ≡ observed ≡ measured
yapproximate ≡ predicted ≡ assumed ≡ calculated ≡ modeled
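The residual definition above can be illustrated with a tiny sketch; the line coefficients and data point here are arbitrary placeholders, not values from the course example:

```python
# Residual of one observation against an assumed line y = a0 + a1*x.
# a0, a1 and the data point are arbitrary, for illustration only.
a0, a1 = 1.0, 0.5
x, y_true = 4.0, 3.2          # one measured (observed) point
y_approx = a0 + a1 * x        # predicted (modeled) value
e = y_true - y_approx         # residual, so that y_true = y_approx + e
```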
Linear Least-Squares Regression
(1) One strategy for fitting a "best" line through the data would be to
minimize the sum of the residual errors for all the available data, as in:

min Σ ei = Σ (yi − a0 − a1 xi),  i = 1, ..., n

This criterion is inadequate, because positive and negative errors
cancel: very different lines can yield the same (even zero) sum.
(2) The second possible criterion might be to minimize the sum of the
absolute values of the discrepancies, as in:

min Σ |yi − a0 − a1 xi|

For the four points shown, any straight line falling within the dashed
lines will minimize the sum of the absolute values. Thus, this criterion
also does not yield a unique best fit.
Linear Least-Squares Regression
(3) The least-squares criterion overcomes these shortcomings: minimize
the sum of the squares of the residuals,

Sr = Σ (yi − a0 − a1 xi)²

Setting ∂Sr/∂a0 = 0 and ∂Sr/∂a1 = 0, and noting that Σa0 = na0, we can
express the equations as a set of two simultaneous linear equations
(the normal equations) with two unknowns (a0 and a1):

n a0 + (Σxi) a1 = Σyi
(Σxi) a0 + (Σxi²) a1 = Σxi yi

whose solution is

a1 = (n Σxi yi − Σxi Σyi) / (n Σxi² − (Σxi)²)
a0 = ȳ − a1 x̄

Example: fit a straight line to the following data.
xi   yi
1    0.5
2    2.5
3    2.0
4    4.0
5    3.5
6    6.0
7    5.5
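The normal equations can be applied directly to the tabulated data. A minimal sketch in Python (the function and variable names are my own, not from the textbook):

```python
def linear_least_squares(x, y):
    """Fit y ≈ a0 + a1*x by solving the two normal equations."""
    n = len(x)
    sx = sum(x)
    sy = sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n      # a0 = ybar - a1*xbar
    return a0, a1

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
a0, a1 = linear_least_squares(x, y)
print(a0, a1)   # a0 ≈ 0.0714, a1 ≈ 0.8393
```

So the best-fit line for this data is y ≈ 0.0714 + 0.8393 x.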
Linear Least-Squares Regression
Example Solution:
Comments:
• Any line other than the one computed in the above example will
result in a larger sum of the squares of the residuals, Sr.
• The line produced is therefore the unique "best fit" in the
least-squares sense.
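This claim can be checked numerically. A small sketch using the example data; the least-squares coefficients are those computed above, and the competing line is chosen arbitrarily:

```python
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

def sr(a0, a1):
    """Sum of squared residuals for the line y = a0 + a1*x."""
    return sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))

best = sr(0.0714286, 0.8392857)   # least-squares coefficients for this data
other = sr(0.0, 1.0)              # an arbitrary competing line
# best < other: any other line yields a larger Sr
```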
Linear Least-Squares Regression
Example Solution:
The standard error of the estimate quantifies the spread of the data
about the regression line. Sr has n − 2 degrees of freedom, because two
coefficients (a0 and a1) were estimated from the data:

sy/x = √( Sr / (n − 2) )

St = Σ (yi − ȳ)²    Sr = Σ (yi − a0 − a1 xi)²

The coefficient of determination r² and correlation coefficient r:

r² = (St − Sr) / St,  0 ≤ r² ≤ 1
−1 ≤ r ≤ 1
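Putting these error measures together for the example data, as a sketch (coefficients taken from the fit above; variable names are my own):

```python
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
a0, a1 = 0.0714286, 0.8392857   # least-squares fit from the example

ybar = sum(y) / len(y)
St = sum((yi - ybar) ** 2 for yi in y)                      # spread about the mean
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread about the line
r2 = (St - Sr) / St                                         # coefficient of determination
syx = (Sr / (len(x) - 2)) ** 0.5                            # standard error, n-2 dof
print(St, Sr, r2, syx)   # St ≈ 22.7143, Sr ≈ 2.9911, r2 ≈ 0.868, syx ≈ 0.774
```

A value of r² near 1 indicates the line explains most of the variability in y.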
Linear Least-Squares Regression
Previous Example: