
THE UNIVERSITY OF JORDAN

School of Engineering
Department of Civil Engineering
(0951301): Numerical Methods
Dr. Ramia Al-Ajarmeh

Curve Fitting
Least-Squares Regression
Part 1
Least-Squares Regression
Refer to the Textbook, Chapter 17

• Linear Regression
• Polynomial Regression
• Multiple Linear Regression
• Non-linear Regression
Linear Least-Squares Regression
The simplest example of a least-squares approximation is fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), . . . , (xn, yn).

The mathematical expression for the straight line is:


y = a0 + a1 x + e

e = y – (a0 + a1x)

• a0 and a1 are coefficients representing the intercept and the slope, respectively.
• e is the error, or residual, between the model and the observations.
• y is the true value, and a0 + a1x is the approximate value predicted by the linear equation.
Linear Least-Squares Regression

ytrue = yapproximate + e

A perfect line fit makes every residual zero, and hence:

∑(ytrue − yapproximate) = 0

Note:
ytrue ≡ actual ≡ observed ≡ measured
yapproximate ≡ predicted ≡ assumed ≡ calculated ≡ modeled
Linear Least-Squares Regression

Criteria for a “Best” Fit:

(1) One strategy for fitting a “best” line through the data would be to minimize the sum of the residual errors for all the available data, as in:

∑ei = ∑(yi − a0 − a1xi), summed over i = 1 to n

where n = total number of points. However, this is an inadequate criterion, as illustrated by the fit of a straight line to two points: positive and negative errors cancel, so any line passing through the midpoint of the segment joining the two points gives a sum of exactly zero.
Linear Least-Squares Regression

Criteria for a “Best” Fit:

(2) A second possible criterion is to minimize the sum of the absolute values of the discrepancies, as in:

∑|ei| = ∑|yi − a0 − a1xi|

This criterion is also inadequate: for the four points in the textbook’s illustration, any straight line falling within the dashed band minimizes the sum of the absolute values. Thus, this criterion also does not yield a unique best fit.
Linear Least-Squares Regression

Criteria for a “Best” Fit:

(3) A strategy that overcomes the shortcomings of the aforementioned approaches is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

Sr = ∑ei² = ∑(yi − a0 − a1xi)²

This criterion has a number of advantages, including the fact that it yields a unique line for a given set of data.
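As a quick illustration (not from the slides), a minimal Python sketch of this criterion; the helper name residual_sum_of_squares is illustrative:

# Minimal sketch: evaluate the least-squares criterion Sr for a trial line.
def residual_sum_of_squares(x, y, a0, a1):
    """Sr = sum of squared residuals for the trial line y = a0 + a1*x."""
    return sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))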
Linear Least-Squares Regression

Least-Squares Fit of a Straight Line:

(1) To determine values for a0 and a1, the sum of squares Sr is differentiated with respect to each coefficient:

∂Sr/∂a0 = −2∑(yi − a0 − a1xi)
∂Sr/∂a1 = −2∑[(yi − a0 − a1xi)xi]
Linear Least-Squares Regression

Least-Squares Fit of a Straight Line:

(2) Setting these derivatives equal to zero results in a minimum Sr. If this is done, the equations can be expressed as:

0 = ∑yi − ∑a0 − ∑a1xi
0 = ∑xiyi − ∑a0xi − ∑a1xi²

(3) Noting that ∑a0 = na0, we can express these as a set of two simultaneous linear equations with two unknowns (a0 and a1):

na0 + (∑xi)a1 = ∑yi
(∑xi)a0 + (∑xi²)a1 = ∑xiyi

These are called the normal equations.
Linear Least-Squares Regression

Least-Squares Fit of a Straight Line:

(4) The normal equations can be solved simultaneously to yield:

a1 = (n∑xiyi − ∑xi∑yi) / (n∑xi² − (∑xi)²)

a0 = ȳ − a1x̄

where ȳ and x̄ are the means of y and x, respectively.
Linear Least-Squares Regression

Example: fit a straight line to the x and y values given below:

xi     yi
1      0.5
2      2.5
3      2.0
4      4.0
5      3.5
6      6.0
7      5.5
Linear Least-Squares Regression
Example Solution:

With n = 7, ∑xi = 28, ∑yi = 24, ∑xiyi = 119.5, and ∑xi² = 140 (so x̄ = 4 and ȳ = 3.428571), the least-squares fit is:

y = 0.07142857 + 0.8392857x

Comments:
• Any line other than the one computed in the above example will result in a larger sum of the squares of the residuals.
• The line produced is therefore the “best” fit.
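As a cross-check (not part of the original slides), the same fit can be reproduced with numpy.polyfit, which returns the coefficients from the highest degree down:

import numpy as np

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

a1, a0 = np.polyfit(x, y, 1)  # degree-1 fit: returns [slope, intercept]
print(a0, a1)                 # approximately 0.07142857 and 0.8392857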
Linear Least-Squares Regression

Example Solution:

Computations for an error analysis of the linear fit.


xi     yi      (yi − ȳ)²    (yi − a0 − a1xi)²
1      0.5     8.5765       0.1687
2      2.5     0.8622       0.5625
3      2.0     2.0408       0.3473
4      4.0     0.3265       0.3265
5      3.5     0.0051       0.5896
6      6.0     6.6122       0.7972
7      5.5     4.2908       0.1993
∑      28      24           22.7143      2.9911
Linear Least-Squares Regression

A “standard deviation” for the regression line can be determined as:

sy/x = √(Sr / (n − 2))

where n − 2 is the number of degrees of freedom and sy/x is called the standard error of the estimate.

sy/x quantifies the spread of the data around the regression line.


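Plugging in the sums from the error-analysis table above (a short sketch, assuming those tabulated values):

import math

sr = 2.9911                      # sum of squared residuals from the table
n = 7                            # number of data points
s_yx = math.sqrt(sr / (n - 2))   # standard error of the estimate
print(s_yx)                      # approximately 0.7735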
Linear Least-Squares Regression

Coefficient of determination (r2) and the Correlation Coefficient (r)

St = ∑(yi − ȳ)²    Sr = ∑(yi − a0 − a1xi)²

r² = (St − Sr) / St,  r = √r²

0 ≤ r² ≤ 1
−1 ≤ r ≤ 1
Linear Least-Squares Regression

Previous Example: from the error-analysis table, St = 22.7143 and Sr = 2.9911, so r² = (22.7143 − 2.9911)/22.7143 = 0.868 and r = 0.932. The linear model therefore explains about 86.8% of the original uncertainty in the data.
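A short sketch of the same computation using the tabulated sums:

st = 22.7143          # total sum of squares, from the table
sr = 2.9911           # residual sum of squares, from the table
r2 = (st - sr) / st   # coefficient of determination
r = r2 ** 0.5         # correlation coefficient
print(r2, r)          # approximately 0.868 and 0.932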
