
CSE330 (Numerical Methods)

LECTURE 26 – CURVE FITTING

[LINEAR LEAST SQUARE REGRESSION]

INSTRUCTOR: NAHIAN IBN HASAN
LECTURER, DEPARTMENT OF EEE,
BRAC UNIVERSITY, DHAKA, BANGLADESH
Curve Fitting: Overview
❑ Data are often given at discrete values along a continuum, yet estimates may be required at points between those values. We therefore fit curves to such data to obtain intermediate estimates.

❑ Sometimes a simplified version of a complicated function is required. One way to obtain it is to compute values of the function at a number of discrete points along the range of interest, and then derive a simpler function that fits these values.

❑ There are two general approaches to curve fitting, distinguished by the amount of error associated with the data:
i. Regression – used when the data exhibit significant error or noise, so the fitted curve need not pass through every point.
ii. Interpolation – used when the data are known to be precise, so the fitted curve passes exactly through each point.

Nahian Ibn Hasan DEPARTMENT OF EEE, BRAC UNIVERSITY 2


Linear Regression
❑ The simplest example of a least-squares approximation is fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), . . . , (xn, yn). The mathematical expression for the straight line is

y = a0 + a1x + e

❑ where a0 and a1 are coefficients representing the intercept and the slope, respectively, and e is the error, or residual, between the model and the observations.

[Figure: a straight line fitted through the data points, contrasting the approximated value at x = 5 from the fitted curve with the true value at x = 5]



Linear Regression
The equation of the fitted curve can be rewritten as

e = y − a0 − a1x

Thus, the error, or residual, is the discrepancy between the true value of y and the approximate value, a0 + a1x, predicted by the linear equation.

❑ The target is to minimize the error. Hence, we need a criterion for reducing the error (e).
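As a concrete sketch, the residual of each observation against a candidate line can be computed directly. The line y = 1 + 2x and the data points here are illustrative, not taken from the lecture:

```python
# Sketch: computing residuals e_i = y_i - (a0 + a1*x_i) for a hypothetical
# candidate line y = 1 + 2x. Data values are illustrative.
def residuals(x, y, a0, a1):
    """Return the residual e = y - (a0 + a1*x) for each data point."""
    return [yi - (a0 + a1 * xi) for xi, yi in zip(x, y)]

x = [0.0, 1.0, 2.0]
y = [1.2, 2.9, 5.1]
e = residuals(x, y, a0=1.0, a1=2.0)
print(e)  # each entry is the discrepancy between observed and predicted y
```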



Linear Regression: Best Fit Criteria
3. Criterion 3:
“Minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model”:

Sr = Σ ei² = Σ (yi − a0 − a1xi)²,  i = 1, …, n

This criterion has a number of advantages, including the fact that it yields a unique line for a given set of data with the minimized error.

Among the three criteria, the third is the best. Hence, the method is named Least Squares Regression.
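This criterion can be sketched as a short function. The two candidate lines and the data points below are illustrative; note that the better-fitting line produces a smaller Sr:

```python
# Minimal sketch of criterion 3: the sum of squared residuals
# Sr = sum_i (y_i - a0 - a1*x_i)**2 for a candidate line y = a0 + a1*x.
def sum_squared_residuals(x, y, a0, a1):
    return sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))

x = [1.0, 2.0, 3.0]
y = [2.0, 4.1, 5.9]
sr_good = sum_squared_residuals(x, y, a0=0.0, a1=2.0)  # close to the data
sr_bad = sum_squared_residuals(x, y, a0=0.0, a1=1.0)   # poor slope
print(sr_good, sr_bad)  # the better line yields a much smaller Sr
```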



Linear Regression: Determine a0 and a1
❑ We need to minimize the squared-error function

Sr = Σ (yi − a0 − a1xi)²

To determine values for a0 and a1, Sr is differentiated with respect to each coefficient:

∂Sr/∂a0 = −2 Σ (yi − a0 − a1xi)
∂Sr/∂a1 = −2 Σ [(yi − a0 − a1xi) xi]

Note that we have simplified the summation symbols; unless otherwise indicated, all summations run from i = 1 to n.
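These two partial derivatives translate directly into code. As a check, for points lying exactly on a line, evaluating the derivatives at that line's coefficients should give zero (the data here are illustrative):

```python
# Sketch: the two partial derivatives of Sr written out as code,
#   dSr/da0 = -2 * sum(y_i - a0 - a1*x_i)
#   dSr/da1 = -2 * sum((y_i - a0 - a1*x_i) * x_i)
# At the least-squares optimum both must vanish.
def gradient(x, y, a0, a1):
    d0 = -2.0 * sum(yi - a0 - a1 * xi for xi, yi in zip(x, y))
    d1 = -2.0 * sum((yi - a0 - a1 * xi) * xi for xi, yi in zip(x, y))
    return d0, d1

# Points lying exactly on y = 1 + 2x: the line (a0=1, a1=2) zeroes both.
d0, d1 = gradient([0.0, 1.0, 2.0], [1.0, 3.0, 5.0], a0=1.0, a1=2.0)
print(d0, d1)
```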



Linear Regression: Determine a0 and a1
Setting these derivatives equal to zero yields a minimum of Sr. Doing so, the equations can be expressed as

0 = Σ yi − Σ a0 − Σ a1xi
0 = Σ xiyi − Σ a0xi − Σ a1xi²

Now, realizing that Σ a0 = na0, we can express the equations as a set of two simultaneous linear equations with two unknowns (a0 and a1):

na0 + (Σ xi) a1 = Σ yi
(Σ xi) a0 + (Σ xi²) a1 = Σ xiyi
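The 2×2 system of normal equations can be solved directly, for example with Cramer's rule. As a sanity check, points lying exactly on y = 1 + 2x (illustrative data) should recover those coefficients exactly:

```python
# Sketch: solving the two normal equations
#   [ n       sum(x)   ] [a0]   [ sum(y)  ]
#   [ sum(x)  sum(x^2) ] [a1] = [ sum(xy) ]
# with Cramer's rule on the 2x2 system.
def solve_normal_equations(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx          # determinant of the coefficient matrix
    a0 = (sy * sxx - sx * sxy) / det
    a1 = (n * sxy - sx * sy) / det
    return a0, a1

a0, a1 = solve_normal_equations([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
print(a0, a1)  # recovers the exact line y = 1 + 2x
```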



Linear Regression: Determine a0 and a1
Solving this system of two equations gives the values of a0 and a1 that minimize the regression error:

a1 = (n Σ xiyi − Σ xi Σ yi) / (n Σ xi² − (Σ xi)²)
a0 = ȳ − a1x̄

where ȳ and x̄ are the means of y and x, respectively.

❑ The standard error of the regression line can be determined as

sy/x = √( Sr / (n − 2) )

where sy/x is called the standard error of the estimate.
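The closed-form formulas above can be sketched in a few lines. A perfect line (illustrative data on y = 2x) should give a zero standard error of the estimate:

```python
import math

# Sketch of the slide's closed-form least-squares formulas:
#   a1 = (n*sum(xy) - sum(x)*sum(y)) / (n*sum(x^2) - (sum(x))^2)
#   a0 = ybar - a1*xbar
# and the standard error of the estimate s_yx = sqrt(Sr / (n - 2)).
def linear_fit_with_error(x, y):
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sum(x) * sum(y)) / (n * sxx - sum(x) ** 2)
    a0 = ybar - a1 * xbar
    sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
    syx = math.sqrt(sr / (n - 2))    # standard error of the estimate
    return a0, a1, syx

a0, a1, syx = linear_fit_with_error([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
print(a0, a1, syx)  # a perfect line y = 2x gives syx = 0
```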



Linear Regression

[Figure: the same data fitted with the mean line (left) versus the regression line (right)]



Linear Regression: Example
❑Example: Fit a straight line to the x and y values in the Table.
❑Solution:
The following quantities are computed first to determine a0 and a1:

Next, we calculate the values of a0 and a1 using the regression formulae:



Linear Regression: Example
Therefore, the least-squares fit is

The squared error for each of the data points, and their total (the least squared error), are tabulated below:
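The example's workflow (fit the line, then tabulate the per-point squared errors) can be sketched as below. The data values here are illustrative, since the lecture's table is not reproduced in this text:

```python
# Sketch of the worked example's workflow with illustrative data:
# fit the least-squares line, then tabulate the squared error
# (y_i - a0 - a1*x_i)^2 for each data point.
def fit_and_tabulate(x, y):
    n = len(x)
    a1 = (n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)) / \
         (n * sum(xi * xi for xi in x) - sum(x) ** 2)
    a0 = sum(y) / n - a1 * sum(x) / n
    rows = [(xi, yi, (yi - a0 - a1 * xi) ** 2) for xi, yi in zip(x, y)]
    return a0, a1, rows

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.2, 3.8, 5.1]
a0, a1, rows = fit_and_tabulate(x, y)
for xi, yi, sq in rows:
    print(f"x={xi:.1f}  y={yi:.1f}  squared error={sq:.4f}")
```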



Linearization of Non-Linear Relationships

Several common non-linear models can be transformed into linear form so that linear least-squares regression applies:

❑ Saturation-growth-rate model y = a3x/(β3 + x): inverting both sides gives
1/y = 1/a3 + (β3/a3)(1/x)

❑ Exponential model y = a1e^(β1x): taking the natural logarithm gives
lny = lna1 + β1x

❑ Power model y = a2x^β2: taking the base-10 logarithm gives
logy = loga2 + β2 logx
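As an example of the technique, the exponential model can be fitted by regressing ln y on x and then back-transforming the intercept. The data below are synthetic, generated from a1 = 2 and β1 = 0.5 without noise, so the fit should recover those values:

```python
import math

# Sketch: linearizing the exponential model y = a1 * exp(b1 * x) via
# ln y = ln a1 + b1 * x, then applying ordinary linear least squares
# to the pairs (x, ln y). Synthetic, noise-free data.
def fit_exponential(x, y):
    z = [math.log(yi) for yi in y]          # transform: z = ln y
    n = len(x)
    b1 = (n * sum(xi * zi for xi, zi in zip(x, z)) - sum(x) * sum(z)) / \
         (n * sum(xi * xi for xi in x) - sum(x) ** 2)
    lna1 = sum(z) / n - b1 * sum(x) / n     # intercept of the linearized fit
    return math.exp(lna1), b1               # back-transform: a1 = exp(ln a1)

x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 * math.exp(0.5 * xi) for xi in x]  # exact model values
a1_fit, b1_fit = fit_exponential(x, y)
print(a1_fit, b1_fit)  # recovers roughly a1 = 2, b1 = 0.5
```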



Conclusion
❑ In this lesson we have
➢ learned the difference between regression and interpolation,
➢ learned about least-squares linear regression,
➢ learned how to fit a regression line to linearly related data points, and
➢ seen how to linearize a non-linear problem.

❑Please try out the MATLAB code for Least Square Linear Regression.

In the next lesson, we will learn about Polynomial Regression and Multiple Linear Regression.



Thank You

