CE324 Lecture 7

This document discusses curve fitting using least-squares regression. It describes least-squares regression as a process that derives a curve to best fit a set of data points by minimizing the sum of squared residuals. It specifically addresses linear regression, polynomial regression, and multiple linear regression. Examples are provided for fitting a straight line, a second-order polynomial, and a function of two independent variables.


Engr. Benzyl M. Metran
Licensed Geodetic Engineer
Geodetic Engineering Department
College of Engineering and Architecture
CE324 Module 7

CURVE FITTING
- LEAST-SQUARES REGRESSION
CURVE FITTING
• The method of constructing a curve, or deriving a function, that best “fits” a given
set of data points, subject to constraints, in order to obtain intermediate estimates.

• General approaches for curve fitting:

Least-squares regression
Interpolation
CURVE FITTING

(Figure: side-by-side illustrations of the two general approaches — least-squares regression and interpolation)
CURVE FITTING
• Engineering applications of fitting data points generally fall under:

 Trend analysis
 Hypothesis testing

• Mathematical background:

 Taylor series expansions
 Finite divided differences, and
 The field of statistics.
LEAST-SQUARES REGRESSION
• A process of deriving a single curve that represents the general trend of data
exhibiting a significant degree of error, by following the pattern of the data
points taken as a group.

• Types of least-squares regression

 Linear Regression
 Polynomial Regression
 Multiple Linear Regression
LEAST-SQUARES REGRESSION
• Linear Regression
 Fitting a straight line to a set of data points, predicted by the mathematical
expression

y = a₀ + a₁x

 The coefficients a₀ and a₁ satisfy the normal equations (all sums run from i = 1 to n):

a₀·n + a₁·Σxᵢ = Σyᵢ

a₀·Σxᵢ + a₁·Σxᵢ² = Σxᵢyᵢ
LEAST-SQUARES REGRESSION
• Linear regression coefficient formulas (sums from i = 1 to n):

a₁ = (n·Σxᵢyᵢ − Σxᵢ·Σyᵢ) / (n·Σxᵢ² − (Σxᵢ)²)

a₀ = (Σyᵢ − a₁·Σxᵢ) / n
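The two coefficient formulas translate directly into code. A minimal sketch in Python (the function name `linear_fit` is illustrative, not from the slides):

```python
def linear_fit(x, y):
    """Fit y = a0 + a1*x using the least-squares coefficient formulas."""
    n = len(x)
    sx, sy = sum(x), sum(y)                     # Σxi, Σyi
    sxy = sum(xi * yi for xi, yi in zip(x, y))  # Σxi·yi
    sxx = sum(xi * xi for xi in x)              # Σxi²
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a0 = (sy - a1 * sx) / n
    return a0, a1
```

For data that lie exactly on a line the formulas recover it exactly; e.g. `linear_fit([0, 1, 2], [1, 3, 5])` returns `(1.0, 2.0)`.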
LEAST-SQUARES REGRESSION
• Polynomial Regression
 Fitting a set of data points to a higher-order polynomial function, predicted by the
mathematical expression (where m is the order of the polynomial)

y = a₀ + a₁x + a₂x² + ⋯ + aₘxᵐ

 To determine the coefficients a₀, a₁, a₂, ⋯, aₘ, solve the system of m + 1 linear
equations simultaneously.
LEAST-SQUARES REGRESSION
• Suppose a second-order polynomial with the mathematical expression

y = a₀ + a₁x + a₂x²

The normal equations become (sums from i = 1 to n):

a₀·n + a₁·Σxᵢ + a₂·Σxᵢ² = Σyᵢ

a₀·Σxᵢ + a₁·Σxᵢ² + a₂·Σxᵢ³ = Σxᵢyᵢ

a₀·Σxᵢ² + a₁·Σxᵢ³ + a₂·Σxᵢ⁴ = Σxᵢ²yᵢ
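The 3×3 system can be assembled from the power sums and handed to a linear solver. A sketch using NumPy (the function name `quadratic_fit` is assumed for illustration):

```python
import numpy as np

def quadratic_fit(x, y):
    """Fit y = a0 + a1*x + a2*x^2 by solving the 3x3 normal equations."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    S = [np.sum(x ** k) for k in range(5)]   # S[0] = n, S[k] = Σxi^k
    A = np.array([[S[0], S[1], S[2]],
                  [S[1], S[2], S[3]],
                  [S[2], S[3], S[4]]])
    b = np.array([np.sum(y), np.sum(x * y), np.sum(x ** 2 * y)])
    return np.linalg.solve(A, b)             # [a0, a1, a2]
```

The same pattern generalizes to order m: the matrix entries are Σxᵢ^(j+k) for j, k = 0, …, m.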
LEAST-SQUARES REGRESSION
• Multiple Linear Regression

 Fitting a set of data points to a linear function of two or more independent variables,
predicted by the mathematical expression

y = a₀ + a₁x₁ + a₂x₂ + ⋯ + aₘxₘ

 To determine the coefficients a₀, a₁, a₂, ⋯, aₘ, solve the system of m + 1 linear
equations simultaneously.
LEAST-SQUARES REGRESSION
• Suppose a linear function of two variables with the mathematical expression

y = a₀ + a₁x₁ + a₂x₂

The normal equations become (sums from i = 1 to n):

a₀·n + a₁·Σx₁ᵢ + a₂·Σx₂ᵢ = Σyᵢ

a₀·Σx₁ᵢ + a₁·Σx₁ᵢ² + a₂·Σx₁ᵢx₂ᵢ = Σx₁ᵢyᵢ

a₀·Σx₂ᵢ + a₁·Σx₁ᵢx₂ᵢ + a₂·Σx₂ᵢ² = Σx₂ᵢyᵢ
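The same normal-equation pattern for two independent variables, sketched with NumPy (function name assumed for illustration):

```python
import numpy as np

def two_var_fit(x1, x2, y):
    """Fit y = a0 + a1*x1 + a2*x2 by solving the 3x3 normal equations."""
    x1, x2, y = (np.asarray(v, dtype=float) for v in (x1, x2, y))
    n = len(y)
    A = np.array([[n,         x1.sum(),        x2.sum()],
                  [x1.sum(),  (x1 ** 2).sum(), (x1 * x2).sum()],
                  [x2.sum(),  (x1 * x2).sum(), (x2 ** 2).sum()]])
    b = np.array([y.sum(), (x1 * y).sum(), (x2 * y).sum()])
    return np.linalg.solve(A, b)   # [a0, a1, a2]
```

Note the coefficient matrix is symmetric, as in the polynomial case.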
LEAST-SQUARES REGRESSION
• EXAMPLE 1: Fit a straight line to the points in the given table
x 1 2 3 4 5 6 7
y 0.5 2.5 2 4 3.5 6 5.5
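Plugging the table into the coefficient formulas above (a sketch; the numeric results follow from the data, not from the slides):

```python
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2, 4, 3.5, 6, 5.5]
n = len(x)
sx, sy = sum(x), sum(y)                          # 28, 24.0
sxy = sum(a * b for a, b in zip(x, y))           # 119.5
sxx = sum(a * a for a in x)                      # 140
a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)   # 164.5 / 196
a0 = (sy - a1 * sx) / n
```

which gives the fitted line y ≈ 0.0714 + 0.8393x.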
LEAST-SQUARES REGRESSION
• EXAMPLE 2: Fit a second-order polynomial to the points in the given table
x 0 1 2 3 4 5
y 2.1 7.7 13.6 27.2 40.9 61.1
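The three normal equations for this data set can be solved by hand; `numpy.polyfit` performs an equivalent least-squares fit and is a quick check (note it returns the highest-order coefficient first):

```python
import numpy as np

x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])
a2, a1, a0 = np.polyfit(x, y, 2)   # coefficients, highest order first
```

yielding approximately y ≈ 2.4786 + 2.3593x + 1.8607x².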
LEAST-SQUARES REGRESSION
• EXAMPLE 3: Fit a function of two independent variables to the points in the
given table:
x₁ 0 2 2.5 1 4 7
x₂ 0 1 2 3 6 2
y  5 10 9 0 3 27
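One way to check a hand solution of the three normal equations is a least-squares solve on the design matrix [1, x₁, x₂] (a sketch using NumPy):

```python
import numpy as np

x1 = np.array([0, 2, 2.5, 1, 4, 7], dtype=float)
x2 = np.array([0, 1, 2, 3, 6, 2], dtype=float)
y = np.array([5, 10, 9, 0, 3, 27], dtype=float)
Z = np.column_stack([np.ones_like(x1), x1, x2])  # design matrix [1, x1, x2]
a, *_ = np.linalg.lstsq(Z, y, rcond=None)        # minimizes ||Z·a - y||^2
```

For this particular table the fit turns out to be exact: a = [5, 4, −3], i.e. y = 5 + 4x₁ − 3x₂ passes through every data point.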
THANK YOU
