Curve Fitting - Regression

Numerical analysis

Uploaded by syed shabab alam

10/16/2024

Curve Fitting
Objective
• Students will be able to fit curves to given
data
• Students will learn to assess the reliability of
the answers and be capable of choosing the
preferred method for a particular problem

Curve Fitting
Application
• Trend Analysis
o Use the pattern of the data to predict the values
of the dependent variables. This can be
extrapolation beyond the limits of the observed
data or interpolation within the range of the data
• Hypothesis Testing
o Computation of model coefficients, if unknown,
that best fit the observed data
o Comparison of an existing mathematical model
with measured data to test the adequacy of the
model


Curve Fitting
General Approaches

Curve Fitting
• Least-Squares Regression
o Linear Regression
o Polynomial Regression
• Interpolation
o Newton Polynomial
o Lagrange Polynomial

Curve Fitting
Least-Squares Regression & Interpolation
• Least-squares regression
o Used where the data exhibits a significant degree of error
o A single curve is derived to represent the general trend
of the data
o The curve need not intersect every point, as any
individual point may be incorrect. Rather, the curve is
designed to follow the pattern of the points as a group
• Interpolation
o Used where the data is very precise
o Fit a curve or series of curves that pass directly through
each of the points


Curve Fitting
Least-Squares Regression & Interpolation
• Least-squares regression – Characterizes the
general upward trend of the data with a single
line

• Linear interpolation – Connects the data points
consecutively by straight-line segments

• Polynomial interpolation – Connects the data
points using simple curves

Least-Squares Regression
Linear Regression
• Simplest form of least-squares approximation:
fitting a straight line to a given data set
[(x1,y1), (x2,y2), …, (xn,yn)]
• Mathematical expression for the straight line
is y = a0 + a1x + e
Here a0 and a1 are coefficients, and e is the
residual error
• e = y − a0 − a1x (the discrepancy between the
true value of y and the approximate value
predicted by the linear equation)


Least-Squares Regression
Linear Regression
• Criteria for best fit
o One strategy is to minimize the sum of the squares
of the residuals between the measured y and the y
calculated with the linear model:

Sr = Σ ei² = Σ (yi − a0 − a1xi)²,  i = 1, …, n

o This criterion yields a unique best-fit line.

Least-Squares Regression
Linear Regression
• Estimating the best-fit parameters
o Using the criterion from the previous slide, the
coefficients a0 and a1 that minimize the sum of the
squares of the residual errors are

a1 = (n Σxiyi − Σxi Σyi) / (n Σxi² − (Σxi)²)
and
a0 = ȳ − a1x̄

Here ȳ and x̄ are the means of y and x.
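The two formulas above translate directly into code. A minimal Python sketch (the function name `linear_fit` is illustrative, not from the slides):

```python
def linear_fit(x, y):
    """Least-squares straight line y = a0 + a1*x.

    Implements the closed-form solution from the slide:
        a1 = (n*Sxy - Sx*Sy) / (n*Sxx - Sx**2),  a0 = ybar - a1*xbar
    """
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a0 = sy / n - a1 * sx / n  # a0 = ybar - a1*xbar
    return a0, a1

# Sanity check: points lying exactly on y = 1 + 2x recover a0 = 1, a1 = 2
a0, a1 = linear_fit([0, 1, 2, 3], [1, 3, 5, 7])
```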


Least-Squares Regression
Linear Regression
• Quantification of error
o The residual in linear regression represents the
vertical distance between a data point and the
estimated straight line
o Standard error of the estimate:
sy/x = √(Sr / (n − 2))

Least-Squares Regression
Linear Regression
• Quantification of error

(a) Spread of the data around the mean of the dependent
variable – sum of the squares of the residuals between the
data points and the mean: St = Σ(yi − ȳ)²
(b) Spread of the data around the best-fit line – sum of the
squares of the residuals between the data points and the
best-fit line: Sr = Σ(yi − a0 − a1xi)²

Least-Squares Regression
Linear Regression
• Quantification of error
o St − Sr quantifies the improvement or error
reduction due to describing the data in terms of a straight
line rather than as an average value.
o r² = (St − Sr) / St represents the fraction of the
original uncertainty explained by the model
(coefficient of determination), and r is the correlation
coefficient.
▪ For a perfect fit, Sr = 0 and r² = 1; i.e. the line explains 100%
of the variability of the data
▪ If Sr = St, r² = 0; the fit represents no improvement
▪ If r² < 0, the model is worse than simply picking the mean
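The error measures above can be sketched in a few lines of Python. The function name `fit_stats` and the sample data are illustrative; the coefficients passed in (a0 = −0.4, a1 = 1.6) are the least-squares fit for that data, computed with the formulas from the earlier slide:

```python
from math import sqrt

def fit_stats(x, y, a0, a1):
    """Error measures for a fitted line y = a0 + a1*x:
    St (spread about the mean), Sr (spread about the line),
    s_yx = sqrt(Sr/(n-2)) (standard error of the estimate),
    and r2 = (St - Sr)/St (coefficient of determination)."""
    n = len(y)
    ybar = sum(y) / n
    St = sum((yi - ybar) ** 2 for yi in y)
    Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
    s_yx = sqrt(Sr / (n - 2))
    r2 = (St - Sr) / St
    return St, Sr, s_yx, r2

# Example: for this data the best-fit line is y = -0.4 + 1.6x
St, Sr, s_yx, r2 = fit_stats([0, 1, 2, 3], [0, 1, 2, 5], -0.4, 1.6)
# St = 14.0, Sr = 1.2, r2 = 12.8/14 ≈ 0.914
```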

Least-Squares Regression
Linear Regression
• P#1 – Fit a straight line to the x and y values in
the following table
xi    yi
1     0.5
2     2.5
3     2.0
4     4.0
5     3.5
6     6.0
7     5.5

• P#2 – Calculate the total standard deviation, the
standard error of the estimate, and the correlation
coefficient for the data in P#1
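A sketch of the solution path for P#1 and P#2, applying the formulas from the preceding slides (the rounded values in the comments are my own computation, not taken from the slides):

```python
from math import sqrt

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
n = len(x)

# P#1: best-fit coefficients from the least-squares formulas
sx, sy_sum = sum(x), sum(y)
sxy = sum(a * b for a, b in zip(x, y))
sxx = sum(a * a for a in x)
a1 = (n * sxy - sx * sy_sum) / (n * sxx - sx ** 2)  # ~0.8393
a0 = sy_sum / n - a1 * sx / n                       # ~0.0714

# P#2: error measures
ybar = sy_sum / n
St = sum((yi - ybar) ** 2 for yi in y)
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
s_y = sqrt(St / (n - 1))   # total standard deviation, ~1.9457
s_yx = sqrt(Sr / (n - 2))  # standard error of the estimate, ~0.7735
r = sqrt((St - Sr) / St)   # correlation coefficient, ~0.932
```

Since s_yx < s_y, the linear model improves on simply using the mean, and r² ≈ 0.868 says the line explains roughly 87% of the variability in the data.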


Least-Squares Regression
Linear Regression
• Nonlinear Relationships
o Linear regression is predicated on the assumption that
the relationship between the dependent and independent
variables is linear – that is not always true
o The first step in any regression analysis should be to
plot and visually inspect the data to ascertain whether a
linear model applies
o If the data shows a curvilinear pattern, the
following techniques can be applied
▪ Transform the data into a form that is compatible with
linear regression
▪ Polynomial regression

Least-Squares Regression
Linear Regression
• Linearization of Nonlinear Relationships

Model                      Nonlinear           Linearized
Exponential:               y = a·e^(bx)        ln y = ln a + b·x
Power:                     y = a·x^b           log y = log a + b·log x
Saturation-growth-rate:    y = a·x/(b + x)     1/y = 1/a + (b/a)·(1/x)

Least-Squares Regression
Linear Regression
• Linearization of Nonlinear Relationship
• P#3 – Fit the data in the table below using a logarithmic
transformation.

x    y
1    0.5
2    1.7
3    3.4
4    5.7
5    8.4
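A sketch of the logarithmic-transformation approach for P#3, assuming a power model y = a·x^b (taking log10 of both sides linearizes it, per the table above; the rounded results in the comments are my own computation):

```python
from math import log10

x = [1, 2, 3, 4, 5]
y = [0.5, 1.7, 3.4, 5.7, 8.4]

# Power model y = a*x**b; log10 of both sides gives
# log y = log a + b*log x, a straight line in (log x, log y).
lx = [log10(v) for v in x]
ly = [log10(v) for v in y]

# Ordinary linear least squares on the transformed data
n = len(lx)
sx, sy = sum(lx), sum(ly)
sxy = sum(p * q for p, q in zip(lx, ly))
sxx = sum(p * p for p in lx)
b = (n * sxy - sx * sy) / (n * sxx - sx ** 2)  # slope = exponent b, ~1.75
log_a = sy / n - b * sx / n                    # intercept = log10(a)
a = 10 ** log_a                                # back-transform, ~0.5
```

The fitted model is therefore approximately y = 0.5·x^1.75.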


Least-Squares Regression
Polynomial Regression
• Some engineering data, as in
figure (a), is poorly
represented by a straight
line
• A curve like the one in figure (b)
would be better suited to fit
the data
• The least-squares procedure
can be readily extended to
fit the data with a higher-order
polynomial

Least-Squares Regression
Polynomial Regression
• A second-order polynomial: y = a0 + a1x + a2x² + e
• The sum of the squares of the residuals is
Sr = Σ (yi − a0 − a1xi − a2xi²)²
• The coefficients a0, a1 and a2 can be calculated by
solving a system of three simultaneous linear
equations (the normal equations)


Least-Squares Regression
Polynomial Regression
• The second-order case is easily extended to an mth-order
polynomial: y = a0 + a1x + a2x² + … + amx^m + e
• The sum of the squares of the residuals is
Sr = Σ (yi − a0 − a1xi − … − amxi^m)²
• Determining the coefficients of an mth-order polynomial
is equivalent to solving a system of (m+1) simultaneous
linear equations
• The standard error is
sy/x = √(Sr / (n − (m + 1)))

• P#4 – Fit a second-order polynomial to the data in P#3.
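A sketch of P#4: build the 3×3 normal-equation system for a quadratic and solve it. The function name `polyfit2` and the hand-rolled elimination are illustrative choices to keep the example dependency-free; the rounded coefficients in the comment are my own computation:

```python
def polyfit2(x, y):
    """Fit y = a0 + a1*x + a2*x^2 by solving the 3x3 normal equations
    with Gaussian elimination (partial pivoting)."""
    # Power sums S[k] = sum(x**k) and moment sums T[k] = sum(y * x**k)
    S = [sum(xi ** k for xi in x) for k in range(5)]
    T = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
    # Augmented normal-equation matrix: row i is [S[i], S[i+1], S[i+2] | T[i]]
    A = [[S[i + j] for j in range(3)] + [T[i]] for i in range(3)]
    # Forward elimination
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            A[r] = [arv - f * acv for arv, acv in zip(A[r], A[col])]
    # Back substitution
    a = [0.0] * 3
    for i in (2, 1, 0):
        a[i] = (A[i][3] - sum(A[i][j] * a[j] for j in range(i + 1, 3))) / A[i][i]
    return a  # [a0, a1, a2]

# P#4: quadratic fit to the P#3 data
a0, a1, a2 = polyfit2([1, 2, 3, 4, 5], [0.5, 1.7, 3.4, 5.7, 8.4])
# roughly y = -0.2 + 0.437x + 0.257x^2
```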
