
POLYNOMIAL REGRESSION AND INTERACTION

REPORTED BY: JESSE G. DE LEON JR.


SUBMITTED TO: DR. ANNE MENDOZA-TOMAS
MBA 201 1Y1-7 – ADVANCED MANAGERIAL STATISTICS
INTRODUCTION
Polynomial regression is a form of regression analysis that models the relationship
between a dependent variable y and one or more independent variables x by fitting
a polynomial function to the data. It extends simple linear regression by including
higher-order polynomial terms, which can capture nonlinear relationships between
the variables.

The general form of a polynomial regression model is:

y = β0 + β1x + β2x² + β3x³ + ⋯ + βnxⁿ + ϵ

where:
y is the dependent variable.
x is the independent variable.
β0, β1, β2, …, βn are the coefficients.
n is the degree (order) of the polynomial.
ϵ is the error term.
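
As a rough illustration, the MATLAB sketch below shows how such a model can be estimated: each power of x is treated as a separate regressor and the coefficients are found by ordinary least squares. The data values here are assumptions for demonstration only.

% Illustrative data (assumed, for demonstration only)
x = [1 2 3 4 5 6 7 8]';           % independent variable
y = [2 5 9 16 24 35 49 64]';      % dependent variable

% Design matrix for a second-degree polynomial: columns 1, x, x^2
X = [ones(size(x)), x, x.^2];

% Ordinary least-squares estimates of [b0; b1; b2]
beta = X \ y;

% Fitted values and residuals (estimates of the error term)
yhat  = X * beta;
resid = y - yhat;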

DISCUSSION

1. POLYNOMIAL REGRESSION WITH ONE PREDICTOR VARIABLE

Formulation of the Model


A nonlinear relationship between y and x can often be approximately represented
within the general linear model as a polynomial function of x.
Example:

yi = b0 + b1xi + b2xi² + ei

may be represented as a linear model:

yi = b0 + b1xi1 + b2xi2 + ei

with the transformed variables xi1 = xi and xi2 = xi².
• The order of the polynomial function is the highest exponent of x; the model
above is a second-order model.
• To estimate a polynomial function, x is often first deviated from its mean (or
median) to reduce collinearity between x and the higher powers of x. A variable
deviated from its mean is called centered. The transformation is x = X - X̄
(where X̄ denotes the mean of X); x (lower case) represents the centered variable
and X (upper case) the original (uncentered) variable. A short sketch of centering
appears after this list.
A polynomial function can be used when
• the true response function is polynomial
• the true response function is unknown but a polynomial is a good
approximation of its shape
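
A minimal MATLAB sketch of centering (the predictor values are assumed for illustration): for a positive-valued predictor, X and X² are highly correlated, while the centered variable and its square are much less so.

% Illustrative (assumed) predictor values
X = (10:2:30)';                    % original, uncentered variable

% Center the predictor: x = X - mean(X)
x = X - mean(X);

% Correlation between the predictor and its square, before and after centering
R_uncentered = corrcoef(X, X.^2);  % off-diagonal entry is close to 1
R_centered   = corrcoef(x, x.^2);  % off-diagonal entry is near 0 here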
Graphic Representation of the Model
• The response function E{y} for any polynomial model with one predictor
variable can be represented on a 2-dimensional plot of y against x.
• A second-degree polynomial implies a parabolic relationship. The sign of the
coefficient b2 determines the shape (direction of curvature) of the response function:
• when b2 is positive, the parabola opens upward, so y eventually increases as x increases
• when b2 is negative, the parabola opens downward, so y eventually decreases as x increases

as illustrated in the sketch below:
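
(A rough MATLAB illustration; the coefficient values are assumptions chosen only to show the two shapes.)

% Illustrative second-order response functions (coefficients assumed)
x = linspace(-5, 5, 200);
y_up   = 10 + 2*x + 0.8*x.^2;      % b2 > 0: parabola opens upward
y_down = 10 + 2*x - 0.8*x.^2;      % b2 < 0: parabola opens downward

figure
plot(x, y_up, 'b-', x, y_down, 'r--', 'LineWidth', 2)
legend('b_2 > 0 (opens upward)', 'b_2 < 0 (opens downward)')
xlabel('x'), ylabel('E(y)')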

NOTES
• when evaluating the shape of a polynomial response function, it is necessary
to keep within the range of x in the data, as extrapolating beyond this range
may lead to misleading predictions
• it is possible to convert from the coefficients of the centered model (involving
x) to the non-centered model (involving the original X); however, the conversion
is rarely needed for substantive purposes
• with a second-order polynomial the coefficient of x² (x centered) is the same
as that of X² (X uncentered)
• fitting a polynomial regression with powers higher than three is rarely done,
as the interpretation of the coefficients becomes difficult and interpolation
tends to become erratic (a polynomial of order n - 1 can always be fitted
exactly to n points)
• polynomial regression models are often fitted with the hierarchical
approach, in which higher powers are introduced one at a time and tested for
significance; if a term of a high order is included (say, x³), then all terms
of lower order (x and x²) are also included. A rough sketch of this approach
follows these notes.
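
As a rough MATLAB sketch of the hierarchical approach (with made-up data), powers of the centered predictor can be added one at a time and the drop in the residual sum of squares examined; in practice a partial F-test or similar significance test would be used at each step.

% Illustrative data (assumed)
x = [1 2 3 4 5 6 7 8]';
y = [3 4 8 15 26 39 58 80]';
xc = x - mean(x);                    % centered predictor

% Fit polynomials of increasing order and record the residual sum of squares
maxOrder = 3;
SSE = zeros(maxOrder, 1);
for order = 1:maxOrder
    X = ones(length(xc), 1);
    for p = 1:order
        X = [X, xc.^p];              % add the next power of the centered x
    end
    b = X \ y;                       % least-squares coefficients
    SSE(order) = sum((y - X*b).^2);  % residual sum of squares
end

% A higher-order term is kept only if it gives a worthwhile reduction in SSE
% (formally judged by a significance test); if x^3 is kept, x and x^2 stay in.
disp(SSE)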

2. POLYNOMIAL REGRESSION WITH MORE THAN ONE PREDICTOR VARIABLE

Formulation of the Model


A second-order model with two predictors has the general response function
E{Y} = b0 + b1x1 + b2x2 + b11x1² + b22x2² + b12x1x2
where
x1 = X1 - X̄1
x2 = X2 - X̄2
(the x variables are centered, X̄1 and X̄2 being the means of X1 and X2). The
indexing of the coefficients reflects the composition of the corresponding term.
The response function is a quadratic function of x1 and x2. The product term x1x2
represents the interaction of x1 and x2, and the coefficient b12 therefore
represents the effect of the interaction of x1 and x2 on Y.
The response function E{y} of a polynomial regression model of any order with two
predictor variables may be represented in 3-dimensional space with dimensions y,
x1, and x2. The response function defines a surface in 3-dimensional space which
can alternatively be represented
• In perspective as a surface in 3-dimensional space
• By contour curves in 2-dimensional (x1, x2) space representing the
combinations of x1 and x2 that yield the same value of the response y,
similar to level curves in topographical maps
• By conditional effects plots in 2-dimensional (y, x1) space representing plots
of y against x1 for (a few) different values of x2
Example: the model with response function
E{y} = 1,740 - 4x1² - 3x2² - 3x1x2
yields the quadratic response surface sketched below.
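
A MATLAB rendering of this response function, both in perspective and as contour curves (the plotting range for x1 and x2 is an assumption):

% Response function E{y} = 1,740 - 4*x1^2 - 3*x2^2 - 3*x1*x2
[x1, x2] = meshgrid(-10:0.5:10, -10:0.5:10);   % plotting range assumed
Ey = 1740 - 4*x1.^2 - 3*x2.^2 - 3*x1.*x2;

figure
subplot(1, 2, 1)
surf(x1, x2, Ey)                   % perspective view of the surface
xlabel('x_1'), ylabel('x_2'), zlabel('E(y)')
title('Quadratic response surface')

subplot(1, 2, 2)
contour(x1, x2, Ey, 20)            % contour (level) curves
xlabel('x_1'), ylabel('x_2')
title('Contour curves')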

Quadratic Polynomial Regression Model Solved Example in Machine Learning
Regression modeling is the process of determining a relationship between one
or more independent variables and one dependent (output) variable.
Examples:
1. Predicting the price of a car given the car model, year of manufacture,
mileage, and engine capacity.
2. Predicting the height of a person given the person's age.
Solution:
Let the quadratic polynomial regression model be
y = a0 + a1·x + a2·x²
The values of a0, a1, and a2 are calculated from the least-squares normal equations:
Σy = n·a0 + a1·Σx + a2·Σx²
Σxy = a0·Σx + a1·Σx² + a2·Σx³
Σx²y = a0·Σx² + a1·Σx³ + a2·Σx⁴
Given the data, these become:
5a0 + 25a1 + 135a2 = 27.5
25a0 + 135a1 + 775a2 = 158.8
135a0 + 775a1 + 4659a2 = 966.2
Solving this system of equations (for example, by elimination) gives
a0 = 12.4285714
a1 = -5.5128571
a2 = 0.7642857
so the required quadratic polynomial model is
y = 12.4285714 - 5.5128571·x + 0.7642857·x²
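
The same answer can be checked in MATLAB by writing the three normal equations in matrix form and solving with the backslash operator:

% Normal equations in matrix form: A*a = b, where a = [a0; a1; a2]
A = [  5    25   135;
      25   135   775;
     135   775  4659];
b = [27.5; 158.8; 966.2];

a = A \ b    % approximately [12.4286; -5.5129; 0.7643]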
Let's say that we're given these points
x = [1 2 3 4 5.5 7 10]
y = [3 7 9 15 22 21 21]
and want to explore fits of 2nd, 4th, and 5th order. We could use the following
MATLAB code:
clear all, clc, close all, format compact
% define coordinates
x = [1 2 3 4 5.5 7 10];
y = [3 7 9 15 22 21 21];
% plot given data in red
plot(x, y, 'ro', 'linewidth', 2)
hold on
% find polynomial fits of different orders
p2 = polyfit(x, y, 2)
p4 = polyfit(x, y, 4)
p5 = polyfit(x, y, 5)
% define a fine grid on which to evaluate the fits
xc = 1 : .1 : 10;
% plot 2nd order polynomial
y2 = polyval(p2, xc);
plot(xc, y2, 'g.-')
% plot 4th order polynomial
y4 = polyval(p4, xc);
plot(xc, y4, 'linewidth', 2)
% plot 5th order polynomial
y5 = polyval(p5, xc);
plot(xc, y5, 'k.', 'linewidth', 2)
grid
legend('original data', '2nd. order fit', '4th. order', '5th. order')
The coefficients found by MATLAB are:
p2 = -0.3863 6.3983 -4.1596
p4 = 0.0334 -0.7377 5.0158 -8.4137 7.5779
p5 = 0.0213 -0.5022 4.1330 -14.5708 25.3233 -11.3510

which, in other words, represent the following polynomials of 2nd, 4th, and 5th
order, respectively:

y2 = -0.3863x² + 6.3983x - 4.1596
y4 = 0.0334x⁴ - 0.7377x³ + 5.0158x² - 8.4137x + 7.5779
y5 = 0.0213x⁵ - 0.5022x⁴ + 4.1330x³ - 14.5708x² + 25.3233x - 11.3510
3. INTERACTION REGRESSION MODELS
Interaction regression is a type of statistical analysis used to explore how the
relationship between two variables changes when considering the influence of a
third variable. In other words, it examines whether the effect of one independent
variable on a dependent variable depends on the level of another independent
variable.
Here’s a breakdown of how interaction regression works:
1. Basic Concept: In a standard regression model, you might assess how one
independent variable (say, X1) affects a dependent variable (Y). Interaction
regression extends this by examining how this effect might change
depending on the value of another independent variable (X2).
2. Interaction Term: To test for interactions, you include an interaction term in
your regression model. This term is typically the product of the two
interacting variables. For example, if you are examining how X1 and X2
interact, you include an interaction term like X1 ×X2 in your model.
3. Model Structure: The general form of a regression model with an interaction
term looks like this:
Y=β0+β1X1+β2X2+β3(X1×X2)+ϵ
• Y is the dependent variable.
• X1 and X2 are independent variables.
• X1×X2 is the interaction term.
• β0, β1, β2, and β3 are coefficients to be estimated.
• ϵ is the error term.
4. Interpreting Interaction Effects:
• The coefficient of the interaction term, β3, tells you how the relationship
between X1 and Y changes as X2 changes. For instance:
• If β3 is significantly different from zero, it indicates that the effect of X1 on Y
depends on the level of X2.
• A positive β3 suggests that the effect of X1 on Y (which equals β1 + β3X2)
becomes stronger as X2 increases.
5. Visualization
• Interaction effects are often easier to understand visually. You might plot the
relationship between X1 and Y for different levels of X2 to see how the slopes
change (see the sketch after this list).
6. Application
• Interaction regression is used in various fields to explore complex
relationships. For instance, in social sciences, you might study how the effect
of education on income is different for men and women. In medicine, you
could explore how the effectiveness of a treatment depends on the patient's
age and gender.
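
A minimal MATLAB sketch of an interaction model, using simulated data (the data-generating values are assumptions for illustration only): the design matrix contains the product term X1.*X2, and the conditional effect of X1 on Y is plotted for a few values of X2.

% Simulated illustrative data (values assumed for demonstration only)
rng(1)
n  = 200;
X1 = 10 * rand(n, 1);
X2 = 10 * rand(n, 1);
Y  = 2 + 1.5*X1 + 0.8*X2 + 0.4*X1.*X2 + randn(n, 1);

% Design matrix including the interaction (product) term
X = [ones(n, 1), X1, X2, X1.*X2];
beta = X \ Y;                        % [b0; b1; b2; b3] by least squares

% Conditional effects plot: predicted Y against X1 at a few fixed values of X2
x1grid = linspace(0, 10, 50)';
figure, hold on
for x2 = [2 5 8]
    yhat = beta(1) + beta(2)*x1grid + beta(3)*x2 + beta(4)*x1grid*x2;
    plot(x1grid, yhat, 'LineWidth', 2)
end
legend('X_2 = 2', 'X_2 = 5', 'X_2 = 8'), xlabel('X_1'), ylabel('Predicted Y')

The three lines have different slopes precisely because the interaction coefficient b3 is nonzero; with b3 = 0 they would be parallel.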
