
Probability and Statistics

LECTURE 10
SIMPLE LINEAR REGRESSION

10 - 1

Outline

1. Simple Linear Regression Model
2. Least Squares Method
3. Coefficient of Determination
4. Model Assumptions
5. Testing for Significance

10 - 2

Simple Linear Regression

• Managerial decisions often are based on the relationship between two or more variables.

• Regression analysis can be used to develop an equation showing how the variables are related.

• The variable being predicted is called the dependent variable and is denoted by y.

• The variables being used to predict the value of the dependent variable are called the independent variables and are denoted by x.

10 - 3
Simple Linear Regression

• Simple linear regression involves one independent variable and one dependent variable.

• The relationship between the two variables is approximated by a straight line.

• Regression analysis involving two or more independent variables is called multiple regression.

10 - 4

Simple Linear Regression Model

• The equation that describes how y is related to x and an error term is called the regression model.

• The simple linear regression model is:

y = β0 + β1x + ε

where:
β0 and β1 are called parameters of the model,
ε is a random variable called the error term.

10 - 5

Assumptions About the Error Term ε

1. The error ε is a random variable with mean of zero.

2. The variance of ε, denoted by σ², is the same for all values of the independent variable.

3. The values of ε are independent.

4. The error ε is a normally distributed random variable.

10 - 6
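
As a rough illustration of these four assumptions (my own sketch, not part of the original slides), the R code below simulates data from the model y = β0 + β1x + ε with independent, normally distributed errors of mean zero and constant variance; the parameter values are made up for the example.

set.seed(1)
beta0 <- 3; beta1 <- 1.5; sigma <- 2        # illustrative parameter values
x <- runif(50, min = 0, max = 4)            # values of the independent variable
epsilon <- rnorm(50, mean = 0, sd = sigma)  # errors: mean zero, constant variance, independent, normal
y <- beta0 + beta1 * x + epsilon            # simple linear regression model
plot(x, y)                                  # scatter of the simulated data
abline(beta0, beta1)                        # true regression line E(y) = beta0 + beta1*x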
Simple Linear Regression Equation

• The simple linear regression equation is:

E(y) = β0 + β1x

• Graph of the regression equation is a straight line.
• β0 is the y intercept of the regression line.
• β1 is the slope of the regression line.
• E(y) is the expected value of y for a given x value.

10 - 7

Simple Linear Regression Equation

• Positive Linear Relationship

[Graph: regression line with intercept β0 and positive slope β1; E(y) increases with x.]

10 - 8

Simple Linear Regression Equation

• Negative Linear Relationship

[Graph: regression line with intercept β0 and negative slope β1; E(y) decreases with x.]
10 - 9
Simple Linear Regression Equation

• No Relationship

[Graph: horizontal regression line with intercept β0 and slope β1 = 0; E(y) does not change with x.]

10 - 10

Estimated Simple Linear Regression Equation

• The estimated simple linear regression equation is:

ŷ = b0 + b1x

• The graph is called the estimated regression line.
• b0 is the y intercept of the line.
• b1 is the slope of the line.
• ŷ is the estimated value of y for a given x value.

10 - 11

Estimation Process

Regression Model: y = β0 + β1x + ε
Regression Equation: E(y) = β0 + β1x
Unknown Parameters: β0, β1

Sample Data: (x1, y1), ..., (xn, yn)

Sample Statistics: b0, b1
Estimated Regression Equation: ŷ = b0 + b1x

b0 and b1 provide estimates of β0 and β1.
10 - 12
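
Continuing the simulated example from the assumptions slide (again my sketch, not from the lecture), the estimation process can be mimicked in R: generate a sample with known β0 and β1, then compute b0 and b1 from that sample.

set.seed(1)
x <- runif(50, min = 0, max = 4)
y <- 3 + 1.5 * x + rnorm(50, sd = 2)   # sample generated with beta0 = 3, beta1 = 1.5
fit <- lm(y ~ x)                       # least squares estimation from the sample
coef(fit)                              # b0 and b1: sample estimates of beta0 and beta1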
Least Squares Method

Least Squares Criterion: choose b0 and b1 to minimize

Σ(yi - ŷi)²

where:
yi = observed value of the dependent variable for the ith observation
ŷi = estimated value of the dependent variable for the ith observation

10 - 13

Least Squares Method

• Slope for the Estimated Regression Equation

b1 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)²

where:
xi = value of independent variable for the ith observation
yi = value of dependent variable for the ith observation
x̄ = mean value for the independent variable
ȳ = mean value for the dependent variable

10 - 14

Least Squares Method

• y-Intercept for the Estimated Regression Equation

b0 = ȳ - b1x̄

10 - 15
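
A minimal R sketch of these two formulas (the helper name least_squares is mine, not from the slides):

# b1 = sum of (xi - xbar)(yi - ybar) divided by sum of (xi - xbar)^2
# b0 = ybar - b1 * xbar
least_squares <- function(x, y) {
  b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
  b0 <- mean(y) - b1 * mean(x)
  c(b0 = b0, b1 = b1)
}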
Simple Linear Regression

• Example: Reed Auto Sales


Reed Auto periodically has a special week-long sale.
As part of the advertising campaign Reed runs one or
more television commercials during the weekend
preceding the sale. Data from a sample of 5 previous
sales are shown on the next slide.

10 - 16

Simple Linear Regression

• Example: Reed Auto Sales

Number of TV Ads (x)   Number of Cars Sold (y)
1                      14
3                      24
2                      18
1                      17
3                      27
Σx = 10                Σy = 100

Note that in practice, the sample size should be larger than this.

10 - 17

Scatterplot

[Scatterplot of cars sold vs. TV ads with the Reed Auto Sales estimated regression line.]

10 - 18
Estimated Regression Equation

• Slope for the Estimated Regression Equation

b1 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)² = 20/4 = 5

• y-Intercept for the Estimated Regression Equation

b0 = ȳ - b1x̄ = 20 - 5(2) = 10

• Estimated Regression Equation

ŷ = 10 + 5x

10 - 19
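
As a check (my own sketch, not part of the slides), the same arithmetic can be carried out directly in R on the Reed Auto data:

x <- c(1, 3, 2, 1, 3)       # number of TV ads
y <- c(14, 24, 18, 17, 27)  # number of cars sold
b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)  # 20 / 4 = 5
b0 <- mean(y) - b1 * mean(x)                                     # 20 - 5 * 2 = 10
c(b0 = b0, b1 = b1)         # estimated regression equation: y-hat = 10 + 5x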

Coefficient of Determination
Relationship Among SST, SSR, SSE
SST = SSR + SSE

where:
SST = total sum of squares = Σ(yi - ȳ)²
SSR = sum of squares due to regression = Σ(ŷi - ȳ)²
SSE = sum of squares due to error = Σ(yi - ŷi)²

10 - 20

Coefficient of Determination

• The coefficient of determination is:

R2 = SSR/SST

where:
SSR = sum of squares due to regression
SST = total sum of squares

10 - 21
Coefficient of Determination

R2 = SSR/SST = 100/114 = .8772

The regression relationship is very strong; 87.72% of the variability in the number of cars sold can be explained by the linear relationship between the number of TV ads and the number of cars sold.

10 - 22
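
A short R sketch (mine, not from the slides) that reproduces these sums of squares and R² from the Reed Auto data:

x <- c(1, 3, 2, 1, 3); y <- c(14, 24, 18, 17, 27)
yhat <- 10 + 5 * x              # fitted values from the estimated regression equation
SST <- sum((y - mean(y))^2)     # total sum of squares: 114
SSE <- sum((y - yhat)^2)        # sum of squares due to error: 14
SSR <- sum((yhat - mean(y))^2)  # sum of squares due to regression: 100
SSR / SST                       # coefficient of determination: 0.8772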

Sample Correlation Coefficient

rxy = (sign of b1) √R²

where:
b1 = the slope of the estimated regression equation

10 - 23

Sample Correlation Coefficient

The sign of b1 in the equation is “+”, so

rxy = + √.8772 = +.9366

10 - 24
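
This value can be checked in R against the built-in correlation function (a sketch of mine, not from the slides):

x <- c(1, 3, 2, 1, 3); y <- c(14, 24, 18, 17, 27)
sign(5) * sqrt(0.8772)  # sign of b1 times the square root of R^2: +0.9366
cor(x, y)               # sample (Pearson) correlation coefficient, also about 0.9366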
Assumptions About the Error Term ε

1. The error ε is a random variable with mean of zero.

2. The variance of ε, denoted by σ², is the same for all values of the independent variable.

3. The values of ε are independent.

4. The error ε is a normally distributed random variable.

10 - 25

Testing for Significance

To test for a significant regression relationship, we must conduct a hypothesis test to determine whether the value of β1 is zero.

Two tests are commonly used: the t test and the F test.

Both the t test and the F test require an estimate of σ², the variance of ε in the regression model.

10 - 26

Testing for Significance

An Estimate of σ²
The mean square error (MSE) provides the estimate of σ², and the notation s² is also used.

s² = MSE = SSE/(n - 2)

where:
SSE = Σ(yi - ŷi)²
10 - 27
Testing for Significance
An Estimate of σ
• To estimate σ we take the square root of s².
• The resulting s is called the standard error of the estimate.

10 - 28
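
A brief R sketch (mine, not from the slides) computing MSE and the standard error of the estimate for the Reed Auto fit:

x <- c(1, 3, 2, 1, 3); y <- c(14, 24, 18, 17, 27)
yhat <- 10 + 5 * x
SSE <- sum((y - yhat)^2)   # 14
n <- length(y)
MSE <- SSE / (n - 2)       # s^2 = 14/3, about 4.67
sqrt(MSE)                  # standard error of the estimate, about 2.16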

Testing for Significance: t Test

Hypotheses

H0: β1 = 0
Ha: β1 ≠ 0

Test Statistic

t = b1 / s_b1

where s_b1 = s / √Σ(xi - x̄)² is the estimated standard deviation of b1.

10 - 29

Testing for Significance: t Test

• Rejection Rule

Reject H0 if p-value < α
or t < -tα/2 or t > tα/2

where:
tα/2 is based on a t distribution
with n - 2 degrees of freedom

In this example, use α = 0.05.

10 - 30
Testing for Significance: t Test

1. Determine the hypotheses. H0: β1 = 0, Ha: β1 ≠ 0

2. Specify the level of significance. α = .05

3. Select the test statistic. t = b1 / s_b1

4. State the rejection rule. Reject H0 if p-value < α = .05
   or |t| > 3.182 (with 3 degrees of freedom)

10 - 31

Testing for Significance: t Test

5. Compute the value of the test statistic.

   t = b1 / s_b1 = 5 / 1.08 = 4.63

6. Determine whether to reject H0.

   With 3 degrees of freedom, t = 4.541 cuts off an area of .01 in the upper tail. Hence, the p-value is less than .02. (Also, t = 4.63 > 3.182.) We can reject H0.

Conclusion: there is enough evidence at the 0.05 level of significance to conclude that a linear relationship exists between the number of TV ads and the number of cars sold.
10 - 32
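
The same test statistic and p-value can be reproduced in R (my sketch; the values agree with the summary() output shown on the following slides):

x <- c(1, 3, 2, 1, 3); y <- c(14, 24, 18, 17, 27)
b1 <- 5
yhat <- 10 + 5 * x
s <- sqrt(sum((y - yhat)^2) / (length(y) - 2))   # standard error of the estimate
s_b1 <- s / sqrt(sum((x - mean(x))^2))           # estimated standard deviation of b1: 1.08
t <- b1 / s_b1                                   # test statistic: 4.63
2 * pt(-abs(t), df = length(y) - 2)              # two-tailed p-value: about 0.019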

Running simple linear regression using R

Suppose that the data have been entered into the reed data frame.

> fit <- lm(CarsSold ~ TVAds, data = reed)
> summary(fit)

10 - 33
Reading R Outputs
Required for Exam
Call:
lm(formula = CarsSold ~ TVAds, data = reed)

Residuals:
1 2 3 4 5
-1 -1 -2 2 2
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 10.000 2.366 4.226 0.0242 *
TVAds 5.000 1.080 4.629 0.0190 *
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 2.16 on 3 degrees of freedom
Multiple R-squared: 0.8772, Adjusted R-squared: 0.8363
F-statistic: 21.43 on 1 and 3 DF, p-value: 0.01899
10 - 34
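
Once the model has been fitted, the estimated equation can also be used for prediction. A brief sketch (my addition), assuming the fit object and reed data frame from the slides above:

predict(fit, newdata = data.frame(TVAds = 2))                          # point prediction of cars sold for 2 ads
predict(fit, newdata = data.frame(TVAds = 2), interval = "confidence") # 95% confidence interval for E(y)
confint(fit)                                                           # 95% confidence intervals for b0 and b1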

Some Cautions about the Interpretation of Significance Tests

• Rejecting H0: β1 = 0 and concluding that the relationship between x and y is significant does not enable us to conclude that a cause-and-effect relationship is present between x and y.

10 - 35

Self-study: checking the model’s assumptions

10 - 36
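
One common way to check the error-term assumptions is to examine the residuals. A minimal sketch (my addition), assuming the fit object from the R example above:

res <- residuals(fit)
plot(fitted(fit), res)     # look for constant spread around zero (mean zero, equal variance)
abline(h = 0, lty = 2)
qqnorm(res); qqline(res)   # rough check of the normality assumption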
Conclusion
• Simple Linear Regression Model
• Least Squares Method
• Coefficient of Determination
• Model Assumptions
• Testing for Significance

10 - 37
