
ASSIGNMENT NO. 2 (3/16/2019)

APPLIED ECONOMICS
SUBMITTED TO
MR. FALAK SHER
SUBMITTED BY
QAISAR SHAHZAD
ROLL NO
BECF15M033
SEMESTER
8TH
DEPARTMENT
ECONOMICS

UNIVERSITY OF SARGODHA
Multicollinearity:
Multicollinearity exists when the independent variables in a regression are highly correlated with one another. Its typical symptoms are:
 The R-squared value is high.
 The prob. values of the individual t-statistics are insignificant.
 The signs of the coefficients are not according to economic theory.

1) Correlation:
        X2      X3      Y
X2      1       0.999   0.857
X3      0.999   1       0.857
Y       0.857   0.857   1

In the table above we check the correlation between X2 and X3. The two independent variables are very highly correlated with each other (0.999), and each is also strongly correlated with Y (0.857). Such a high correlation between the regressors indicates a multicollinearity problem in the data.
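As a cross-check, the same correlation matrix can be reproduced outside E-views. The following is a minimal sketch in Python with pandas, assuming the 25 observations are stored in a hypothetical file data.csv with columns Y, X2 and X3:

    import pandas as pd

    df = pd.read_csv("data.csv")            # hypothetical file with columns Y, X2, X3
    print(df[["X2", "X3", "Y"]].corr())     # pairwise correlation matrix
    # A correlation of about 0.999 between X2 and X3 signals multicollinearity.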

2) Regression:
We run a multiple regression of Y on X2 and X3 in the E-views software.

Dependent Variable: Y
Method: Least Squares
Date: 03/16/19 Time: 15:50
Sample: 1 25
Included observations: 25

Variable             Coefficient   Std. Error   t-Statistic   Prob.
C                      35.86766     19.38717      1.850073    0.0778
X2                     -6.326498    33.75096     -0.187446    0.8530
X3                      1.789761     8.438325     0.212099    0.8340

R-squared            0.735622     Mean dependent var       169.3680
Adjusted R-squared   0.711587     S.D. dependent var        79.05857
S.E. of regression   42.45768     Akaike info criterion     10.44706
Sum squared resid    39658.40     Schwarz criterion         10.59332
Log likelihood      -127.5882     Hannan-Quinn criter.      10.48763
F-statistic          30.60702     Durbin-Watson stat         2.875574
Prob(F-statistic)    0.000000

H0: there is multicollinearity
H1: there is no multicollinearity
The E-views output shows that the prob. values of the t-statistics on X2 (0.8530) and X3 (0.8340) are insignificant, even though the overall R-squared (0.74) and the F-statistic are high. This is the classic symptom of multicollinearity, so we accept H0 and reject H1: there is a multicollinearity problem in the data.
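This multiple regression can also be reproduced outside E-views. A minimal sketch in Python with statsmodels, assuming the same hypothetical data.csv as above:

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("data.csv")              # hypothetical file with columns Y, X2, X3
    X = sm.add_constant(df[["X2", "X3"]])     # add the intercept term
    model = sm.OLS(df["Y"], X).fit()
    print(model.summary())                    # high R-squared, insignificant t-statistics on X2 and X3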
3) Auxiliary Regression:

 X2 = f(X3)

Dependent Variable: X2
Method: Least Squares
Date: 03/16/19 Time: 16:11
Sample: 1 25
Included observations: 25

Variable             Coefficient   Std. Error   t-Statistic   Prob.
C                      -0.117288    0.117251     -1.000310    0.3276
X3                      0.250016    0.000164   1521.542       0.0000

R-squared            0.999990     Mean dependent var       159.4320
Adjusted R-squared   0.999990     S.D. dependent var        81.46795
S.E. of regression   0.262305     Akaike info criterion      0.237999
Sum squared resid    1.582488     Schwarz criterion          0.335509
Log likelihood      -0.974992     Hannan-Quinn criter.       0.265045
F-statistic          2315090.     Durbin-Watson stat         2.082420
Prob(F-statistic)    0.000000

The results of the auxiliary regression show that the R-squared value (0.999990) is extremely high, which indicates that there is a multicollinearity problem in the data.
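For reference, this auxiliary regression can be sketched in Python with statsmodels (again assuming the hypothetical data.csv); the R-squared of this fit is the value reported above:

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("data.csv")                              # hypothetical file with columns Y, X2, X3
    aux = sm.OLS(df["X2"], sm.add_constant(df[["X3"]])).fit() # auxiliary regression X2 = f(X3)
    print("auxiliary R-squared:", aux.rsquared)               # about 0.99999 here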

 X3 = f (X2)

Dependent Variable: X3
Method: Least Squares
Date: 03/16/19 Time: 16:16
Sample: 1 25
Included observations: 25

Variable             Coefficient   Std. Error   t-Statistic   Prob.
C                       0.475455    0.468694      1.014425    0.3209
X2                      3.999702    0.002629   1521.542       0.0000

R-squared            0.999990     Mean dependent var       638.1560
Adjusted R-squared   0.999990     S.D. dependent var       325.8492
S.E. of regression   1.049146     Akaike info criterion      3.010449
Sum squared resid    25.31629     Schwarz criterion          3.107959
Log likelihood      -35.63062     Hannan-Quinn criter.       3.037494
F-statistic          2315090.     Durbin-Watson stat         2.082501
Prob(F-statistic)    0.000000

The results of this auxiliary regression likewise show an extremely high R-squared value (0.999990), which confirms that there is a multicollinearity problem in the data.
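The two auxiliary regressions are equivalent to computing variance inflation factors, VIF = 1 / (1 - R-squared of the auxiliary regression); with an auxiliary R-squared of 0.999990 the VIF is about 100,000, far above the usual rule-of-thumb threshold of 10. A minimal sketch in Python with statsmodels (hypothetical data.csv):

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    df = pd.read_csv("data.csv")              # hypothetical file with columns Y, X2, X3
    X = sm.add_constant(df[["X2", "X3"]])
    for i, name in enumerate(X.columns):
        if name != "const":
            print(name, variance_inflation_factor(X.values, i))  # very large VIFs indicate multicollinearity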
4) Remedy: Omitting One of the Collinear Variables:

 Y = f (X2)

Dependent Variable: Y
Method: Least Squares
Date: 03/16/19 Time: 16:21
Sample: 1 25
Included observations: 25

Variable             Coefficient   Std. Error   t-Statistic   Prob.
C                      36.71861    18.56953      1.977358     0.0601
X2                      0.832012    0.104149     7.988678     0.0000

R-squared            0.735081     Mean dependent var       169.3680
Adjusted R-squared   0.723563     S.D. dependent var        79.05857
S.E. of regression   41.56686     Akaike info criterion     10.36910
Sum squared resid    39739.49     Schwarz criterion         10.46661
Log likelihood      -127.6138     Hannan-Quinn criter.      10.39615
F-statistic          63.81897     Durbin-Watson stat         2.921548
Prob(F-statistic)    0.000000

When we omit the variable X3, the regression results show that the prob. value of the t-statistic on X2 (0.0000) is significant, so we reject H0 and accept H1: there is no longer a multicollinearity problem in the data.
 Y = f (X3)

Dependent Variable: Y
Method: Least Squares
Date: 03/16/19 Time: 16:26
Sample: 1 25
Included observations: 25

Variable             Coefficient   Std. Error   t-Statistic   Prob.
C                      36.60968    18.57637      1.970766     0.0609
X3                      0.208034    0.026033     7.991106     0.0000

R-squared            0.735199     Mean dependent var       169.3680
Adjusted R-squared   0.723686     S.D. dependent var        79.05857
S.E. of regression   41.55758     Akaike info criterion     10.36866
Sum squared resid    39721.74     Schwarz criterion         10.46617
Log likelihood      -127.6082     Hannan-Quinn criter.      10.39570
F-statistic          63.85778     Durbin-Watson stat         2.916396
Prob(F-statistic)    0.000000

When we omit the variable X2, the regression results again show that the prob. value of the t-statistic on X3 (0.0000) is significant, so we reject H0 and accept H1: there is no multicollinearity problem in the data.
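Both simple regressions, Y = f(X2) and Y = f(X3), can be reproduced in one short loop. A minimal sketch in Python with statsmodels (hypothetical data.csv):

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("data.csv")                 # hypothetical file with columns Y, X2, X3
    for keep in ("X2", "X3"):                    # drop the other regressor each time
        res = sm.OLS(df["Y"], sm.add_constant(df[[keep]])).fit()
        print(keep, "t-statistic:", round(float(res.tvalues[keep]), 3),
              "p-value:", round(float(res.pvalues[keep]), 4))
    # The slope is highly significant in each simple regression once the
    # collinear variable has been dropped.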
