2 - Time Series Regression (pt. 1)

Bruno Damásio
bdamasio@novaims.unl.pt
@bmpdamasio

Carolina Vasconcelos
cvasconcelos@novaims.unl.pt
@vasconceloscm

2022/2023

Nova Information Management School (NOVA IMS)
NOVA University of Lisbon
Cross Sectional Data versus Time Series Data

Static Models
• Suppose that we have time series data available on two variables, say
y and z, where yt and zt are dated contemporaneously.
• A static model relating y to z is
yt = β0 + β1 zt + ut , t = 1, 2, . . . , n (1)
• The name “static model” comes from the fact that we are modeling
a contemporaneous relationship between y and z.
• Usually, a static model is postulated when a change in z at time t is
believed to have an immediate effect on y: ∆yt = β1 ∆zt , when
∆ut = 0.
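As an illustration (not part of the original slides), a static model such as (1) can be estimated by OLS; a minimal sketch using the static Phillips curve with the phillips data from the wooldridge package:

library(wooldridge)
# Static Phillips curve: inf_t = beta0 + beta1*unem_t + u_t
summary(lm(inf ~ unem, data = phillips))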
Static Models
Example
Let mrdrtet denote the murders per 10,000 people in a particular city
during year t, let convrtt denote the murder conviction rate, let unemt
be the local unemployment rate, and let yngmalet be the fraction of the
population consisting of males between the ages 18 and 25. Then, a
static multiple regression model explaining murder rates is
mrdrtet = β0 + β1 convrtt + β2 unemt + β3 yngmalet + ut (2)
Using a model such as this, we can estimate the ceteris paribus effect of
an increase in the conviction rate on a particular criminal activity.
Finite Distributed Lag Models
An example of a finite distributed lag (FDL) model of order two is
gfrt = α0 + δ0 pet + δ1 pet−1 + δ2 pet−2 + ut (3)
where gfrt is the general fertility rate (children born per 1,000 women of
childbearing age) and pet is the real dollar value of the personal tax
exemption.
The goal is to assess whether, in the aggregate, the decision to have
children is linked to the tax value of having a child.
Equation (3) recognizes that, for both biological and behavioral
reasons, decisions to have children would not immediately result from
changes in the personal exemption.
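As an illustration (not from the original slides), equation (3) can be estimated by OLS with the fertil3 data from the wooldridge package, which contains pe together with its lags pe_1 and pe_2:

library(wooldridge)
# FDL of order two: gfr on pe and its first two lags
fdl <- lm(gfr ~ pe + pe_1 + pe_2, data = fertil3)
summary(fdl)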
Finite Distributed Lag Models

Consider an FDL model of order two with generic variables,
yt = α0 + δ0 zt + δ1 zt−1 + δ2 zt−2 + ut .
To interpret the δj , suppose that z is constant, equal to c, in every period
and increases by one unit only at time t:

. . . , zt−2 = c, zt−1 = c, zt = c + 1, zt+1 = c, zt+2 = c, . . .
Setting the error term to zero, the corresponding values of y are
yt−1 = α0 + δ0 c + δ1 c + δ2 c
yt = α0 + δ0 (c + 1) + δ1 c + δ2 c
yt+1 = α0 + δ0 c + δ1 (c + 1) + δ2 c
yt+2 = α0 + δ0 c + δ1 c + δ2 (c + 1)
yt+3 = α0 + δ0 c + δ1 c + δ2 c,
Hence, at time t, y increases by δ0 relative to its initial level (δ0 is the
impact propensity); one period later the deviation from the initial level is δ1 ,
two periods later it is δ2 , and from t + 3 onward y is back at its initial
level. The δj trace out the lag distribution of z on y.

Finite Distributed Lag Models

Suppose instead that z increases permanently by one unit at time t, i.e.,
zs = c for s < t and zs = c + 1 for s ≥ t. Then
yt−1 = α0 + δ0 c + δ1 c + δ2 c
yt = α0 + δ0 (c + 1) + δ1 c + δ2 c
yt+1 = α0 + δ0 (c + 1) + δ1 (c + 1) + δ2 c
yt+2 = α0 + δ0 (c + 1) + δ1 (c + 1) + δ2 (c + 1),
With a permanent increase, after one period y has increased by δ0 + δ1 and
after two periods by δ0 + δ1 + δ2 , with no further changes. The sum
δ0 + δ1 + δ2 is the long-run propensity (LRP): the long-run change in y
given a permanent one-unit increase in z.
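A sketch of how the estimated LRP can be computed from the fdl object estimated in the earlier illustrative sketch (the object name is ours, not from the slides):

# Estimated long-run propensity: sum of the estimated lag coefficients
lrp_hat <- sum(coef(fdl)[c("pe", "pe_1", "pe_2")])
lrp_hat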
Time Series Regression Assumptions

TS1: Linear in parameters
The stochastic process follows the linear model yt = β0 + β1 xt1 + · · · + βk xtk + ut .

TS2: No perfect collinearity
No explanatory variable is constant nor a perfect linear combination of the others.

TS3: Zero conditional mean (strict exogeneity)
For each t, the error has zero mean given the explanatory variables in all time
periods: E(ut | X) = 0.
TS4: Homoskedasticity
Conditional on X, the error ut has the same variance in all time periods. In
other words,
Var(ut | X) = σ 2 , t = 1, 2, . . . , n (7)
TS5: No serial correlation
Conditional on X, the errors in two different time periods are uncorrelated:
Corr(ut , us | X) = 0 for all t ≠ s.

Unbiasedness of OLS
Theorem 1
Unbiasedness of OLS
Under Assumptions TS1 to TS3 we have:
E[β̂j | X] = βj (8)
Sampling Variances of the OLS Estimators
Theorem 2
Variances of OLS estimators
Under assumptions TS1-TS5 the variance-covariance matrix of the OLS
estimators is:
Var(β̂ | X) = σ 2 (X′ X)⁻¹
Theorem 3
Gauss-Markov Theorem
Under assumptions TS1-TS5, the OLS estimators are BLUE (Best
Linear Unbiased Estimator), conditional on X.
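A quick numerical check of the variance formula on any fitted lm object, here the illustrative static Phillips curve sketched earlier (not part of the slides):

library(wooldridge)
fit    <- lm(inf ~ unem, data = phillips)
X      <- model.matrix(fit)                      # design matrix
sigma2 <- sum(resid(fit)^2) / df.residual(fit)   # estimate of sigma^2
# vcov() should equal sigma^2 * (X'X)^{-1}
all.equal(unname(vcov(fit)), unname(sigma2 * solve(t(X) %*% X)))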
Inference under the Classical Linear Model Assumptions
TS6: Normality
The errors ut are independent of X and are independently and
identically distributed as Normal(0, σ 2 ).
Inference under the Classical Linear Model Assumptions
Theorem 4
Normal Sampling Distributions
Under Assumptions TS.1 through TS.6, the CLM assumptions for time
series, the OLS estimators are normally distributed, conditional on X.
Further, under the null hypothesis, each t statistic has a t distribution,
and each F statistic has an F distribution. The usual construction of
confidence intervals is also valid.
Functional Form

The functional forms used with cross-sectional data (logarithms, quadratics,
interaction terms, . . . ) can be used in time series regressions in exactly the
same way.
Example
We can use logarithmic functional forms in distributed lag models. For
example, for quarterly data, suppose that money demand (Mt ) and
gross domestic product (GDPt ) are related by
log(Mt ) = α0 + δ0 log(GDPt ) + δ1 log(GDPt−1 ) + δ2 log(GDPt−2 ) + δ3 log(GDPt−3 ) + δ4 log(GDPt−4 ) + ut
Here δ0 is the impact elasticity and δ0 + δ1 + · · · + δ4 is the long-run
elasticity of money demand with respect to GDP.
Dummy Variables
Examples
• For annual data, we can indicate in each year whether a Democrat
or a Republican is president of the United States by defining a
variable democt , which is unity if the president is a Democrat, and
zero otherwise.
• Looking at the effects of capital punishment on murder rates in
Texas, we can define a dummy variable for each year equal to one if
Texas had capital punishment during that year, and zero otherwise.
Dummy Variables
Suppose we want to explain the general fertility rate (gfr), i.e., the number of
children born per 1,000 women of childbearing age, over the years 1913 through
1984. Besides economic variables such as the personal tax exemption, the
estimated equation can include dummy variables for events such as World War II
and the introduction of the birth control pill.

Trending Time Series

Many economic time series have a tendency to grow over time. A linear time
trend is captured by the model

yt = α0 + α1 t + et , t = 1, 2, . . . , n (10)

where, holding everything else fixed, α1 measures the change in yt from one
period to the next.
Interpreting trend

Many series are better described by an exponential trend, which is captured by
a linear trend in the logarithm:
log(xt ) = β0 + β1 t + et , t = 1, 2, . . . (11)
In this model, β1 is (approximately) the average growth rate of xt per period.
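As an illustration (not from the slides), the average annual growth rate of per-capita housing investment in the hseinv data (wooldridge package) can be read off a regression of its log on a time trend:

library(wooldridge)
# Coefficient on t is (approximately) the average annual growth rate of invpc
lm(linvpc ~ t, data = hseinv)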
Using Trending Variables in Regression Analysis

Considering
yt = β0 + β1 x1t + β2 x2t + β3 t + ut (12)
adding the time trend t allows the effects of x1t and x2t on yt to be estimated
net of any common trending behaviour; omitting the trend when the variables are
trending can produce spurious results, as the housing investment example below
illustrates.
library(wooldridge)
# Log housing investment per capita on log housing prices,
# first without and then with a linear time trend.
lm(linvpc ~ lprice, data=hseinv)

##
## Call:
## lm(formula = linvpc ~ lprice, data = hseinv)
##
## Coefficients:
## (Intercept)      lprice
##     -0.5502      1.2409

lm(linvpc ~ lprice + t, data=hseinv)

##
## Call:
## lm(formula = linvpc ~ lprice + t, data = hseinv)
##
## Coefficients:
## (Intercept)      lprice           t
##   -0.913060   -0.380961    0.009829

Once the trend is included, the estimated elasticity of housing investment with
respect to price changes sign and becomes negative.
Detrending of Regressions

Board.
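As a complement to the board derivation, a minimal sketch (not from the slides) of the partialling-out / Frisch-Waugh result behind detrending, using the same hseinv data:

library(wooldridge)
# Detrend each variable: regress it on a linear time trend and keep the residuals
linvpc_dt <- resid(lm(linvpc ~ t, data = hseinv))
lprice_dt <- resid(lm(lprice ~ t, data = hseinv))
# Regressing detrended y on detrended x reproduces the coefficient on lprice
# from the regression that includes the trend directly.
coef(lm(linvpc_dt ~ lprice_dt))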
Computing R-Squared When the Dependent Variable is Trending

R-squared can be artificially large when yt is trending, because the total sum of
squares then overstates the variation to be explained. A better goodness-of-fit
measure first detrends yt . Letting ÿt denote the detrended yt and SSR the sum of
squared residuals from the regression that includes the trend, the measure is

1 − SSR / ∑t ÿt2 ,  where the sum runs over t = 1, . . . , n.
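A sketch of this computation for the housing investment regression above (illustrative, not from the slides):

library(wooldridge)
fit  <- lm(linvpc ~ lprice + t, data = hseinv)  # regression including the trend
y_dt <- resid(lm(linvpc ~ t, data = hseinv))    # detrended dependent variable
1 - sum(resid(fit)^2) / sum(y_dt^2)             # R-squared based on detrended y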
Seasonality

A series observed at monthly or quarterly frequency may exhibit seasonality,
which can be captured by including a set of seasonal dummy variables (one for
each month or quarter, omitting one as the base period). The joint significance
of the seasonal dummies can be tested with an F test, as in the following
example with monthly data (barium dataset):
library(wooldridge)
library(car)

# Log Chinese barium imports on log chemical production index
# plus monthly dummies (January is the base month).
reg_seas <- lm(lchnimp ~ lchempi + feb + mar +
                 apr + may + jun + jul +
                 aug + sep + oct + nov + dec,
               data = barium)

# F test of the joint significance of the eleven monthly dummies
linearHypothesis(reg_seas, c('feb', 'mar', 'apr', 'may',
                             'jun', 'jul', 'aug', 'sep',
                             'oct', 'nov', 'dec'))
Stationary and Nonstationary Time Series

• There are some key concepts to learn in order to apply the usual
large sample approximations in regression analysis with time series
data.
• The notion of a stationary process plays an important role in the
analysis of time series.
A stochastic process {xt : t = 1, 2, . . .} is covariance (weakly) stationary if:
1. E (xt ) is constant;
2. Var (xt ) is constant;
3. for any t, h ≥ 1, Cov (xt , xt+h ) depends only on h and not on t.
Weakly Dependent Time Series

A stationary process {xt } is weakly dependent if xt and xt+h become "almost
independent" as h grows without bound; for covariance stationary processes it
suffices that Corr(xt , xt+h ) → 0 as h → ∞.
Moving Average Process

A moving average process of order one, MA(1), is
xt = θ0 + ϵt + θ1 ϵt−1 , t = 1, 2, . . . (14)
An MA(1) process is stationary and weakly dependent: xt is correlated with
xt−1 , but observations more than one period apart are independent.

MA(1) process
library(tsibble)
set.seed(12345)
# Simulate 201 white-noise draws and build x_t = u_t - 0.8 u_{t-1}
u <- rnorm(201)
x <- numeric(201)
for (i in 2:length(u)) x[i] <- u[i] - 0.8*u[i - 1]
# Keep the 200 observations for which the lagged term exists
e <- tsibble(sample = 1:200, u = x[2:201], index = sample)
library(fpp3)
e %>%
autoplot(u) + labs(title='MA(1)', y ='', x='')
[Figure: time plot of the simulated MA(1) series]
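As a quick check (assuming the tsibble e created in the simulation chunk above), the sample ACF of an MA(1) should show a single pronounced spike at lag 1 (negative here, given the −0.8 coefficient) and negligible autocorrelations at higher lags:

library(fpp3)
# Sample autocorrelation function of the simulated MA(1) series
e %>% ACF(u) %>% autoplot()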
MA(q)
xt = θ0 + ϵt − θ1 ϵt−1 − θ2 ϵt−2 − . . . − θq ϵt−q ,  ϵt ∼ w.n.(0, σϵ2 )
Autoregressive Process

An autoregressive process of order one, AR(1), is
yt = ρ0 + ρ1 yt−1 + ϵt , t = 1, 2, . . . (15)
The crucial assumption for weak dependence of an AR(1) process is the stability
condition |ρ1 | < 1; in that case, Corr(yt , yt+h ) = ρ1^h → 0 as h → ∞.

AR(1)
library(tsibble)
set.seed(12345)
# Simulate an AR(1) process: x_t = 2 + 0.6 x_{t-1} + u_t
x <- rnorm(201)     # x[1] is the starting value
u <- rnorm(201)
for (i in 2:length(x)) { x[i] <- 2 + 0.6*x[i - 1] + u[i] }
# Discard the starting value and keep 200 observations
y <- tsibble(sample = 1:200, x = x[2:201], index = sample)
library(fpp3)
y %>%
autoplot(x) + labs(title='AR(1)', y ='', x='')
[Figure: time plot of the simulated AR(1) series]

AR(p)
yt = ρ0 + ρ1 yt−1 + . . . + ρp yt−p + ϵt ,  ϵt ∼ w.n.(0, σϵ2 )
• Unconditional mean: E(yt ) = µ = ρ0 / (1 − ρ1 − . . . − ρp )
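As a quick check (assuming the tsibble y simulated in the AR(1) chunk above, with ρ0 = 2 and ρ1 = 0.6), the sample mean should be close to the unconditional mean ρ0 /(1 − ρ1 ) = 5:

mean(y$x)      # sample mean of the simulated AR(1) series
2 / (1 - 0.6)  # theoretical unconditional mean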
Asymptotic Properties of OLS

For large-sample analysis, the Gauss-Markov assumptions are replaced by versions
that require stationarity and weak dependence and that condition only on the
explanatory variables at time t:

TS1': Linearity, stationarity and weak dependence
The model is linear in parameters and {(xt , yt ) : t = 1, 2, . . .} is stationary
and weakly dependent.

TS2': No perfect collinearity
No explanatory variable is constant nor a perfect linear combination of the
others.

TS3': Zero conditional mean (contemporaneous exogeneity)
E(ut | xt ) = 0, t = 1, 2, . . . , n.
TS4’: Homoskedasticity
Var(ut | xt ) = σ 2 , t = 1, 2, . . . , n (17)
TS5': No serial correlation
For all t ≠ s, E(ut us | xt , xs ) = 0.
Theorem 5
Consistency of OLS
Under Assumptions TS1’ to TS3’, the OLS estimators are consistent:
plim β̂j = βj , j = 0, 1, . . . , k.
Theorem 6
Asymptotic Normality of OLS
Under TS1’-TS5’, the OLS estimators are asymptotically normally
distributed. Further, the usual OLS standard errors, t statistics, F
statistics, and LM statistics are asymptotically valid.
Highly Persistent Time Series
• In the simple AR(1) model, the assumption |ρ1 | < 1 is crucial for the
series to be weakly dependent. It turns out that many economic
time series are better characterized by the AR(1) model with ρ1 = 1.
In that case, we can write:
yt = yt−1 + ϵt , t = 1, 2, . . . (18)
Random Walk Process

A process satisfying (18), with i.i.d. innovations ϵt independent of the initial
value y0 , is called a random walk. By repeated substitution,
yt = ϵt + ϵt−1 + · · · + ϵ1 + y0
Taking the variance (the ϵt are uncorrelated with each other and with y0 ):
Var (yt ) = Var (ϵt ) + Var (ϵt−1 ) + · · · + Var (ϵ1 ) + Var (y0 )
= Var (y0 ) + σϵ2 t
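The variance therefore grows linearly with t, so a random walk is not stationary. A minimal simulation check (illustrative, not from the slides):

set.seed(1)
# 1,000 independent random walks of length 100; columns are replications
rw <- replicate(1000, cumsum(rnorm(100)))
# Cross-sectional variance at t = 10, 50, 100: roughly 10, 50 and 100
apply(rw[c(10, 50, 100), ], 1, var)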
Highly Persistent Behavior

For a random walk, E(yt+h | yt ) = yt for all h ≥ 1.
• This means that, no matter how far in the future we look, our best
prediction of yt+h is today’s value, yt .
• We can contrast this with the stable AR(1) case (with zero intercept), where
a similar argument can be used to show that E(yt+h | yt ) = ρ1^h yt , which
tends to zero (the unconditional mean) as h → ∞.
How to verify in practice if a time series is stationary or not?
Nonstationary Process

(Discuss)
[Figure: time plot of a nonstationary series]
Trend Stationarity vs Difference Stationarity

A trend-stationary process is stationary around a deterministic trend, so it can
be made stationary by removing (regressing out) the trend. A difference-stationary
process, such as a random walk, must instead be differenced to become stationary.
Difference Stationarity
library(gridExtra)
library(fpp3)
## set random number seed
set.seed(123)
## length of time series
T <- 100
## white-noise innovations {w_t}; x starts at w_1
x <- w <- rnorm(n=T, mean=0, sd=1)
## Random walk without drift: x_t = x_{t-1} + w_t
for(t in 2:T) { x[t] <- x[t-1] + w[t] }
y <- tsibble(sample=1:100, x = x, index=sample)
p1 <- y %>%
  autoplot(x) + labs(title='Non Stationary Series', x='', y='')
p2 <- y %>%
  autoplot(difference(x)) + labs(title='Differenced Series', x='', y='')
grid.arrange(p1, p2)
[Figure: the simulated nonstationary series (top) and its first difference, "Differenced Series" (bottom)]
Difference Operator

• first differences: ∆yt = yt − yt−1 = (1 − L)yt , where L is the lag operator (Lyt = yt−1 )
• second-order differences: ∆2 yt = ∆(∆yt ) = (1 − L)2 yt
• in general, d-th order differences: ∆d yt = (1 − L)d yt
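In R, the difference operator is available through base diff(); a minimal illustration (values are our own):

x <- c(1, 4, 9, 16, 25)
diff(x)                   # first differences: 3 5 7 9
diff(x, differences = 2)  # second-order differences: 2 2 2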
Unit root testing: Dickey-Fuller test

The Dickey-Fuller test of H0 : θ = 0 (unit root) against H1 : θ < 0 is based on
the t statistic of θ in the model

∆yt = α + θyt−1 + ut

Under the null, tθ ∼ DF: the statistic follows the Dickey-Fuller distribution
rather than a standard t distribution, so DF critical values must be used.
Dickey-Fuller test
library(urca)
summary(ur.df(y, type='drift'))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression drift
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 + 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -2.9637 -0.6220 -0.1578 0.5580 3.0348
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.5314398 0.1831636 8.361 1.17e-14 ***
## z.lag.1 0.0011390 0.0006414 1.776 0.0773 .
## z.diff.lag 0.1042094 0.0722390 1.443 0.1507
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.011 on 195 degrees of freedom
## Multiple R-squared:  0.03125,   Adjusted R-squared:  0.02131
## F-statistic: 3.145 on 2 and 195 DF, p-value: 0.04525
##
##
## Value of test-statistic is: 1.7757 60.2901
##
## Critical values for test statistics:
## 1pct 5pct 10pct
## tau2 -3.46 -2.88 -2.57
## phi1  6.52  4.63  3.81
Dickey-Fuller Test

Since the test statistic (1.78) is well above the 5% critical value for tau2
(−2.88), we do not reject the null hypothesis of a unit root: the series appears
to be nonstationary.
GTS t-sig procedure

In the general-to-specific (GTS) t-sig procedure, the augmented Dickey-Fuller
regression is first estimated with a generous number of lagged differences; the
highest lag is then dropped whenever it is statistically insignificant and the
regression is re-estimated, until a significant highest lag is found. We
illustrate with the monthly industrial production index (FRED: INDPRO):
library(Quandl)
# Monthly US industrial production index (FRED: INDPRO), from 2013 onwards
indpro = Quandl("FRED/INDPRO", collapse="monthly",
                start_date="2013-01-01", type="ts")
plot(indpro)
[Figure: time plot of indpro]
GTS t-sig procedure
library(Quandl)
summary(ur.df(indpro, type='drift', lags=6))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression drift
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 + 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.4813 -0.2941 0.0985 0.5124 3.1348
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 16.444576 6.407731 2.566 0.0119 *
## z.lag.1 -0.163595 0.063918 -2.559 0.0121 *
## z.diff.lag1 0.308495 0.107051 2.882 0.0049 **
## z.diff.lag2 -0.166367 0.111114 -1.497 0.1377
## z.diff.lag3 0.030113 0.111206 0.271 0.7871
## z.diff.lag4 -0.005395 0.110552 -0.049 0.9612
## z.diff.lag5 0.053911 0.105506 0.511 0.6106
## z.diff.lag6 0.055066 0.104410 0.527 0.5992
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.58 on 94 degrees of freedom
## Multiple R-squared:  0.1736,   Adjusted R-squared:  0.1121
GTS t-sig procedure

# lag 6 is insignificant (p = 0.60), so drop it and re-estimate with 5 lags
summary(ur.df(indpro, type='drift', lags=5))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression drift
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 + 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.5350 -0.3063 0.0810 0.4891 3.1639
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 15.47025 6.13908 2.520 0.01339 *
## z.lag.1 -0.15391 0.06124 -2.513 0.01363 *
## z.diff.lag1 0.30218 0.10564 2.861 0.00519 **
## z.diff.lag2 -0.17671 0.10877 -1.625 0.10752
## z.diff.lag3 0.02181 0.10927 0.200 0.84223
## z.diff.lag4 -0.02479 0.10410 -0.238 0.81230
## z.diff.lag5 0.06114 0.10332 0.592 0.55543
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.567 on 96 degrees of freedom
## Multiple R-squared:  0.1705,   Adjusted R-squared:  0.1186
## F-statistic: 3.288 on 6 and 96 DF,  p-value: 0.005511
##
GTS t-sig procedure

# lag 5 is insignificant (p = 0.56), so drop it and re-estimate with 4 lags
summary(ur.df(indpro, type='drift', lags=4))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression drift
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 + 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.5203 -0.3170 0.1037 0.5128 3.2623
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 14.550231 5.888871 2.471 0.0152 *
## z.lag.1 -0.144719 0.058741 -2.464 0.0155 *
## z.diff.lag1 0.292762 0.103557 2.827 0.0057 **
## z.diff.lag2 -0.186184 0.106690 -1.745 0.0841 .
## z.diff.lag3 0.001478 0.102928 0.014 0.9886
## z.diff.lag4 -0.015782 0.101977 -0.155 0.8773
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.553 on 98 degrees of freedom
## Multiple R-squared:  0.1674,   Adjusted R-squared:  0.1249
## F-statistic: 3.941 on 5 and 98 DF,  p-value: 0.002677
##
##
GTS t-sig procedure

# lag 4 is insignificant (p = 0.88), so drop it and re-estimate with 3 lags
summary(ur.df(indpro, type='drift', lags=3))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression drift
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 + 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.5129 -0.3027 0.0838 0.5072 3.2508
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 14.7684909 5.6309381 2.623 0.01009 *
## z.lag.1 -0.1469048 0.0561707 -2.615 0.01029 *
## z.diff.lag1 0.2951468 0.1013350 2.913 0.00442 **
## z.diff.lag2 -0.1810848 0.1001664 -1.808 0.07364 .
## z.diff.lag3 -0.0009431 0.1007733 -0.009 0.99255
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.538 on 100 degrees of freedom
## Multiple R-squared:  0.1672,   Adjusted R-squared:  0.1339
## F-statistic: 5.02 on 4 and 100 DF, p-value: 0.0009956
##
##
## Value of test-statistic is: -2.6153 3.4689
GTS t-sig procedure

# lag 3 is insignificant (p = 0.99), so drop it and re-estimate with 2 lags
summary(ur.df(indpro, type='drift', lags=2))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression drift
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 + 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.5089 -0.3110 0.0751 0.5086 3.2549
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 14.71618 5.36698 2.742 0.00721 **
## z.lag.1 -0.14641 0.05354 -2.735 0.00736 **
## z.diff.lag1 0.29487 0.09460 3.117 0.00237 **
## z.diff.lag2 -0.18205 0.09799 -1.858 0.06609 .
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.523 on 102 degrees of freedom
## Multiple R-squared:  0.1669,   Adjusted R-squared:  0.1424
## F-statistic: 6.811 on 3 and 102 DF, p-value: 0.0003133
##
##
## Value of test-statistic is: -2.7348 3.7841
##
GTS t-sig procedure

At lags = 2 the highest lag is significant at the 10% level, so we stop there.
The test statistic (−2.73) does not reject the unit root null at the 5% level
(critical value −2.88), so indpro appears to be nonstationary and we apply the
same procedure to its first difference:

diff_indpro <- diff(indpro)   # first difference of indpro
summary(ur.df(diff_indpro, lags=6))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.4261 -0.4125 0.0731 0.6087 3.8494
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.23740 0.29657 -4.172 6.72e-05 ***
## z.diff.lag1 0.45087 0.26915 1.675 0.0972 .
## z.diff.lag2 0.17451 0.24251 0.720 0.4736
## z.diff.lag3 0.11719 0.20852 0.562 0.5755
## z.diff.lag4 0.03453 0.17242 0.200 0.8417
## z.diff.lag5 0.02860 0.13262 0.216 0.8297
## z.diff.lag6 0.02035 0.10449 0.195 0.8460
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.633 on 94 degrees of freedom
## Multiple R-squared:  0.4544,   Adjusted R-squared:  0.4138
## F-statistic: 11.19 on 7 and 94 DF, p-value: 3.172e-10
GTS t-sig procedure
summary(ur.df(diff_indpro, lags=5))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.4252 -0.4017 0.0837 0.6307 3.8164
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.21056 0.26544 -4.561 1.51e-05 ***
## z.diff.lag1 0.42361 0.23581 1.796 0.0756 .
## z.diff.lag2 0.14790 0.20575 0.719 0.4740
## z.diff.lag3 0.09240 0.17055 0.542 0.5892
## z.diff.lag4 0.01100 0.13119 0.084 0.9334
## z.diff.lag5 0.01162 0.10344 0.112 0.9108
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.618 on 96 degrees of freedom
## Multiple R-squared:  0.4541,   Adjusted R-squared:  0.42
## F-statistic: 13.31 on 6 and 96 DF, p-value: 6.394e-11
##
##
GTS t-sig procedure
summary(ur.df(diff_indpro, lags=4))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.4128 -0.3895 0.0849 0.6293 3.8196
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.197832 0.232413 -5.154 1.32e-06 ***
## z.diff.lag1 0.410786 0.199097 2.063 0.0417 *
## z.diff.lag2 0.135645 0.168150 0.807 0.4218
## z.diff.lag3 0.080946 0.129533 0.625 0.5335
## z.diff.lag4 0.002342 0.102319 0.023 0.9818
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.601 on 98 degrees of freedom
## Multiple R-squared:  0.4541,   Adjusted R-squared:  0.4262
## F-statistic: 16.3 on 5 and 98 DF, p-value: 1.119e-11
##
##
## Value of test-statistic is: -5.1539
GTS t-sig procedure
summary(ur.df(diff_indpro, lags=3))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.4125 -0.3841 0.0844 0.6187 3.8172
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.19448 0.19611 -6.091 2.1e-08 ***
## z.diff.lag1 0.40775 0.16127 2.528 0.013 *
## z.diff.lag2 0.13270 0.12753 1.040 0.301
## z.diff.lag3 0.07887 0.10069 0.783 0.435
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.585 on 100 degrees of freedom
## Multiple R-squared:  0.454,   Adjusted R-squared:  0.4322
## F-statistic: 20.79 on 4 and 100 DF, p-value: 1.713e-12
##
##
## Value of test-statistic is: -6.091
##
GTS t-sig procedure
summary(ur.df(diff_indpro, lags=2))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.3731 -0.3976 0.0587 0.6589 3.7907
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.10561 0.15936 -6.938 3.74e-10 ***
## z.diff.lag1 0.32501 0.12166 2.672 0.00879 **
## z.diff.lag2 0.07051 0.09945 0.709 0.47994
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.575 on 102 degrees of freedom
## Multiple R-squared:  0.4506,   Adjusted R-squared:  0.4344
## F-statistic: 27.88 on 3 and 102 DF, p-value: 2.997e-13
##
##
## Value of test-statistic is: -6.9378
##
## Critical values for test statistics:
GTS t-sig procedure
summary(ur.df(diff_indpro, lags=1))
##
## ###############################################
## # Augmented Dickey-Fuller Test Unit Root Test #
## ###############################################
##
## Test regression none
##
##
## Call:
## lm(formula = z.diff ~ z.lag.1 - 1 + z.diff.lag)
##
## Residuals:
## Min 1Q Median 3Q Max
## -12.2565 -0.3927 0.0431 0.6137 3.7892
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## z.lag.1 -1.03269 0.12067 -8.558 1.09e-13 ***
## z.diff.lag 0.27157 0.09471 2.867 0.00501 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.563 on 104 degrees of freedom
## Multiple R-squared:  0.4481,   Adjusted R-squared:  0.4375
## F-statistic: 42.23 on 2 and 104 DF, p-value: 3.759e-14
##
##
## Value of test-statistic is: -8.5583
##
## Critical values for test statistics:
## 1pct 5pct 10pct
Dynamically Complete Models

A model
yt = β0 + β1 xt1 + · · · + βk xtk + ut
is dynamically complete if E(yt | xt , yt−1 , xt−1 , . . .) = E(yt | xt ), i.e., if
enough lags of y and of the explanatory variables have been included so that
additional lags do not help to explain yt .
Dynamically Complete Models

Consider the output below, where we regress the change in the general fertility
rate, ∆gfr, on the change in the personal exemption, ∆pe, allowing for two lags
of ∆pe:
summary(lm(diff(gfr) ~ diff(pe) + diff(pe_1) + diff(pe_2), data=fertil3))
##
## Call:
## lm(formula = diff(gfr) ~ diff(pe) + diff(pe_1) + diff(pe_2),
## data = fertil3)
##
## Residuals:
## Min 1Q Median 3Q Max
## -9.8307 -2.1842 -0.1912 1.8442 11.4506
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -0.96368 0.46776 -2.060 0.04339 *
## diff(pe) -0.03620 0.02677 -1.352 0.18101
## diff(pe_1) -0.01397 0.02755 -0.507 0.61385
## diff(pe_2) 0.10999 0.02688 4.092 0.00012 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 3.859 on 65 degrees of freedom
## (2 observations deleted due to missingness)
## Multiple R-squared:  0.2325,   Adjusted R-squared:  0.1971
## F-statistic: 6.563 on 3 and 65 DF, p-value: 0.0006054
Dynamically Complete Models

To check whether this model is dynamically complete, we add one lag of the
dependent variable, ∆gfrt−1 , to the regression:
##
## Call:
## lm(formula = diff(gfr) ~ diff(pe) + diff(pe_1) + diff(pe_2) +
## diff(gfr_1), data = fertil3)
##
## Residuals:
## Min 1Q Median 3Q Max
## -9.7491 -2.2345 0.0776 1.7393 9.2857
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) -0.702159 0.453799 -1.547 0.126724
## diff(pe) -0.045472 0.025642 -1.773 0.080926 .
## diff(pe_1) 0.002064 0.026778 0.077 0.938800
## diff(pe_2) 0.105135 0.025590 4.108 0.000115 ***
## diff(gfr_1) 0.300242 0.105903 2.835 0.006125 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 3.666 on 64 degrees of freedom
## (2 observations deleted due to missingness)
## Multiple R-squared:  0.3181,   Adjusted R-squared:  0.2755
## F-statistic: 7.464 on 4 and 64 DF, p-value: 5.336e-05
Dynamically Complete Models

The lagged dependent variable ∆gfrt−1 is statistically significant (t ≈ 2.84,
p ≈ 0.006).
This means that the model is not dynamically complete: additional terms that are
relevant for explaining the dependent variable can still be added. The fact that
the model is not dynamically complete suggests that there may be serial
correlation in the errors.