Introduction to Time Series Analysis
Helle Bunzel, ISU
January 30, 2009

Outline: Introduction · Difference Equations · A look back · Difference Equations II · The Lag Operator · Difference Equations III
Time Series, Introduction

A time series is a sequence of data points, typically measured at successive times spaced at (often uniform) time intervals. Time series analysis comprises methods that attempt to understand such series, often either to understand the underlying context of the data points or to make forecasts.

Fact: Time series analysis accounts for the fact that data points taken over time may have an internal structure (such as autocorrelation, trend or seasonal variation) that should be accounted for.
Time Series, Introduction

A time series model will generally reflect the fact that observations close together in time are more closely related than observations further apart. Time series models often make use of the natural one-way ordering of time, so that values in a series for a given time are expressed as deriving in some way from past values, rather than from future values.

The term time series analysis is used to distinguish a problem from:
1. Cross-section analysis, where there is no natural ordering of the context of individual observations.
2. Spatial data analysis, where there is a context that observations (often) relate to geographical locations.
Time Series, Introduction

Methods for time series analysis are often divided into two classes: frequency-domain methods and time-domain methods. Frequency-domain methods can be regarded as model-free analyses well suited to exploratory investigations. Time-domain methods have a model-free subset consisting of the examination of autocorrelation and cross-correlation analysis, but it is here that partly and fully specified time series models make their appearance.
Time Series, Introduction

Most of what we will look at involves creating simple models to:
- Forecast
- Interpret
- Test hypotheses

Standard tool: decompose the data into trend, cyclical and irregular components. Example: observed for t = 1, ..., 50.
Time Series, Introduction

Trend: T_t = 1 + 0.1t
Seasonal component: S_t = 1.6 sin(t pi / 2)
Irregular component: I_t = 0.7 I_{t-1} + e_t, where e_t is a pure random error.

These are all difference equations, some with stochastic components.
Difference equations and their solutions.

We work in discrete time because our data are almost always discrete. Some notation: a data point is written y_t = f(t).

Definition of the first difference:

Dy_t     = f(t)   - f(t-1) = y_t     - y_{t-1}
Dy_{t+1} = f(t+1) - f(t)   = y_{t+1} - y_t
Dy_{t+2} = f(t+2) - f(t+1) = y_{t+2} - y_{t+1}

We often write the whole series:

{..., y_{t-2}, y_{t-1}, y_t, y_{t+1}, y_{t+2}, ...} = {y_t} = {y_t}_{t=-inf}^{inf}   or   {y_t}_{t=0}^{inf}
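The first difference defined above is easy to check numerically; a minimal sketch (the choice of series f is illustrative):

```python
import numpy as np

# An illustrative series y_t = f(t) for t = 0, ..., 9
t = np.arange(10)
y = 1 + 0.5 * t + t**2

# First difference: Dy_t = y_t - y_{t-1}
dy = y[1:] - y[:-1]

# np.diff implements exactly this definition
assert np.allclose(dy, np.diff(y))
```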
Difference equations and their solutions.

First-order linear difference equation with a constant coefficient:

y_t = a y_{t-1} + w_t

n'th-order linear difference equation with constant coefficients:

y_t = a_0 + sum_{i=1}^{n} a_i y_{t-i} + w_t

w_t is called the "forcing process"; often it is modelled as a stochastic variable. As long as the a's do not depend on y or w, we can regard them as constants, even if they are determined by economic variables.

Eventually we want to be able to determine the effect on y of changes in w.
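A first-order equation of this kind is straightforward to simulate by iterating it forward; a minimal sketch with illustrative parameter values (a_0 = 1, a_1 = 0.5, Gaussian w_t):

```python
import numpy as np

def simulate_first_order(a0, a1, w, y0):
    """Iterate y_t = a0 + a1 * y_{t-1} + w_t forward from y_0."""
    y = [y0]
    for wt in w:
        y.append(a0 + a1 * y[-1] + wt)
    return np.array(y)

rng = np.random.default_rng(0)
w = rng.normal(size=50)                 # stochastic forcing process w_t
path = simulate_first_order(a0=1.0, a1=0.5, w=w, y0=0.0)
```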
Difference equations and their solutions. Examples

Random walk hypothesis. Stock price:

y_t = y_{t-1} + e_t,   or   Dy_t = e_t

Here e_t is a random error term with E(e_t) = 0. A more general model could be

Dy_t = a_0 + a_1 y_{t-1} + e_t

Then we can test the random walk hypothesis by testing H_0 : a_0 = a_1 = 0.
Difference equations and their solutions. Examples

Another example: a stochastic version of Samuelson's (1939) model:

y_t = c_t + i_t                       (1)
c_t = a y_{t-1} + e_{ct}              (2)
i_t = b (c_t - c_{t-1}) + e_{it}      (3)

where E(e_{ct}) = E(e_{it}) = 0. We call y_t, c_t and i_t the endogenous variables. Similarly, y_{t-1} and c_{t-1} are predetermined variables.

Economic interpretation of the model.
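The system can be simulated directly by iterating equations (2), (3) and (1) in that order each period; the parameter values, starting values and shock scales below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, beta = 0.8, 0.5                      # illustrative parameters
T = 100
y = np.zeros(T); c = np.zeros(T); i = np.zeros(T)
y[0], c[0] = 1.0, 0.8                       # illustrative starting values
eps_c = rng.normal(scale=0.1, size=T)
eps_i = rng.normal(scale=0.1, size=T)

for t in range(1, T):
    c[t] = alpha * y[t - 1] + eps_c[t]          # consumption, eq. (2)
    i[t] = beta * (c[t] - c[t - 1]) + eps_i[t]  # investment, eq. (3)
    y[t] = c[t] + i[t]                          # GDP identity, eq. (1)
```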
Difference equations and their solutions. Structural vs Reduced Form

Some useful definitions:

Definition. Structural equation: depends on current realizations of other endogenous variables.

Definition. Reduced form equation: depends only on lags of itself, lags of other endogenous variables and possibly other exogenous variables.

By these definitions, Equations (1) and (3) are structural equations while (2) is in reduced form.
Difference equations and their solutions. Examples continued

We can solve the system to get the reduced form of all the equations. First investment:

i_t = b (c_t - c_{t-1}) + e_{it}
    = b (a y_{t-1} + e_{ct} - c_{t-1}) + e_{it}
    = ab y_{t-1} - b c_{t-1} + b e_{ct} + e_{it}        (4)

If we use Equation (2) and lag it once, we can re-write this expression further:

c_t = a y_{t-1} + e_{ct}   =>   c_{t-1} = a y_{t-2} + e_{c,t-1}

Plug this into (4) above:

i_t = ab y_{t-1} - b (a y_{t-2} + e_{c,t-1}) + b e_{ct} + e_{it}
    = ab (y_{t-1} - y_{t-2}) + b (e_{ct} - e_{c,t-1}) + e_{it}
    = ab Dy_{t-1} + b De_{ct} + e_{it}
Difference equations and their solutions. Examples

Now write GDP in reduced form: y_t = c_t + i_t. Plug in the (reduced form) expressions for consumption and investment:

y_t = a y_{t-1} + e_{ct} + ab (y_{t-1} - y_{t-2}) + b (e_{ct} - e_{c,t-1}) + e_{it}
    = a (1 + b) y_{t-1} - ab y_{t-2} + (1 + b) e_{ct} - b e_{c,t-1} + e_{it}

Forcing process: w_t = (1 + b) e_{ct} - b e_{c,t-1} + e_{it}. The model can be estimated either as a system or in reduced form.
A look back

Why do these models not fit into our framework so far? Recall our assumptions:

Assumption 1: Conditional mean of the errors is 0: E[e_i | X] = 0 for all i.
Assumption 2: Linear functional form.
Assumption 3: X : n x k has rank k.
Assumption 4: Spherical disturbances: V[e | X] = E[e e' | X] = s^2 I_n.
Assumption 5: X is non-stochastic (a known matrix of constants).
Assumption 6: Normality of the errors: e ~ N(0, s^2 I_n).
A look back

Clearly Assumption 4 is violated. Recall

y_t = a (1 + b) y_{t-1} - ab y_{t-2} + (1 + b) e_{ct} - b e_{c,t-1} + e_{it}

Also, Assumption 1 rarely holds. Suppose

y_t = a + b y_{t-1} + g x_t + e_t

Often we have feedback: x_t is determined by y_{t-1}, so X contains e_t. Instead, the best we might hope for is that the x's are predetermined:

E(e_t | x_1, x_2, ..., x_t) = 0

Since the assumptions no longer hold, neither do our results: b-hat unbiased, BLUE, variance s^2 (X'X)^{-1}, the distribution of the t-statistic, etc.
Difference equations and their solutions. Examples

Now let's try to solve a general first-order difference equation:

y_t = a_0 + a_1 y_{t-1} + w_t

Suppose we start at a known value at time t = 0, call it y_0. Then

y_1 = a_0 + a_1 y_0 + w_1

Similarly:

y_2 = a_0 + a_1 y_1 + w_2
    = a_0 + a_1 (a_0 + a_1 y_0 + w_1) + w_2
    = a_0 + a_1 a_0 + a_1^2 y_0 + a_1 w_1 + w_2
Difference equations and their solutions. Examples

And y_3:

y_3 = a_0 + a_1 y_2 + w_3
    = a_0 + a_1 (a_0 + a_1 a_0 + a_1^2 y_0 + a_1 w_1 + w_2) + w_3
    = a_0 + a_1 a_0 + a_1^2 a_0 + a_1^3 y_0 + a_1^2 w_1 + a_1 w_2 + w_3

We get:

y_t = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i w_{t-i}
Difference equations and their solutions. Examples

We could have obtained the same result through backwards substitution instead:

y_t = a_0 + a_1 y_{t-1} + w_t
    = a_0 + a_1 (a_0 + a_1 y_{t-2} + w_{t-1}) + w_t
    = a_0 + a_1 a_0 + a_1^2 y_{t-2} + a_1 w_{t-1} + w_t
    = a_0 + a_1 a_0 + a_1^2 (a_0 + a_1 y_{t-3} + w_{t-2}) + a_1 w_{t-1} + w_t
    = a_0 + a_1 a_0 + a_1^2 a_0 + a_1^3 y_{t-3} + a_1^2 w_{t-2} + a_1 w_{t-1} + w_t
    = ...
    = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i w_{t-i}

This way of solving difference equations is called recursive substitution.
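The closed-form solution can be checked against the recursion numerically; a minimal sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)
a0, a1, y0, T = 0.5, 0.8, 1.0, 20
w = rng.normal(size=T + 1)          # w[1], ..., w[T] are used; w[0] is unused

# Recursion: y_t = a0 + a1 * y_{t-1} + w_t
y = np.empty(T + 1)
y[0] = y0
for t in range(1, T + 1):
    y[t] = a0 + a1 * y[t - 1] + w[t]

# Closed form: y_T = a0 * sum_i a1^i + a1^T * y0 + sum_i a1^i * w_{T-i}
closed = (a0 * sum(a1**i for i in range(T))
          + a1**T * y0
          + sum(a1**i * w[T - i] for i in range(T)))

assert np.isclose(y[T], closed)
```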
Difference equations and their solutions.

The equation

y_t = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i w_{t-i}

is a solution to the difference equation. The equation

y_t = a_0 + a_1 a_0 + a_1^2 y_{t-2} + a_1 w_{t-1} + w_t

is true, but not a solution. Why?
Difference equations and their solutions. Notation issues

Recall that we assumed the series started at a known value at time t = 0, namely y_0. We got:

y_t = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i w_{t-i}

Hamilton instead starts with a known value at time t = -1, and sets a_0 = 0; as a result you get

y_t = a_1^{t+1} y_{-1} + sum_{i=0}^{t} a_1^i w_{t-i}

It is always important to specify initial conditions.
Difference equations and Dynamic Multipliers.

Note that using the solution to the difference equation it is now very easy to determine the effect of a change in the value of w on y. For example, recall:

y_t = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i w_{t-i}

Suppose we wish to determine the effect on y_t of a change in w_1:

dy_t / dw_1 = a_1^{t-1}

This is the dynamic multiplier. We could also be interested in the cumulative effect on all y's following w_1:

sum_{t=1}^{inf} dy_t/dw_1 = sum_{t=1}^{inf} a_1^{t-1} = 1/(1 - a_1)   if |a_1| < 1
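Both the individual multipliers and the geometric-sum limit are easy to confirm numerically; a sketch with an illustrative value of a_1:

```python
import numpy as np

a1 = 0.8
t_values = np.arange(1, 200)

# Dynamic multiplier of w_1 on y_t: dy_t/dw_1 = a1^(t-1)
multipliers = a1 ** (t_values - 1)

# Cumulative effect: sum_{t>=1} a1^(t-1) -> 1/(1 - a1) when |a1| < 1
cumulative = multipliers.sum()
assert np.isclose(cumulative, 1 / (1 - a1))
```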
Difference equations and Dynamic Multipliers.

[Graphs omitted.] The response of y to a single pulse in w is called the impulse response function.
Nonconvergent sequences.

Now, suppose |a_1| < 1 did not hold. Then we cannot find the infinite limit. We can still find the cumulative effect up to time T.

Consider

y_t = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i w_{t-i}

Suppose a_1 = 1. Then

y_t = a_0 t + y_0 + sum_{i=0}^{t-1} w_{t-i} = a_0 t + y_0 + sum_{i=1}^{t} w_i
Nonconvergent sequences.

Now,

sum_{t=1}^{T} dy_t/dw_1 = sum_{t=1}^{T} a_1^{t-1} = sum_{t=1}^{T} 1 = T

This clearly does not converge, however.
Nonconvergent sequences.

Note that in this case

y_t = a_0 + y_{t-1} + w_t   <=>   Dy_t = a_0 + w_t

This is called a Random Walk with Drift. Now consider 25 observations generated according to

y_t = a_1 y_{t-1} + w_t,   y_0 = 1,   a_1 = 1
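The generating process is a one-line simulation: the random walk is just the running sum of the white-noise shocks (the seed and shock distribution below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 25
w = rng.normal(size=n)         # white noise w_t

# Random walk: y_t = y_{t-1} + w_t with y_0 = 1 (i.e. a_1 = 1)
y = 1.0 + np.cumsum(w)
```

Plotting w and y side by side shows the qualitative difference: w fluctuates around 0 while y wanders.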
Impact of the AR coefficient.

[Figure: random walk vs. white noise; the series w and y plotted over 25 observations.]
Impact of the AR coefficient.

We distinguish between stationary and non-stationary processes.

[Figure: impulse response function of a non-stationary process (a_1 = 1.3), growing without bound over 17 periods.]
Difference equations and their solutions.

Now suppose there is no initial observation. We had

y_t = a_0 sum_{i=0}^{t-1} a_1^i + a_1^t y_0 + sum_{i=0}^{t-1} a_1^i e_{t-i}

Continue iterating backwards:

y_t = a_0 sum_{i=0}^{inf} a_1^i + lim_{i->inf} a_1^i y_{t-i} + sum_{i=0}^{inf} a_1^i e_{t-i}
    = a_0 / (1 - a_1) + sum_{i=0}^{inf} a_1^i e_{t-i}   if |a_1| < 1

Now a brief pause from difference equations to consider lag operators.
The Lag operator.

The Lag operator is defined as: L^i y_t = y_{t-i}.

Properties:

Lc = c
(L^i + L^j) y_t = L^i y_t + L^j y_t = y_{t-i} + y_{t-j}
L^i L^j y_t = L^j L^i y_t = L^{i+j} y_t = y_{t-i-j}
L^0 y_t = y_t
L^{-i} y_t = y_{t-(-i)} = y_{t+i}
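On a finite sample, the lag operator is simply an index shift; a minimal sketch for non-negative powers (the helper name `lag` is illustrative):

```python
import numpy as np

y = np.array([1.0, 2.0, 4.0, 7.0, 11.0])

def lag(series, i):
    """Apply L^i for i >= 0: out[t] = series[t - i] (NaN where undefined)."""
    if i == 0:
        return series.copy()
    out = np.full_like(series, np.nan)
    out[i:] = series[:-i]
    return out

# Composing lags adds the exponents: L^1 L^1 y = L^2 y
twice = lag(lag(y, 1), 1)
assert np.allclose(twice[2:], lag(y, 2)[2:])
```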
The Lag operator.

Properties:

(1 + aL + a^2 L^2 + a^3 L^3 + ...) y_t = sum_{i=0}^{inf} a^i L^i y_t = 1/(1 - aL) y_t   if |a| < 1

Proof: First work on the left side:

(1 - aL)(1 + aL + a^2 L^2 + a^3 L^3 + ...) y_t
  = [(1 + aL + a^2 L^2 + a^3 L^3 + ...) - (aL + a^2 L^2 + a^3 L^3 + ...)] y_t
  = y_t

The right side:

(1 - aL) 1/(1 - aL) y_t = y_t
The Lag operator.

Finally, if |a| > 1:

1/(1 - aL) y_t = -(aL)^{-1} / (1 - (aL)^{-1}) y_t
             = -(aL)^{-1} (1 + (aL)^{-1} + (aL)^{-2} + (aL)^{-3} + ...) y_t
             = -(aL)^{-1} sum_{i=0}^{inf} a^{-i} L^{-i} y_t
The Lag operator.

1/(1 - aL) y_t = -(aL)^{-1} sum_{i=0}^{inf} a^{-i} L^{-i} y_t

Proof:

(1 - aL) ( -(aL)^{-1} sum_{i=0}^{inf} a^{-i} L^{-i} ) y_t
  = (1 - aL) ( -(aL)^{-1} ) (1 + (aL)^{-1} + (aL)^{-2} + (aL)^{-3} + ...) y_t
  = (1 - (aL)^{-1}) (1 + (aL)^{-1} + (aL)^{-2} + (aL)^{-3} + ...) y_t
  = [1 + (aL)^{-1} + (aL)^{-2} + ... - (aL)^{-1} - (aL)^{-2} - (aL)^{-3} - ...] y_t
  = y_t
Using Lag Operators

Using this new tool, we can describe difference equations more compactly:

y_t = a_0 + a_1 y_{t-1} + a_2 y_{t-2} + ... + a_p y_{t-p} + w_t

can be written as

(1 - a_1 L - a_2 L^2 - ... - a_p L^p) y_t = a_0 + w_t

or

A(L) y_t = a_0 + w_t,   where   A(L) = 1 - a_1 L - a_2 L^2 - ... - a_p L^p
Using Lag Operators

We also write A(1) = 1 - a_1 - a_2 - ... - a_p.

Use the lag operators to solve equations:

y_t = a_0 + a_1 y_{t-1} + w_t   <=>
y_t = a_0 + a_1 L y_t + w_t     <=>
(1 - a_1 L) y_t = a_0 + w_t
Using Lag Operators

(1 - a_1 L) y_t = a_0 + w_t   <=>

y_t = 1/(1 - a_1 L) a_0 + 1/(1 - a_1 L) w_t
    = a_0/(1 - a_1) + sum_{i=0}^{inf} a_1^i L^i w_t
    = a_0/(1 - a_1) + sum_{i=0}^{inf} a_1^i w_{t-i}

Same solution as we've found before.
Difference equations, Final Round

The iterative solution method is fine if we have first-order difference equations. For higher orders we need better methods.

Example: a second-order equation:

y_t = 0.6 y_{t-1} - 0.08 y_{t-2} + w_t   <=>
y_t - 0.6 y_{t-1} + 0.08 y_{t-2} = w_t   <=>
(1 - 0.6 L + 0.08 L^2) y_t = w_t

Now, for any second-order polynomial a + bx + x^2 with roots alpha_1 and alpha_2, we can write a + bx + x^2 = (x - alpha_1)(x - alpha_2).
Difference equations, Final Round I. AR(2) Example

Find the roots in our example:

1 - 0.6 L + 0.08 L^2 = 0.08 (1/0.08 - (0.6/0.08) L + L^2) = 0.08 (12.5 - 7.5 L + L^2) = 0.08 (L - alpha_1)(L - alpha_2)

alpha_i = [7.5 +/- sqrt((7.5)^2 - 4 * 12.5)] / 2 = (7.5 +/- 2.5)/2 = 5 or 2.5
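The factorization can be checked with a polynomial root finder; note that `np.roots` takes the coefficients from the highest power down:

```python
import numpy as np

# Roots of 0.08 L^2 - 0.6 L + 1 (i.e. of 1 - 0.6 L + 0.08 L^2)
roots = np.roots([0.08, -0.6, 1.0])
assert np.allclose(sorted(roots), [2.5, 5.0])
```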
Difference equations, Final Round II. AR(2) Example

We can then write the equation

y_t = 0.6 y_{t-1} - 0.08 y_{t-2} + w_t   <=>
0.08 (L - 2.5)(L - 5) y_t = w_t          <=>
y_t = (1/0.08) (L - 2.5)^{-1} (L - 5)^{-1} w_t

Now, recall that if |a| < 1,

(1 - aL)^{-1} y_t = 1/(1 - aL) y_t = sum_{i=0}^{inf} a^i L^i y_t
Difference equations, Final Round III. AR(2) Example

So,

(L - 5)^{-1} w_t = -(1/5) 1/(1 - 0.2 L) w_t = -0.2 sum_{i=0}^{inf} (0.2)^i L^i w_t = -0.2 sum_{i=0}^{inf} (0.2)^i w_{t-i}

Also,

(L - 2.5)^{-1} w_{t-i} = -(1/2.5) 1/(1 - 0.4 L) w_{t-i} = -0.4 sum_{j=0}^{inf} (0.4)^j L^j w_{t-i} = -0.4 sum_{j=0}^{inf} (0.4)^j w_{t-i-j}
Difference equations, Final Round. AR(2) Example

Put the pieces together:

y_t = (1/0.08) (L - 2.5)^{-1} { (L - 5)^{-1} w_t }
    = (1/0.08) (L - 2.5)^{-1} { -0.2 sum_{i=0}^{inf} (0.2)^i w_{t-i} }
    = (0.2 * 0.4 / 0.08) sum_{i=0}^{inf} (0.2)^i sum_{j=0}^{inf} (0.4)^j w_{t-i-j}
    = sum_{i=0}^{inf} sum_{j=0}^{inf} (0.2)^i (0.4)^j w_{t-i-j}
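The double sum says the coefficient on w_{t-k} is the sum of (0.2)^i (0.4)^j over i + j = k. That should match the impulse response obtained by iterating the original recursion, which we can verify numerically:

```python
import numpy as np

# Impulse response of y_t = 0.6 y_{t-1} - 0.08 y_{t-2} + w_t by recursion
K = 10
psi = np.zeros(K)
psi[0] = 1.0
psi[1] = 0.6 * psi[0]
for k in range(2, K):
    psi[k] = 0.6 * psi[k - 1] - 0.08 * psi[k - 2]

# Coefficient of w_{t-k} in the double sum: sum over i + j = k of 0.2^i 0.4^j
psi_roots = np.array([sum(0.2**i * 0.4**(k - i) for i in range(k + 1))
                      for k in range(K)])

assert np.allclose(psi, psi_roots)
```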
Difference equations, Final Round. AR(2) Example

Note that for this calculation to work, we needed the following assumptions:
1. 1 - 0.6 L + 0.08 L^2 has two real roots.
2. |alpha_1| > 1 AND |alpha_2| > 1.

First we set up a general second-order model. Later we look at what happens when the assumptions are violated:

y_t = a_1 y_{t-1} + a_2 y_{t-2} + w_t   <=>
y_t - a_1 y_{t-1} - a_2 y_{t-2} = w_t   <=>
(1 - a_1 L - a_2 L^2) y_t = w_t

Find alpha_1 and alpha_2.
Difference equations, Final Round. AR(2) General

First suppose Assumption 1 is OK. Then:

(1 - a_1 L - a_2 L^2) y_t = w_t              <=>
-a_2 (-1/a_2 + (a_1/a_2) L + L^2) y_t = w_t  <=>
-a_2 (L - alpha_1)(L - alpha_2) y_t = w_t

If Assumption 2 is not violated, then we can invert these:

y_t = -(1/a_2) (L - alpha_2)^{-1} (L - alpha_1)^{-1} w_t
Difference equations, Final Round. AR(2) Example

Use:

(1 - aL)^{-1} y_t = 1/(1 - aL) y_t = sum_{i=0}^{inf} a^i L^i y_t

(L - alpha_2)^{-1} w_t = -(1/alpha_2) 1/(1 - (1/alpha_2) L) w_t = -(1/alpha_2) sum_{i=0}^{inf} (1/alpha_2)^i L^i w_t = -(1/alpha_2) sum_{i=0}^{inf} (1/alpha_2)^i w_{t-i}

Similarly:

(L - alpha_1)^{-1} w_{t-i} = -(1/alpha_1) sum_{j=0}^{inf} (1/alpha_1)^j w_{t-i-j}
Difference equations, Final Round I. AR(2) General

Put the pieces together:

y_t = -(1/a_2) (L - alpha_2)^{-1} (L - alpha_1)^{-1} w_t
    = -(1/a_2) (L - alpha_2)^{-1} (-1/alpha_1) sum_{j=0}^{inf} (1/alpha_1)^j w_{t-j}
    = -(1/a_2) (-1/alpha_1)(-1/alpha_2) sum_{i=0}^{inf} sum_{j=0}^{inf} (1/alpha_1)^j (1/alpha_2)^i w_{t-i-j}
    = -(1/(a_2 alpha_1 alpha_2)) sum_{i=0}^{inf} sum_{j=0}^{inf} (1/alpha_1)^j (1/alpha_2)^i w_{t-i-j}
Difference equations, Final Round II. AR(2) General

Now recall that alpha_1 and alpha_2 are the roots of -1/a_2 + (a_1/a_2) L + L^2, so

alpha_i = [-a_1/a_2 +/- sqrt((a_1/a_2)^2 + 4/a_2)] / 2

such that

alpha_1 alpha_2 = ([-a_1/a_2 + sqrt((a_1/a_2)^2 + 4/a_2)]/2) ([-a_1/a_2 - sqrt((a_1/a_2)^2 + 4/a_2)]/2)
                = [(a_1/a_2)^2 - (a_1/a_2)^2 - 4/a_2] / 4
                = -1/a_2
Difference equations, Final Round III. AR(2) General

So finally we get

y_t = -(1/(a_2 alpha_1 alpha_2)) sum_{i=0}^{inf} sum_{j=0}^{inf} (1/alpha_1)^j (1/alpha_2)^i w_{t-i-j}
    = -(1/(a_2 (-1/a_2))) sum_{i=0}^{inf} sum_{j=0}^{inf} (1/alpha_1)^j (1/alpha_2)^i w_{t-i-j}
    = sum_{i=0}^{inf} sum_{j=0}^{inf} (1/alpha_1)^j (1/alpha_2)^i w_{t-i-j}
Difference equations, Final Round. AR(2) General

Now, suppose Assumption 1 was violated. Then we would have two complex roots:

alpha_i = [-a_1 +/- sqrt(a_1^2 + 4 a_2)] / (2 a_2),   where a_1^2 + 4 a_2 < 0
Difference equations, Final Round. AR(2) General

Now, recall i = sqrt(-1), so

alpha_i = [-a_1 +/- sqrt(a_1^2 + 4 a_2)] / (2 a_2)
        = [-a_1 +/- sqrt((-1)(-(a_1^2 + 4 a_2)))] / (2 a_2)
        = -a_1/(2 a_2) +/- i sqrt(-(a_1^2 + 4 a_2)) / (2 a_2)
Difference equations, Final Round. AR(2) General

Define

alpha = -a_1/(2 a_2),   beta = sqrt(-(a_1^2 + 4 a_2)) / (2 a_2)

so that alpha_i = alpha +/- i beta.
Difference equations, Final Round. AR(2) General

Then

(1 - a_1 L - a_2 L^2) y_t = w_t   <=>   -a_2 (L - (alpha - i beta))(L - (alpha + i beta)) y_t = w_t

What happens if we simply treat these the way we did the real roots?

(L - (alpha + i beta))^{-1} w_t = -(1/(alpha + i beta)) 1/(1 - L/(alpha + i beta)) w_t
                                = -(1/(alpha + i beta)) sum_{k=0}^{inf} (1/(alpha + i beta))^k w_{t-k}
Difference equations, Final Round. AR(2) General

So, similar to the situation with real roots, we need

lim_{k->inf} (1/(alpha + i beta))^k -> 0,   i.e.   lim_{k->inf} (alpha + i beta)^k -> inf

Formula:

(alpha + i beta)^k = R^k [cos(theta k) + i sin(theta k)]
(alpha - i beta)^k = R^k [cos(theta k) - i sin(theta k)]

where R = |alpha + i beta| = sqrt(alpha^2 + beta^2). So, as before, we need the norm of the root to be greater than 1.
Summary, AR(2) model

The model is:

y_t = a_1 y_{t-1} + a_2 y_{t-2} + w_t   <=>   -a_2 (L - alpha_1)(L - alpha_2) y_t = w_t

It is stationary when |alpha_1| and |alpha_2| are both greater than 1. When it is stationary,

y_t = sum_{i=0}^{inf} sum_{j=0}^{inf} (1/alpha_1)^j (1/alpha_2)^i w_{t-i-j}

When it is not, we can still find y_t from initial conditions. We need 2 values of y!
Matrix Formulation. Second-order difference equation

This model, too, can be written in matrix form:

y_t = a_1 y_{t-1} + a_2 y_{t-2} + w_t   <=>

[ y_t     ]   [ a_1  a_2 ] [ y_{t-1} ]   [ w_t ]
[ y_{t-1} ] = [ 1    0   ] [ y_{t-2} ] + [ 0   ]

Find the eigenvalues of

F = [ a_1  a_2 ]
    [ 1    0   ]

|F - lambda I_2| = | a_1 - lambda   a_2     |
                   | 1              -lambda | = -lambda (a_1 - lambda) - a_2 = 0   <=>
Matrix Formulation. Second-order difference equation

lambda^2 - a_1 lambda - a_2 = 0

Compare to the polynomial in the lag operator:

1 - a_1 L - a_2 L^2 = 0   <=>   (1/L)^2 - a_1 (1/L) - a_2 = 0

The polynomial lambda^2 - a_1 lambda - a_2 = 0 is called the characteristic polynomial. Its roots are exactly 1/alpha_1 and 1/alpha_2, so here stability requires that the roots (the eigenvalues of F) are less than 1 in absolute value.
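The correspondence between the eigenvalues of F and the reciprocals of the lag-polynomial roots can be confirmed on the earlier numerical example (a_1 = 0.6, a_2 = -0.08):

```python
import numpy as np

a1, a2 = 0.6, -0.08
F = np.array([[a1, a2],
              [1.0, 0.0]])

eig = np.sort(np.linalg.eigvals(F))
lag_roots = np.sort(np.roots([-a2, -a1, 1.0]))   # roots of 1 - a1 L - a2 L^2

# Eigenvalues of F are the reciprocals of the lag-polynomial roots
assert np.allclose(np.sort(1.0 / lag_roots), eig)
```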
Matrix Formulation I. p'th-order difference equation

Consider (changing to Hamilton's notation)

y_t = phi_1 y_{t-1} + phi_2 y_{t-2} + phi_3 y_{t-3} + ... + phi_p y_{t-p} + w_t

Then define

xi_t = [ y_t, y_{t-1}, y_{t-2}, ..., y_{t-p+1} ]'
Matrix Formulation II. p'th-order difference equation

and the p x p matrix

F = [ phi_1  phi_2  phi_3  ...  phi_{p-1}  phi_p ]
    [ 1      0      0      ...  0          0     ]
    [ 0      1      0      ...  0          0     ]
    [ :      :      :           :          :     ]
    [ 0      0      0      ...  1          0     ]

and the p x 1 vector

v_t = [ w_t, 0, 0, ..., 0 ]'
Matrix Formulation III. p'th-order difference equation

Then we can write

xi_t = F xi_{t-1} + v_t

which is

[ y_t       ]   [ phi_1  phi_2  phi_3  ...  phi_{p-1}  phi_p ] [ y_{t-1} ]   [ w_t ]
[ y_{t-1}   ]   [ 1      0      0      ...  0          0     ] [ y_{t-2} ]   [ 0   ]
[ y_{t-2}   ] = [ 0      1      0      ...  0          0     ] [ y_{t-3} ] + [ 0   ]
[ :         ]   [ :      :      :           :          :     ] [ :       ]   [ :   ]
[ y_{t-p+1} ]   [ 0      0      0      ...  1          0     ] [ y_{t-p} ]   [ 0   ]
Matrix Formulation IV. p'th-order difference equation

This is a system of p equations which contains the original difference equation as well as p - 1 identities of the form y_{t-j} = y_{t-j}.

Why write it like this? Because we now have a (multivariate) first-order difference equation! Solution:

(1 - FL) xi_t = v_t   =>   xi_t = sum_{i=0}^{inf} F^i v_{t-i}
Matrix Formulation V. p'th-order difference equation

If you consider the first row of this one:

y_t = sum_{i=0}^{inf} {F^i}_{11} w_{t-i}

where {X}_{ij} denotes the ij'th entry of the matrix X.

Note that the dynamic multiplier is

dy_t / dw_{t-j} = {F^j}_{11}
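The multiplier {F^j}_{11} can be computed by a matrix power and cross-checked against a unit impulse in w; the AR(3) coefficients below are illustrative:

```python
import numpy as np

phi = [0.5, 0.3, -0.1]            # illustrative AR(3) coefficients
p = len(phi)
F = np.zeros((p, p))
F[0, :] = phi                     # first row holds the coefficients
F[1:, :-1] = np.eye(p - 1)        # subdiagonal identity block

# Dynamic multiplier dy_t/dw_{t-j} = {F^j}_11
j = 4
mult_matrix = np.linalg.matrix_power(F, j)[0, 0]

# Cross-check by simulating a unit impulse in w at time 0
T = j + 1
y = np.zeros(T + p)               # padded with zeros for the initial lags
for t in range(p, T + p):
    w = 1.0 if t == p else 0.0
    y[t] = sum(phi[k] * y[t - 1 - k] for k in range(p)) + w

assert np.isclose(y[p + j], mult_matrix)
```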
Matrix Formulation I. Properties of F

Recall that F is a p x p matrix. First assume that F has p distinct eigenvalues lambda_1, ..., lambda_p. Then there exists a non-singular p x p matrix T such that

F = T Lambda T^{-1}

where

Lambda = [ lambda_1  0         0         ...  0        ]
         [ 0         lambda_2  0         ...  0        ]
         [ 0         0         lambda_3  ...  0        ]
         [ :         :         :              :        ]
         [ 0         0         0         ...  lambda_p ]
Matrix Formulation II. Properties of F

Then

F^2 = T Lambda T^{-1} T Lambda T^{-1} = T Lambda^2 T^{-1}

and

F^j = T Lambda T^{-1} T Lambda T^{-1} ... T Lambda T^{-1} = T Lambda Lambda ... Lambda T^{-1} = T Lambda^j T^{-1}
Matrix Formulation III. Properties of F

Also recall that

Lambda^j = [ lambda_1^j  0           0           ...  0          ]
           [ 0           lambda_2^j  0           ...  0          ]
           [ 0           0           lambda_3^j  ...  0          ]
           [ :           :           :                :          ]
           [ 0           0           0           ...  lambda_p^j ]
Introduction
Di¤erence Equations
A look back
Di¤erence Equations II
The Lag Operator
Di¤erence Equations III
Matrix Formulation IV. Properties of F

Now let us try to find F^j. Write T = (t_{ik}) and T^{-1} = (t^{ik}). Then

F^j = T Lambda^j T^{-1}

    = [ t_11  t_12  ...  t_1p ] [ lambda_1^j  0           ...  0          ] [ t^11  t^12  ...  t^1p ]
      [ t_21  t_22  ...  t_2p ] [ 0           lambda_2^j  ...  0          ] [ t^21  t^22  ...  t^2p ]
      [ :     :          :    ] [ :           :                :          ] [ :     :          :    ]
      [ t_p1  t_p2  ...  t_pp ] [ 0           0           ...  lambda_p^j ] [ t^p1  t^p2  ...  t^pp ]
Matrix Formulation V. Properties of F

    = [ t_11 lambda_1^j  t_12 lambda_2^j  ...  t_1p lambda_p^j ] [ t^11  t^12  ...  t^1p ]
      [ t_21 lambda_1^j  t_22 lambda_2^j  ...  t_2p lambda_p^j ] [ t^21  t^22  ...  t^2p ]
      [ :                :                     :               ] [ :     :          :    ]
      [ t_p1 lambda_1^j  t_p2 lambda_2^j  ...  t_pp lambda_p^j ] [ t^p1  t^p2  ...  t^pp ]
Matrix Formulation VI. Properties of F

    = [ t_11 lambda_1^j t^11 + t_12 lambda_2^j t^21 + t_13 lambda_3^j t^31 + ... + t_1p lambda_p^j t^p1   ?  ...  ? ]
      [ ?                                                                                                 ?  ...  ? ]
      [ :                                                                                                 :       : ]
      [ ?                                                                                                 ?  ...  ? ]

In conclusion,

{F^j}_{11} = sum_{i=1}^{p} t_1i lambda_i^j t^i1 = sum_{i=1}^{p} t_1i t^i1 lambda_i^j

This is a weighted sum of the eigenvalues. In particular, note that

sum_{i=1}^{p} t_1i t^i1 = 1
Matrix Formulation VII. Properties of F

Hamilton simplifies notation by writing

{F^j}_{11} = sum_{i=1}^{p} c_i lambda_i^j

where we now know that

sum_{i=1}^{p} c_i = 1
Matrix Formulation VIII. Properties of F

Furthermore, as long as the eigenvalues are distinct, it turns out that we can write

c_i = lambda_i^{p-1} / prod_{k=1, k != i}^{p} (lambda_i - lambda_k)

so {F^j}_{11} can be found just from the eigenvalues of F!

Again we have stability, also of the p'th-order system, as long as all the eigenvalues have norm less than 1.
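Both claims, that the c_i sum to 1 and that {F^j}_{11} equals the weighted sum of eigenvalue powers, can be verified numerically; the AR(2) coefficients below are illustrative and chosen so the eigenvalues are distinct:

```python
import numpy as np

phi1, phi2 = 0.5, 0.2             # illustrative coefficients, distinct eigenvalues
F = np.array([[phi1, phi2],
              [1.0, 0.0]])
lams = np.linalg.eigvals(F)
p = len(lams)

# c_i = lambda_i^(p-1) / prod_{k != i} (lambda_i - lambda_k)
c = np.array([lams[i]**(p - 1)
              / np.prod([lams[i] - lams[k] for k in range(p) if k != i])
              for i in range(p)])

assert np.isclose(c.sum().real, 1.0)
for j in range(5):
    f11 = np.linalg.matrix_power(F, j)[0, 0]
    assert np.isclose((c * lams**j).sum().real, f11)
```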
Matrix Formulation I. Properties of F

Now assume that F has only s < p distinct eigenvalues. We can again write

F = M J M^{-1},

but

J = [ J_1  0    ...  0   ]
    [ 0    J_2  ...  0   ]
    [ :    :         :   ]
    [ 0    0    ...  J_s ]
Matrix Formulation II. Properties of F

and

J_i = [ lambda_i  1         0         ...  0         0        ]
      [ 0         lambda_i  1         ...  0         0        ]
      [ 0         0         lambda_i  ...  0         0        ]
      [ :         :         :              :         :        ]
      [ 0         0         0         ...  lambda_i  1        ]
      [ 0         0         0         ...  0         lambda_i ]

NOTE the "1"s! We still get F^j = M J^j M^{-1}, but the expression for J^j is messy. See Hamilton for details.