4.5 Autoregressive Processes AR(p)

Definition 4.7. A time series {X_t} is an autoregressive process of order p, denoted AR(p), if

X_t = φ_1 X_{t−1} + φ_2 X_{t−2} + ... + φ_p X_{t−p} + Z_t,
where {Z_t} is white noise, i.e., {Z_t} ∼ WN(0, σ²), and Z_t is uncorrelated with X_s for each s < t.
Remark 4.12. We assume (for simplicity of notation) that the mean of X_t is zero. If the mean is E X_t = μ ≠ 0, then we replace X_t by X_t − μ to obtain

X_t = α + φ^T X_{t−1} + Z_t,

where

α = μ(1 − φ_1 − ... − φ_p),

φ = (φ_1, ..., φ_p)^T and X_{t−1} denotes the vector (X_{t−1}, ..., X_{t−p})^T.
Using the backward shift operator B we can write the AR(p) model as

(1 − φ_1 B − φ_2 B² − ... − φ_p B^p) X_t = Z_t,

or, more concisely,

φ(B) X_t = Z_t,    (4.21)

where

φ(B) = 1 − φ_1 B − φ_2 B² − ... − φ_p B^p.

Then the AR(p) can be viewed as a solution to equation (4.21), i.e.,

X_t = (1/φ(B)) Z_t.    (4.22)
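Before specializing to p = 1, it may help to see the defining recursion in code. Below is a minimal simulation sketch in Python (numpy assumed; the function name simulate_ar, the seed and the burn-in length are our own illustrative choices, not part of the text):

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_ar(phi, n, sigma=1.0, burn=200):
        # X_t = phi_1 X_{t-1} + ... + phi_p X_{t-p} + Z_t with Z_t ~ WN(0, sigma^2);
        # a burn-in period lets the effect of the zero initial values die out.
        p = len(phi)
        z = rng.normal(0.0, sigma, n + burn)
        x = np.zeros(n + burn)
        for t in range(p, n + burn):
            x[t] = sum(phi[k] * x[t - 1 - k] for k in range(p)) + z[t]
        return x[burn:]

    x = simulate_ar([0.9], 200)   # an AR(1) path like the top panel of Figure 4.7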
4.5.1 AR(1)
According to Definition 4.7 the autoregressive process of order 1 is given by

X_t = φ X_{t−1} + Z_t.    (4.23)
Corollary 4.1 says that an infinite combination of white noise variables is a stationary process. Here, due to the recursive form of the TS, we can write AR(1) in such a form. Namely,

X_t = φ X_{t−1} + Z_t
    = φ(φ X_{t−2} + Z_{t−1}) + Z_t
    = φ² X_{t−2} + φ Z_{t−1} + Z_t
      ⋮
    = φ^k X_{t−k} + Σ_{j=0}^{k−1} φ^j Z_{t−j}.
What would we obtain if we continued the backward substitution, i.e., what happens as k → ∞? For |φ| < 1 the term φ^k X_{t−k} becomes negligible, and we can write AR(1) as the linear process

X_t = ψ(B) Z_t,

where ψ(B) = Σ_{j=0}^{∞} ψ_j B^j. It means we want to find the coefficients ψ_j. Substituting Z_t from the AR model into the linear process model we obtain
X_t = ψ(B) Z_t = ψ(B) φ(B) X_t,    (4.24)

so that ψ(B) φ(B) = 1.
In full, for the AR(1), the coefficients of both sides of this equation can be written as

1 = (1 + ψ_1 B + ψ_2 B² + ψ_3 B³ + ...)(1 − φB)
  = 1 + ψ_1 B + ψ_2 B² + ψ_3 B³ + ... − φB − ψ_1 φB² − ψ_2 φB³ − ψ_3 φB⁴ − ...
  = 1 + (ψ_1 − φ)B + (ψ_2 − ψ_1 φ)B² + (ψ_3 − ψ_2 φ)B³ + ...
Now, equating coefficients of B^j on the LHS and RHS of this equation we see that, for j ≥ 1, all the coefficients of B^j must be zero, i.e.,

ψ_1 = φ
ψ_2 = ψ_1 φ = φ²
ψ_3 = ψ_2 φ = φ³
  ⋮
ψ_j = ψ_{j−1} φ = φ^j.
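This coefficient matching is easy to verify numerically: multiplying the truncated series ψ(B) = 1 + φB + φ²B² + ... by φ(B) = 1 − φB must leave only the constant term 1. A small check in Python (numpy assumed; np.convolve computes the polynomial product):

    import numpy as np

    phi = 0.5
    psi = phi ** np.arange(10)        # psi_j = phi^j, truncated at j = 9
    ar_poly = np.array([1.0, -phi])   # coefficients of phi(B) = 1 - phi B

    # coefficients of psi(B) * phi(B); those of B^1, ..., B^9 all cancel
    print(np.convolve(psi, ar_poly)[:10])   # [1. 0. 0. ... 0.]

(The coefficient of B^10 is a leftover of the truncation and is ignored above.)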
Remark 4.13. Note that from equation (4.24) it follows that ψ(B) is an inverse of φ(B), that is

ψ(B) = 1/φ(B).    (4.25)
For an AR(1) we have

ψ(B) = 1/(1 − φB) = 1 + φB + φ²B² + φ³B³ + ...    (4.26)
As a linear process, AR(1) is stationary with mean

E X_t = Σ_{j=0}^{∞} φ^j E(Z_{t−j}) = 0    (4.27)

and, for τ ≥ 0, autocovariance function

γ(τ) = cov(X_t, X_{t+τ}) = σ² φ^τ Σ_{j=0}^{∞} φ^{2j}.

However, the infinite sum in this expression is the sum of a geometric progression as |φ| < 1, i.e.,

Σ_{j=0}^{∞} φ^{2j} = 1/(1 − φ²).

Hence

γ(τ) = σ² φ^{|τ|}/(1 − φ²).    (4.28)
Then the variance of AR(1) is

γ(0) = σ²/(1 − φ²).
Hence, the autocorrelation function of AR(1) is

ρ(τ) = γ(τ)/γ(0) = φ^{|τ|}.    (4.29)
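Formula (4.29) can be checked against a simulated series: for a long AR(1) path the sample autocorrelations should be close to φ^{|τ|}. A sketch in Python (numpy assumed; the helper sample_acf is our own):

    import numpy as np

    rng = np.random.default_rng(0)
    phi, n = 0.9, 5000

    z = rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + z[t]

    def sample_acf(x, lag):
        xc = x - x.mean()
        return np.dot(xc[:-lag], xc[lag:]) / np.dot(xc, xc)

    for tau in (1, 2, 5, 10):
        print(tau, round(sample_acf(x, tau), 3), round(phi ** tau, 3))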
Figure 4.7: Simulated AR(1) processes for φ = 0.9 (top) and for φ = −0.9 (bottom).
Figure 4.8: Sample ACF for AR(1): (a) x_t = 0.9x_{t−1} + z_t and (b) x_t = −0.9x_{t−1} + z_t.
Figure 4.9: Simulated AR(1) processes for φ = 0.5 (top) and for φ = −0.5 (bottom).
Figure 4.10: Sample ACF for AR(1): (a) x_t = 0.5x_{t−1} + z_t and (b) x_t = −0.5x_{t−1} + z_t.
Figures 4.7, 4.9 and 4.8, 4.10 show simulated AR(1) processes for four different values of the coefficient φ (equal to −0.9, 0.9, −0.5 and 0.5) and the respective sample ACFs.

Looking at these graphs we can see that a positive coefficient gives a smoother TS than a negative one. Also, the ACFs are very different. We see that if φ is negative, neighbouring observations are negatively correlated, while observations two time points apart are positively correlated. In fact, if φ is negative, neighbouring TS values typically have opposite signs; this is more evident the closer φ is to −1, as explored in the sketch below.
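The sign-alternation effect can be quantified: for negative φ well over half of the neighbouring pairs (x_{t−1}, x_t) should have opposite signs, for positive φ well under half. A quick empirical check in Python (numpy assumed; frac_sign_changes is our own helper):

    import numpy as np

    rng = np.random.default_rng(1)

    def frac_sign_changes(phi, n=10000):
        # fraction of neighbouring pairs (x_{t-1}, x_t) with opposite signs
        z = rng.standard_normal(n)
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi * x[t - 1] + z[t]
        return np.mean(x[1:] * x[:-1] < 0)

    for phi in (-0.9, -0.5, 0.5, 0.9):
        print(phi, frac_sign_changes(phi))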
Consider now the model with φ = 1, that is,

X_t = X_{t−1} + Z_t,    (4.30)

where Z_t is a white noise variable with zero mean and constant variance σ². The model has the same form as the AR(1) process but, since φ = 1, it is not stationary. Such a process is called a Random Walk.
Iterating backwards, as before, we obtain

X_t = X_{t−1} + Z_t
    = X_{t−2} + Z_{t−1} + Z_t
    = X_{t−3} + Z_{t−2} + Z_{t−1} + Z_t
    = ...
    = X_0 + Σ_{j=0}^{t−1} Z_{t−j}.
If the initial value, X_0, is fixed, then the mean value of X_t is equal to X_0, that is

E X_t = E[ X_0 + Σ_{j=0}^{t−1} Z_{t−j} ] = X_0.
Figure 4.11: A simulated random walk.

So, the mean is constant, but as we see below, the variance and the covariance depend on time, not just on the lag. The white noise variables Z_t are uncorrelated, hence we obtain

var(X_t) = var( X_0 + Σ_{j=0}^{t−1} Z_{t−j} )
         = var( Σ_{j=0}^{t−1} Z_{t−j} )
         = Σ_{j=0}^{t−1} var(Z_{t−j}) = tσ²
and

cov(X_t, X_{t+τ}) = cov( Σ_{j=0}^{t−1} Z_{t−j} , Σ_{k=0}^{t+τ−1} Z_{t+τ−k} )
                  = E[ ( Σ_{j=0}^{t−1} Z_{t−j} ) ( Σ_{k=0}^{t+τ−1} Z_{t+τ−k} ) ]
                  = σ² min(t, t + τ),

since the two sums have min(t, t + τ) white noise terms in common.
A simulated series of this form is shown in Figure 4.11.
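The formulas var(X_t) = tσ² and cov(X_t, X_{t+τ}) = σ² min(t, t + τ) can be verified by Monte Carlo over many independent paths; a sketch in Python (numpy assumed; the sample sizes and time points are arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)
    reps, t_max, sigma = 20000, 50, 1.0

    # each row is one random walk path X_1, ..., X_50 started at X_0 = 0
    paths = np.cumsum(rng.normal(0.0, sigma, (reps, t_max)), axis=1)

    for t in (10, 25, 50):
        print(t, round(paths[:, t - 1].var(), 2), t * sigma**2)

    # cov(X_10, X_25) should be close to sigma^2 * min(10, 25) = 10
    print(round(np.cov(paths[:, 9], paths[:, 24])[0, 1], 2))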
Figure 4.12: (a) Differenced Random Walk ∇x_t and (b) its sample ACF.
As we can see, the random walk meanders away from its starting value in no particular direction. It does not exhibit any clear trend but, at the same time, it is not stationary.
However, the first difference of the random walk is stationary, as it is just white noise, namely

∇X_t = X_t − X_{t−1} = Z_t.
The differenced random walk and its sample ACF are shown in Figure 4.12.
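In code the relationship is immediate: differencing undoes the cumulative summation that builds the walk. A two-line check in Python (numpy assumed):

    import numpy as np

    rng = np.random.default_rng(3)
    z = rng.standard_normal(500)
    x = np.cumsum(z)                # random walk X_t = X_{t-1} + Z_t with X_0 = 0
    dx = np.diff(x)                 # the first difference X_t - X_{t-1}
    print(np.allclose(dx, z[1:]))   # True: differencing recovers the white noise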
What happens when |φ| > 1? Then the sum Σ_{j=0}^{∞} φ^j Z_{t−j} does not converge, so the backward substitution no longer yields a stationary solution. However, we can rewrite the AR(1) equation (4.23), solving for X_t in terms of the future, as

X_t = φ^{−1} X_{t+1} − φ^{−1} Z_{t+1}.
Figure 4.13: A simulated AR(1) series x_t = 1.02x_{t−1} + z_t.
Iterating this equation forwards we obtain

X_t = φ^{−1} X_{t+1} − φ^{−1} Z_{t+1}
    = φ^{−1}(φ^{−1} X_{t+2} − φ^{−1} Z_{t+2}) − φ^{−1} Z_{t+1}
    = φ^{−2} X_{t+2} − φ^{−2} Z_{t+2} − φ^{−1} Z_{t+1}
    = ...
    = φ^{−k} X_{t+k} − Σ_{j=1}^{k} φ^{−j} Z_{t+j}

and, letting k → ∞ (the term φ^{−k} X_{t+k} vanishes since |φ| > 1),

X_t = − Σ_{j=1}^{∞} φ^{−j} Z_{t+j},

which is a future-dependent stationary TS. This, however, does not have any practical meaning, because it requires knowledge of the future to predict the present. When a process does not depend on the future, such as AR(1) with |φ| < 1, we say that it is causal.
Figure 4.13 shows a simulated series x_t = 1.02x_{t−1} + z_t. As we can see, the values of the time series quickly become large in magnitude, even for φ just slightly above 1. Such a process is called explosive. It is not a causal TS; a small simulation illustrating the growth follows.
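A simulation along the lines of Figure 4.13 shows the blow-up directly: with φ = 1.02 the magnitude of x_t grows roughly geometrically, at rate about 1.02^t. A sketch in Python (numpy assumed; the seed and the inspected time points are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    phi, n = 1.02, 200

    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()

    print(np.abs(x[[50, 100, 150, 199]]))   # magnitudes grow quickly with t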