In this chapter we introduce an important parametric family of stationary time series, the autoregressive moving-average, or ARMA, process.

3.1 ARMA(p, q) processes

Definition 3.1.1. $\{X_t\}$ is an ARMA(p, q) process if $\{X_t\}$ is stationary and if for every $t$,

$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q},$$

where $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$ and the polynomials $(1 - \phi_1 z - \cdots - \phi_p z^p)$ and $(1 + \theta_1 z + \cdots + \theta_q z^q)$ have no common factors.

$\{X_t\}$ is an ARMA(p, q) process with mean $\mu$ if $\{X_t - \mu\}$ is an ARMA(p, q) process. We can rewrite the ARMA(p, q) model as $\phi(B) X_t = \theta(B) Z_t$, where $\phi(\cdot)$ and $\theta(\cdot)$ are the $p$th- and $q$th-degree polynomials $\phi(z) = 1 - \phi_1 z - \cdots - \phi_p z^p$ and $\theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$, and $B$ is the backward shift operator ($B^j X_t = X_{t-j}$). The time series $\{X_t\}$ is called a moving-average process of order $q$ (or MA(q)) if $\phi(z) \equiv 1$.
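The difference-equation form lends itself directly to simulation. Below is a minimal sketch in Python (not from the text; the coefficient values are my own illustration) that generates a path of an ARMA(1,1) process from the defining equation:

```python
import numpy as np

# Sketch: simulate a path of the ARMA(1,1) difference equation
#   X_t - phi1*X_{t-1} = Z_t + theta1*Z_{t-1},  {Z_t} ~ WN(0, sigma^2).
rng = np.random.default_rng(0)
phi1, theta1, sigma, n = 0.5, 0.4, 1.0, 200   # illustrative values
z = rng.normal(0.0, sigma, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi1 * x[t - 1] + z[t] + theta1 * z[t - 1]
```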
$\{X_t\}$ is a causal function of $\{Z_t\}$ (briefly, causal) if there exist constants $\{\psi_j\}$ with $\sum_{j=0}^{\infty} |\psi_j| < \infty$ such that

$$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j} \quad \text{for all } t.$$

Causality is equivalent to the condition $\phi(z) \neq 0$ for all $|z| \le 1$. The sequence $\{\psi_j\}$ is determined by the relation $\sum_{j=0}^{\infty} \psi_j z^j = \theta(z)/\phi(z)$, or equivalently by

$$\psi_j - \sum_{k=1}^{p} \phi_k \psi_{j-k} = \theta_j, \quad j = 0, 1, 2, \ldots,$$

where $\theta_0 := 1$, $\theta_j := 0$ for $j > q$, and $\psi_j := 0$ for $j < 0$.
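The $\psi$-recursion above is straightforward to implement. Here is a minimal sketch (the helper name `psi_weights` is my own; the coefficients match the ARMA(1,1) example further below):

```python
# Sketch: psi-weights of a causal ARMA(p, q) via the recursion
#   psi_j - sum_{k=1}^{p} phi_k * psi_{j-k} = theta_j,  theta_0 = 1.
def psi_weights(phi, theta, n):
    """Return psi_0, ..., psi_{n-1} for AR coefficients phi and MA coefficients theta."""
    p, q = len(phi), len(theta)
    psi = []
    for j in range(n):
        # theta_j, with theta_0 = 1 and theta_j = 0 for j > q
        val = 1.0 if j == 0 else (theta[j - 1] if j <= q else 0.0)
        # add sum_{k=1}^{p} phi_k * psi_{j-k}, treating psi_j = 0 for j < 0
        for k in range(1, p + 1):
            if j - k >= 0:
                val += phi[k - 1] * psi[j - k]
        psi.append(val)
    return psi

# ARMA(1,1) with phi = 0.5, theta = 0.4: psi_j = (phi + theta)*phi**(j-1) for j >= 1
print(psi_weights([0.5], [0.4], 5))  # [1.0, 0.9, 0.45, 0.225, 0.1125]
```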
An ARMA(p, q) process $\{X_t\}$ is invertible if there exist constants $\{\pi_j\}$ with $\sum_{j=0}^{\infty} |\pi_j| < \infty$ such that $Z_t = \sum_{j=0}^{\infty} \pi_j X_{t-j}$ for all $t$. Invertibility is equivalent to the condition $\theta(z) \neq 0$ for all $|z| \le 1$. The sequence $\{\pi_j\}$ is determined by

$$\pi_j + \sum_{k=1}^{q} \theta_k \pi_{j-k} = -\phi_j, \quad j = 0, 1, 2, \ldots,$$

where $\phi_0 := -1$, $\phi_j := 0$ for $j > p$, and $\pi_j := 0$ for $j < 0$.
Example 3.1.1. An ARMA(1,1) process. Consider the process defined by the equations $X_t - 0.5 X_{t-1} = Z_t + 0.4 Z_{t-1}$, $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. The AR polynomial $\phi(z) = 1 - 0.5z$ has a zero at $z = 2$, which lies outside the unit circle. There exists a unique ARMA process satisfying these equations, and it is also causal.

Example 3.1.2. An AR(2) process. Let $\{X_t\}$ be the AR(2) process $\phi(B) X_t = Z_t$. Since the zeros of the AR polynomial $\phi(z)$ lie outside the unit circle, we conclude that $\{X_t\}$ is a causal AR(2) process.

Example 3.1.3. An ARMA(2,1) process. Consider the ARMA(2,1) process defined by the equations $X_t - 0.75 X_{t-1} + 0.5625 X_{t-2} = Z_t + 1.25 Z_{t-1}$, $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$. The AR polynomial $\phi(z) = 1 - 0.75z + 0.5625 z^2$ has zeros $z = \tfrac{2}{3}(1 \pm i\sqrt{3})$, which lie outside the unit circle; the process is therefore causal. On the other hand, the MA polynomial $\theta(z) = 1 + 1.25z$ has a zero at $z = -0.8$, which lies inside the unit circle, and hence $\{X_t\}$ is not invertible.
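As a quick numerical check (a sketch, not part of the text), the zeros of $\phi(z)$ and $\theta(z)$ in Example 3.1.3 can be found with numpy; causality requires all zeros of $\phi$ outside the unit circle, invertibility all zeros of $\theta$ outside it:

```python
import numpy as np

# numpy.roots takes coefficients from highest degree down, so
# phi(z) = 1 - 0.75 z + 0.5625 z^2 is entered as [0.5625, -0.75, 1].
phi_zeros = np.roots([0.5625, -0.75, 1.0])   # AR polynomial of Example 3.1.3
theta_zeros = np.roots([1.25, 1.0])          # MA polynomial of Example 3.1.3

print(phi_zeros, np.abs(phi_zeros))   # (2/3)(1 +/- i sqrt(3)), modulus 4/3 > 1: causal
print(theta_zeros)                    # -0.8, inside the unit circle: not invertible
```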
3.2 The ACF and PACF of an ARMA(p, q) process

In this section we discuss three methods for computing the autocovariance function $\gamma(\cdot)$ of a causal ARMA process $\{X_t\}$. The autocorrelation function (ACF) and the partial autocorrelation function (PACF) are then found from $\gamma(\cdot)$.

3.2.1 Calculation of the ACVF

First method. For the causal ARMA(p, q) process

$$\phi(B) X_t = \theta(B) Z_t, \quad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2),$$

we have $X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}$, where $\sum_{j=0}^{\infty} \psi_j z^j = \theta(z)/\phi(z)$, $|z| \le 1$. Hence

$$\gamma(h) = E(X_{t+h} X_t) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+h}.$$
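Since the $\psi$-weights of a causal process decay geometrically, $\gamma(h)$ can be approximated by truncating the infinite sum. A short sketch, assuming the `psi_weights` helper sketched earlier:

```python
# Sketch: gamma(h) = sigma^2 * sum_j psi_j * psi_{j+h}, truncated at N terms.
def acvf(phi, theta, sigma2, h, N=200):
    psi = psi_weights(phi, theta, N + h)   # helper defined above
    return sigma2 * sum(psi[j] * psi[j + h] for j in range(N))

print(acvf([0.5], [0.4], 1.0, 0))  # gamma(0) = 1 + 0.9**2/(1 - 0.25) = 2.08
```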
Example 3.2.1. The ARMA(1,1) process. Let $\{X_t\}$ satisfy $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$, $\{Z_t\} \sim \mathrm{WN}(0, \sigma^2)$, with $|\phi| < 1$. The causal representation is

$$X_t = Z_t + (\phi + \theta) \sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}.$$

Hence

$$\gamma(0) = \sigma^2 \sum_{j=0}^{\infty} \psi_j^2 = \sigma^2 \left[ 1 + (\phi + \theta)^2 \sum_{j=0}^{\infty} \phi^{2j} \right] = \sigma^2 \left[ 1 + \frac{(\phi + \theta)^2}{1 - \phi^2} \right].$$

For an MA(q) process we have

$$\gamma(h) = \begin{cases} \sigma^2 \sum_{j=0}^{q-|h|} \theta_j \theta_{j+|h|}, & \text{if } |h| \le q, \\ 0, & \text{if } |h| > q, \end{cases}$$

where $\theta_0 = 1$.
Second method. If we multiply each side of the equation

$$X_t - \phi_1 X_{t-1} - \cdots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \cdots + \theta_q Z_{t-q}$$

by $X_{t-k}$, $k = 0, 1, 2, \ldots$, and take expectations on each side, we find that

$$\gamma(k) - \phi_1 \gamma(k-1) - \cdots - \phi_p \gamma(k-p) = \sigma^2 \sum_{j=0}^{\infty} \theta_{k+j} \psi_j, \quad 0 \le k < m,$$

and

$$\gamma(k) - \phi_1 \gamma(k-1) - \cdots - \phi_p \gamma(k-p) = 0, \quad k \ge m,$$

where $m = \max(p, q+1)$, $\psi_j := 0$ for $j < 0$, $\theta_0 := 1$, and $\theta_j := 0$ for $j \notin \{0, \ldots, q\}$. It can be shown that $\gamma(h)$ has the form

$$\gamma(h) = \alpha_1 \xi_1^{-h} + \alpha_2 \xi_2^{-h} + \cdots + \alpha_p \xi_p^{-h}, \quad h \ge m - p,$$

where $\xi_1, \ldots, \xi_p$ are the (assumed distinct) roots of $\phi(z) = 0$ and $\alpha_1, \ldots, \alpha_p$ are constants determined by the boundary conditions.
For example, let $\{X_t\}$ be the causal ARMA(1,1) process of Example 3.2.1. The equations above give

$$\gamma(0) - \phi \gamma(1) = \sigma^2 (1 + \theta(\theta + \phi)), \qquad \gamma(1) - \phi \gamma(0) = \sigma^2 \theta,$$

and $\gamma(k) = \phi \gamma(k-1)$, $k \ge 2$. Solving the recursion gives $\gamma(h) = \phi^{h-1} \gamma(1)$, $h \ge 1$. The autocorrelation function is then computed as $\rho(h) = \gamma(h)/\gamma(0)$.
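For the ARMA(1,1) case, the second-method equations reduce to a $2 \times 2$ linear system. A minimal check (the parameter values $\phi = 0.5$, $\theta = 0.4$, $\sigma^2 = 1$ are my own illustration, not from the text):

```python
import numpy as np

phi, theta, sigma2 = 0.5, 0.4, 1.0   # illustrative ARMA(1,1) parameters

# Solve the two linear equations from the second method:
#   gamma(0) - phi*gamma(1) = sigma2*(1 + theta*(theta + phi))
#   gamma(1) - phi*gamma(0) = sigma2*theta
A = np.array([[1.0, -phi], [-phi, 1.0]])
b = np.array([sigma2 * (1 + theta * (theta + phi)), sigma2 * theta])
g0, g1 = np.linalg.solve(A, b)
print(g0, g1)            # gamma(0) = 2.08, gamma(1) = 1.44
print(g1 / g0)           # rho(1); rho(h) = phi**(h-1) * rho(1) for h >= 1
```

The value $\gamma(0) = 2.08$ agrees with the truncated $\psi$-weight sum sketched earlier.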
The partial autocorrelation function (PACF) of an ARMA process $\{X_t\}$ is the function $\alpha(\cdot)$ defined by the equations $\alpha(0) = 1$ and $\alpha(h) = \phi_{hh}$, $h \ge 1$, where $\phi_{hh}$ is the last component of

$$\boldsymbol{\phi}_h = \Gamma_h^{-1} \boldsymbol{\gamma}_h,$$

with $\Gamma_h = [\gamma(i - j)]_{i,j=1}^{h}$ and $\boldsymbol{\gamma}_h = [\gamma(1), \gamma(2), \ldots, \gamma(h)]'$.

For an AR(p) process with $h > p$, the best linear predictor of $X_{h+1}$ in terms of $X_1, \ldots, X_h$ is $\hat{X}_{h+1} = \phi_1 X_h + \cdots + \phi_p X_{h+1-p}$, and consequently $\alpha(h) = 0$ for $h > p$.
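The defining equations translate directly into a linear solve. Here is a sketch that computes $\alpha(h)$ as the last component of $\Gamma_h^{-1} \boldsymbol{\gamma}_h$ (the helper name `pacf_at_lag` is my own; the check anticipates the MA(1) formula of Example 3.2.7 below):

```python
import numpy as np

# Sketch: alpha(h) as the last component of Gamma_h^{-1} gamma_h,
# where `gamma` is any function returning the ACVF at a given lag
# (e.g. the `acvf` helper sketched earlier).
def pacf_at_lag(gamma, h):
    Gamma = np.array([[gamma(abs(i - j)) for j in range(h)] for i in range(h)])
    g = np.array([gamma(k) for k in range(1, h + 1)])
    return np.linalg.solve(Gamma, g)[-1]   # phi_hh

# MA(1) with theta = 0.4, sigma^2 = 1: gamma(0) = 1.16, gamma(1) = 0.4, else 0
g = lambda k: {0: 1.16, 1: 0.4}.get(k, 0.0)
print(pacf_at_lag(g, 2))   # -0.16/1.1856, i.e. -(-theta)^2/(1 + theta^2 + theta^4)
```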
Example 3.2.7. The PACF of an MA(1) process. For the MA(1) process, the PACF at lag $h$ is

$$\alpha(h) = \phi_{hh} = -\frac{(-\theta)^h}{1 + \theta^2 + \cdots + \theta^{2h}}.$$

The sample PACF $\hat{\alpha}(h)$ is computed from the observations $\{x_1, x_2, \ldots, x_n\}$ by replacing $\gamma(\cdot)$ with the sample autocovariance function. In particular, if the sample PACF $\hat{\alpha}(h)$ is significantly different from zero for $0 \le h \le p$ and negligible for $h > p$, then this suggests that an AR(p) model might provide a good representation of the data. Here we use the fact that for an AR(p) process the sample PACF values at lags greater than $p$ are approximately independent $N(0, 1/n)$ random variables.
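In practice the sample PACF and the resulting $\pm 1.96/\sqrt{n}$ bounds can be obtained, for instance, with statsmodels. A sketch with simulated data (the AR(2) coefficients are my own illustration, not the text's data):

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

# Simulate a causal AR(2) path (illustrative coefficients)
rng = np.random.default_rng(42)
n = 400
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.7 * x[t - 1] - 0.1 * x[t - 2] + z[t]

alpha_hat = pacf(x, nlags=10)        # sample PACF at lags 0..10
bound = 1.96 / np.sqrt(n)            # approximate 95% bounds for lags > p
print(np.round(alpha_hat, 2))
print("bound:", round(bound, 3))     # values beyond lag 2 should mostly fall inside
```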
[Figure: time plot of the series, t = 0 to 50.]
Let
$y_t$ = the measured amount of fuel in the tank at the end of the $t$-th day;
$a_t$ = the measured amount sold minus the amount delivered during the $t$-th day.
[Figure: two panels showing the sample ACF and PACF of the series, lags 0 to 40.]
3.3 Forecasting ARMA processes

Consider the causal ARMA(p, q) process

$$\phi(B) X_t = \theta(B) Z_t, \quad \{Z_t\} \sim \mathrm{WN}(0, \sigma^2).$$

Transform $\{X_t\}$ to $\{W_t\}$, where

$$W_t = \sigma^{-1} X_t, \quad t = 1, 2, \ldots, m, \qquad W_t = \sigma^{-1} \phi(B) X_t, \quad t > m,$$

and $m = \max(p, q)$. The coefficients $\theta_{nj}$ below are found by applying the innovations algorithm to the autocovariances of $\{W_t\}$. The one-step prediction errors of the two series are related by

$$X_t - \hat{X}_t = \sigma (W_t - \hat{W}_t) \quad \text{for all } t \ge 1.$$
So,

$$P_n W_{n+h} = \sum_{j=h}^{n+h-1} \theta_{n+h-1,j} (W_{n+h-j} - \hat{W}_{n+h-j}) = \sigma^{-1} \sum_{j=h}^{n+h-1} \theta_{n+h-1,j} (X_{n+h-j} - \hat{X}_{n+h-j}).$$

Hence,

$$P_n X_{n+h} = \begin{cases} \displaystyle\sum_{j=h}^{n+h-1} \theta_{n+h-1,j} (X_{n+h-j} - \hat{X}_{n+h-j}), & 1 \le h \le m - n, \\[2ex] \displaystyle\sum_{i=1}^{p} \phi_i P_n X_{n+h-i} + \sum_{j=h}^{n+h-1} \theta_{n+h-1,j} (X_{n+h-j} - \hat{X}_{n+h-j}), & h > m - n. \end{cases}$$

If, as is almost always the case, $n > m = \max(p, q)$, then for all $h \ge 1$,

$$P_n X_{n+h} = \sum_{i=1}^{p} \phi_i P_n X_{n+h-i} + \sum_{j=h}^{n+h-1} \theta_{n+h-1,j} (X_{n+h-j} - \hat{X}_{n+h-j}).$$
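As a practical illustration (a sketch only: the data are simulated, and statsmodels' state-space machinery is used in place of the hand recursion above), $h$-step forecasts $P_n X_{n+h}$ of a fitted ARMA(1,1) model can be obtained as follows:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Simulate the ARMA(1,1) model X_t - 0.5 X_{t-1} = Z_t + 0.4 Z_{t-1}
rng = np.random.default_rng(1)
n = 500
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + z[t] + 0.4 * z[t - 1]

# Fit an ARMA(1,1) (ARIMA with d = 0, no trend) and forecast h = 1, ..., 10
fit = ARIMA(x, order=(1, 0, 1), trend="n").fit()
print(fit.forecast(steps=10))   # P_n X_{n+1}, ..., P_n X_{n+10}
```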