
Chapter 3 ARMA Models

In this chapter we introduce an important parametric family of stationary series: the autoregressive moving-average, or ARMA, process.

3.1 ARMA(p, q) process

Definition 3.1.1 $\{X_t\}$ is an ARMA(p, q) process if $\{X_t\}$ is stationary and if for every $t$,

$$X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q},$$

where $\{Z_t\} \sim WN(0, \sigma^2)$ and the polynomials $(1 - \phi_1 z - \dots - \phi_p z^p)$ and $(1 + \theta_1 z + \dots + \theta_q z^q)$ have no common factors. The process $\{X_t\}$ is said to be an ARMA(p, q) process with mean $\mu$ if $\{X_t - \mu\}$ is an ARMA(p, q) process.

We can rewrite the ARMA(p, q) model as $\phi(B)X_t = \theta(B)Z_t$, where $\phi(\cdot)$ and $\theta(\cdot)$ are the pth- and qth-degree polynomials

$$\phi(z) = 1 - \phi_1 z - \dots - \phi_p z^p \quad \text{and} \quad \theta(z) = 1 + \theta_1 z + \dots + \theta_q z^q,$$

and $B$ is the backward shift operator ($B^j X_t = X_{t-j}$). The time series $\{X_t\}$ is called a moving-average process of order q (or MA(q)) if $\phi(z) \equiv 1$, and an autoregressive process of order p (or AR(p)) if $\theta(z) \equiv 1$.

Existence and Uniqueness

A stationary solution $\{X_t\}$ of the equations

$$X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q}$$

exists (and is also the unique stationary solution) if and only if

$$\phi(z) = 1 - \phi_1 z - \dots - \phi_p z^p \neq 0 \quad \text{for all } |z| = 1.$$


Causality

An ARMA(p, q) process $\{X_t\}$ is causal, or a causal function of $\{Z_t\}$, if there exist constants $\{\psi_j\}$ such that $\sum_{j=0}^{\infty} |\psi_j| < \infty$ and

$$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j} \quad \text{for all } t.$$

Causality is equivalent to the condition

$$\phi(z) = 1 - \phi_1 z - \dots - \phi_p z^p \neq 0 \quad \text{for all } |z| \leq 1.$$

The sequence $\{\psi_j\}$ is determined by the relation

$$(1 - \phi_1 z - \dots - \phi_p z^p)(\psi_0 + \psi_1 z + \dots) = 1 + \theta_1 z + \dots + \theta_q z^q.$$

It is equivalent to

$$\psi_j - \sum_{k=1}^{p} \phi_k \psi_{j-k} = \theta_j, \quad j = 0, 1, 2, \dots,$$

where $\theta_0 := 1$, $\theta_j := 0$ for $j > q$, and $\psi_j = 0$ for $j < 0$.
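As an illustration of this recursion, here is a short Python sketch (my own illustration, not from the notes; the function name is arbitrary, and phi, theta stand for the coefficient lists $[\phi_1, \dots, \phi_p]$ and $[\theta_1, \dots, \theta_q]$):

    import numpy as np

    def psi_weights(phi, theta, n):
        """First n+1 coefficients psi_0, ..., psi_n of theta(z)/phi(z), from
        psi_j - sum_k phi_k psi_{j-k} = theta_j (theta_0 = 1, theta_j = 0, j > q)."""
        psi = np.zeros(n + 1)
        for j in range(n + 1):
            th = 1.0 if j == 0 else (theta[j - 1] if j <= len(theta) else 0.0)
            psi[j] = th + sum(phi[k - 1] * psi[j - k]
                              for k in range(1, min(j, len(phi)) + 1))
        return psi

    # ARMA(1,1) with phi = 0.5, theta = 0.4: psi_0 = 1, psi_j = 0.9 * 0.5^(j-1)
    print(psi_weights([0.5], [0.4], 4))  # [1.0, 0.9, 0.45, 0.225, 0.1125]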

Invertibility

An ARMA(p, q) process $\{X_t\}$ is invertible if there exist constants $\{\pi_j\}$ such that $\sum_{j=0}^{\infty} |\pi_j| < \infty$ and

$$Z_t = \sum_{j=0}^{\infty} \pi_j X_{t-j} \quad \text{for all } t.$$

Invertibility is equivalent to the condition

$$\theta(z) = 1 + \theta_1 z + \theta_2 z^2 + \dots + \theta_q z^q \neq 0 \quad \text{for all } |z| \leq 1.$$

The sequence $\{\pi_j\}$ is determined by the relation

$$(1 + \theta_1 z + \dots + \theta_q z^q)(\pi_0 + \pi_1 z + \dots) = 1 - \phi_1 z - \dots - \phi_p z^p.$$

It is equivalent to

$$\pi_j + \sum_{k=1}^{q} \theta_k \pi_{j-k} = -\phi_j, \quad j = 0, 1, 2, \dots,$$

where $\phi_0 := -1$, $\phi_j := 0$ for $j > p$, and $\pi_j = 0$ for $j < 0$.

Example 3.1.1 An ARMA(1,1) process

Consider the ARMA(1,1) process $\{X_t\}$ satisfying the equations

$$X_t - 0.5 X_{t-1} = Z_t + 0.4 Z_{t-1}, \quad \{Z_t\} \sim WN(0, \sigma^2).$$

The AR polynomial $\phi(z) = 1 - 0.5z$ has a zero at $z = 2$, which lies outside the unit circle. There exists a unique stationary ARMA process satisfying these equations, and it is also causal.

Example 3.1.2 An AR(2) process

Let $\{X_t\}$ be the AR(2) process

$$X_t = 0.7 X_{t-1} - 0.1 X_{t-2} + Z_t, \quad \{Z_t\} \sim WN(0, \sigma^2).$$

Here $\phi(z) = 1 - 0.7z + 0.1z^2 = (1 - 0.5z)(1 - 0.2z)$, with zeros at $z = 2$ and $z = 5$. Since these zeros lie outside the unit circle, $\{X_t\}$ is a causal AR(2) process.

Example 3.1.3 An ARMA(2,1) process

Consider the ARMA(2,1) process defined by the equations

$$X_t - 0.75 X_{t-1} + 0.5625 X_{t-2} = Z_t + 1.25 Z_{t-1}, \quad \{Z_t\} \sim WN(0, \sigma^2).$$

The AR polynomial $\phi(z) = 1 - 0.75z + 0.5625z^2$ has zeros $z = \frac{2}{3}(1 \pm i\sqrt{3})$, which lie outside the unit circle; the process is therefore causal. On the other hand, the MA polynomial $\theta(z) = 1 + 1.25z$ has a zero at $z = -0.8$, which lies inside the unit circle, and hence $\{X_t\}$ is not invertible.
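The causality and invertibility checks in these examples amount to locating polynomial zeros, which is easy to do numerically. A minimal sketch for Example 3.1.3 (np.roots expects coefficients from the highest power down):

    import numpy as np

    # phi(z) = 1 - 0.75 z + 0.5625 z^2,  theta(z) = 1 + 1.25 z
    ar_zeros = np.roots([0.5625, -0.75, 1.0])
    ma_zeros = np.roots([1.25, 1.0])
    print(np.abs(ar_zeros))  # [1.333 1.333]  all > 1  -> causal
    print(np.abs(ma_zeros))  # [0.8]          < 1      -> not invertible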

3.2 The ACF and PACF of an ARMA(p, q) process

In this section we discuss three methods for computing the autocovariance function $\gamma(\cdot)$ of a causal ARMA process $\{X_t\}$. The autocorrelation function (ACF) and partial autocorrelation function (PACF) are then found from $\gamma(\cdot)$.
3.2.1 Calculation of the ACVF

Let $\{X_t\}$ be a causal ARMA(p, q) process satisfying

$$\phi(B)X_t = \theta(B)Z_t, \quad \{Z_t\} \sim WN(0, \sigma^2),$$

where $\phi(z) = 1 - \phi_1 z - \dots - \phi_p z^p$ and $\theta(z) = 1 + \theta_1 z + \dots + \theta_q z^q$. The causality assumption implies that

$$X_t = \sum_{j=0}^{\infty} \psi_j Z_{t-j}, \quad \text{where} \quad \sum_{j=0}^{\infty} \psi_j z^j = \theta(z)/\phi(z), \quad |z| \leq 1.$$

First method

From Proposition 2.2.1 we obtain

$$\gamma(h) = E(X_{t+h} X_t) = \sigma^2 \sum_{j=0}^{\infty} \psi_j \psi_{j+h}.$$

Example 3.2.1 The ARMA(1,1) process

Let $\{X_t\}$ satisfy $X_t - \phi X_{t-1} = Z_t + \theta Z_{t-1}$, $\{Z_t\} \sim WN(0, \sigma^2)$, with $|\phi| < 1$. Then

$$X_t = Z_t + (\phi + \theta)\sum_{j=1}^{\infty} \phi^{j-1} Z_{t-j}.$$

Hence

$$\gamma(0) = \sigma^2\sum_{j=0}^{\infty}\psi_j^2 = \sigma^2\left[1 + \frac{(\phi + \theta)^2}{1 - \phi^2}\right], \qquad \gamma(1) = \sigma^2\sum_{j=0}^{\infty}\psi_{j+1}\psi_j = \sigma^2\left[\phi + \theta + \frac{(\phi + \theta)^2\phi}{1 - \phi^2}\right],$$

and $\gamma(h) = \phi^{h-1}\gamma(1)$ for $h \geq 2$.

Example 3.2.2 The MA(q) process

Let $\{X_t\}$ satisfy $X_t = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q}$, $\{Z_t\} \sim WN(0, \sigma^2)$. We have

$$\gamma(h) = \begin{cases} \sigma^2\sum_{j=0}^{q-|h|}\theta_j\theta_{j+|h|}, & |h| \leq q, \\ 0, & |h| > q, \end{cases}$$

where $\theta_0 = 1$.
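To check such closed forms numerically, one can truncate the series $\gamma(h) = \sigma^2\sum_j \psi_j\psi_{j+h}$; a small sketch reusing the psi_weights helper defined earlier (N is an arbitrary truncation point, large enough that the $\psi_j$ have decayed):

    def acvf(phi, theta, sigma2, h, N=200):
        """Truncated gamma(h) = sigma^2 * sum_{j=0}^{N} psi_j * psi_{j+h}."""
        psi = psi_weights(phi, theta, N + h)
        return sigma2 * np.dot(psi[:N + 1], psi[h:N + h + 1])

    # Example 3.2.1 with phi = 0.5, theta = 0.4, sigma^2 = 1:
    print(acvf([0.5], [0.4], 1.0, 0))  # 2.08 = 1 + 0.9^2 / 0.75
    print(acvf([0.5], [0.4], 1.0, 1))  # 1.44 = 0.9 + 0.9^2 * 0.5 / 0.75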
Second method

If we multiply each side of the equations

$$X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = Z_t + \theta_1 Z_{t-1} + \dots + \theta_q Z_{t-q}$$

by $X_{t-k}$, $k = 0, 1, 2, \dots$, and take expectations on each side, we find that

$$\gamma(k) - \phi_1\gamma(k-1) - \dots - \phi_p\gamma(k-p) = \sigma^2\sum_{j=0}^{\infty}\theta_{k+j}\psi_j, \quad 0 \leq k < m,$$

and

$$\gamma(k) - \phi_1\gamma(k-1) - \dots - \phi_p\gamma(k-p) = 0, \quad k \geq m,$$

where $m = \max(p, q + 1)$, $\psi_j := 0$ for $j < 0$, $\theta_0 := 1$, and $\theta_j := 0$ for $j \notin \{0, \dots, q\}$.

It can be shown that $\gamma(h)$ is of the form

$$\gamma(h) = \alpha_1\xi_1^{-h} + \alpha_2\xi_2^{-h} + \dots + \alpha_p\xi_p^{-h}, \quad h \geq m - p,$$

where $\xi_1, \dots, \xi_p$ are the (distinct) roots of $\phi(z) = 0$ and $\alpha_1, \dots, \alpha_p$ are constants determined by the equations above.

Example 3.2.3 The ARMA(1,1) process

Let $\{X_t\}$ be the causal ARMA(1,1) process of Example 3.2.1. The second method gives

$$\gamma(0) - \phi\gamma(1) = \sigma^2(1 + \theta(\phi + \theta)), \qquad \gamma(1) - \phi\gamma(0) = \sigma^2\theta,$$

and

$$\gamma(k) = \phi\gamma(k-1), \quad k \geq 2.$$

So $\gamma(h) = \phi^{h-1}\gamma(1)$, $h \geq 1$.
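The first two equations form a small linear system that can be solved directly; a sketch (continuing the earlier snippets, with the same example values):

    phi_, theta_, sigma2_ = 0.5, 0.4, 1.0
    A = np.array([[1.0, -phi_], [-phi_, 1.0]])
    b = sigma2_ * np.array([1 + theta_ * (phi_ + theta_), theta_])
    g0, g1 = np.linalg.solve(A, b)
    print(g0, g1)  # 2.08, 1.44 -- agrees with the first method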

3.2.2 The autocorrelation function

For any set of observations $\{x_1, x_2, \dots, x_n\}$, the sample ACF $\hat\rho(\cdot)$ is computed as

$$\hat\rho(h) = \frac{\hat\gamma(h)}{\hat\gamma(0)}.$$
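As a concrete sketch (illustrative helper names; $\hat\gamma$ is taken to be the usual mean-corrected, divisor-n estimator):

    def sample_acvf(x, h):
        """gamma_hat(h) = (1/n) sum_{t=1}^{n-h} (x_{t+h} - xbar)(x_t - xbar)."""
        x = np.asarray(x, dtype=float)
        n, xbar = len(x), x.mean()
        return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

    def sample_acf(x, h):
        return sample_acvf(x, h) / sample_acvf(x, 0)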

The sample ACF of an MA(q) model

If the sample ACF $\hat\rho(h)$ is significantly different from zero for $0 \leq h \leq q$ and negligible for $h > q$, an MA(q) model might provide a good representation of the data. In order to apply this criterion, we need to take into account the random variation expected in the sample ACF before we can classify its values as negligible. To resolve this problem we can use Bartlett's formula, which implies that for a large sample of size n from an MA(q) process, the sample ACF values at lags greater than q are approximately normally distributed with mean 0 and variance

$$w_{hh}/n = \big(1 + 2\rho^2(1) + \dots + 2\rho^2(q)\big)/n.$$
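In code, the corresponding approximate 95% bound for judging ACF values at lags beyond q might look like this (my own sketch, reusing sample_acf from above):

    def bartlett_bound(x, q):
        """1.96 * sqrt(w_hh / n), with w_hh = 1 + 2 sum_{j<=q} rho_hat(j)^2:
        sample ACF values at lags h > q outside this band are not negligible."""
        n = len(x)
        w = 1 + 2 * sum(sample_acf(x, j) ** 2 for j in range(1, q + 1))
        return 1.96 * np.sqrt(w / n)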

3.2.3 The partial autocorrelation function

Definition The partial autocorrelation function (PACF) of an ARMA process $\{X_t\}$ is the function $\alpha(\cdot)$ defined by the equations $\alpha(0) = 1$ and $\alpha(h) = \phi_{hh}$, $h \geq 1$, where $\phi_{hh}$ is the last component of

$$\phi_h = \Gamma_h^{-1}\gamma_h, \quad \Gamma_h = \big[\gamma(i - j)\big]_{i,j=1}^{h}, \quad \gamma_h = \big[\gamma(1), \gamma(2), \dots, \gamma(h)\big]'.$$

For any set of observations $\{x_1, x_2, \dots, x_n\}$ with $x_i \neq x_j$ for some $i$ and $j$, the sample PACF $\hat\alpha(h)$ is given by $\hat\alpha(0) = 1$ and $\hat\alpha(h) = \hat\phi_{hh}$, $h \geq 1$, where $\hat\phi_{hh}$ is the last component of $\hat\phi_h = \hat\Gamma_h^{-1}\hat\gamma_h$ (a code sketch follows Example 3.2.7 below).

Example 3.2.6 The PACF of an AR(p) process

For the causal AR(p) process defined by
$$X_t - \phi_1 X_{t-1} - \dots - \phi_p X_{t-p} = Z_t, \quad \{Z_t\} \sim WN(0, \sigma^2),$$

we know that for $h \geq p$ the best linear predictor of $X_{h+1}$ in terms of $1, X_1, \dots, X_h$ is

$$\hat X_{h+1} = \phi_1 X_h + \phi_2 X_{h-1} + \dots + \phi_p X_{h+1-p}.$$

Since the coefficient $\phi_{hh}$ of $X_1$ is $\phi_p$ if $h = p$ and 0 if $h > p$, we have $\alpha(p) = \phi_p$ and $\alpha(h) = 0$ for $h > p$.

Example 3.2.7 The PACF of an MA(1) process

For the MA(1) process $X_t = Z_t + \theta Z_{t-1}$, $\{Z_t\} \sim WN(0, \sigma^2)$, the PACF at lag $h$ is

$$\alpha(h) = \phi_{hh} = -\frac{(-\theta)^h}{1 + \theta^2 + \dots + \theta^{2h}}.$$
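The defining relation (last component of $\hat\Gamma_h^{-1}\hat\gamma_h$) translates directly into code; a minimal sketch reusing sample_acvf from above:

    def sample_pacf(x, h):
        """alpha_hat(h): last component of Gamma_hat_h^{-1} gamma_hat_h."""
        if h == 0:
            return 1.0
        g = np.array([sample_acvf(x, k) for k in range(h + 1)])
        Gamma = np.array([[g[abs(i - j)] for j in range(h)] for i in range(h)])
        return np.linalg.solve(Gamma, g[1:])[-1]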

The sample PACF of an AR(p) series

If $\{X_t\}$ is an AR(p) series, then the sample PACF based on the observations $\{x_1, x_2, \dots, x_n\}$ should reflect the properties of the PACF itself. In particular, if the sample PACF $\hat\alpha(h)$ is significantly different from zero for $0 \leq h \leq p$ and negligible for $h > p$, then an AR(p) model might provide a good representation of the data. To describe what is meant by "negligible", we can use the fact that for an AR(p) process the sample PACF values at lags greater than p are approximately independent $N(0, 1/n)$ random variables.

Example 3.2.8

The time series plotted in the following figure consists of 57 consecutive daily overshorts from an underground gasoline tank at a filling station in Colorado.

[Figure: time series plot of the 57 daily overshorts.]

Let

$y_t$ = the measured amount of fuel in the tank at the end of the $t$-th day,
$a_t$ = the measured amount sold minus the amount delivered during the course of the $t$-th day.

So we can define the overshort for day $t$ as

$$x_t = y_t - y_{t-1} + a_t.$$

The sample ACF suggests the plausibility of an MA(1) model, since for $h \geq 2$ the values $\hat\rho(h)$ are well within the bounds $\pm 1.96\big(1 + 2\hat\rho^2(1)\big)^{1/2}/\sqrt{n}$. The data are therefore compatible with the model

$$X_t = \mu + Z_t + \theta Z_{t-1}.$$

Let $\hat\mu := \bar x_{57} = -4.035$. Using the equations $(1 + \theta^2)\sigma^2 = \hat\gamma(0) = 3415.72$ and $\theta\sigma^2 = \hat\gamma(1) = -1719.95$, we can find the approximate solution $\theta = -1$ and $\sigma^2 = 1708$; we obtain the noninvertible MA(1) model

$$X_t = -4.035 + Z_t - Z_{t-1}.$$
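The moment equations reduce to $\rho(1) = \theta/(1 + \theta^2)$, which is easy to solve numerically; a sketch with the values from this example (a boundary case, since $|\hat\rho(1)| > 1/2$ forces $\theta = \pm 1$):

    g0, g1 = 3415.72, -1719.95
    rho1 = g1 / g0                      # about -0.5035
    disc = 1 - 4 * rho1 ** 2            # negative here since |rho1| > 0.5
    if disc >= 0:
        theta = (1 - np.sqrt(disc)) / (2 * rho1)   # root with |theta| <= 1
    else:
        theta = -1.0 if rho1 < 0 else 1.0          # boundary, noninvertible
    sigma2 = g0 / (1 + theta ** 2)
    print(theta, sigma2)                # -1.0, 1707.86 (approx. 1708)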


[Figure: sample ACF and sample PACF of the overshorts series, lags 0 to 40.]

3.3 Forecasting ARMA processes

For the causal ARMA process

$$\phi(B)X_t = \theta(B)Z_t, \quad \{Z_t\} \sim WN(0, \sigma^2),$$

transform $\{X_t\}$ to $\{W_t\}$, where

$$W_t = \sigma^{-1}X_t, \quad t = 1, 2, \dots, m, \qquad W_t = \sigma^{-1}\phi(B)X_t, \quad t > m,$$

and $m = \max(p, q)$. Let $\theta_0 := 1$ and $\theta_j := 0$ for $j > q$, and let $\kappa(i, j) = E(W_i W_j)$. Then

$$\kappa(i, j) = \begin{cases} \sigma^{-2}\gamma_X(i - j), & 1 \leq i, j \leq m, \\ \sigma^{-2}\big[\gamma_X(i - j) - \sum_{r=1}^{p}\phi_r\gamma_X(r - |i - j|)\big], & \min(i, j) \leq m < \max(i, j) \leq 2m, \\ \sum_{r=0}^{q}\theta_r\theta_{r + |i - j|}, & \min(i, j) > m, \\ 0, & \text{otherwise.} \end{cases}$$

Applying the innovations algorithm to the process $\{W_t\}$, we have

$$\hat W_{n+1} = \sum_{j=1}^{n}\theta_{nj}\big(W_{n+1-j} - \hat W_{n+1-j}\big), \quad 1 \leq n < m,$$

$$\hat W_{n+1} = \sum_{j=1}^{q}\theta_{nj}\big(W_{n+1-j} - \hat W_{n+1-j}\big), \quad n \geq m,$$

where the coefficients $\theta_{nj}$ and the mean squared errors $r_n = E\big(W_{n+1} - \hat W_{n+1}\big)^2$ can be found by the innovations algorithm. Since $X_t - \hat X_t = \sigma\big(W_t - \hat W_t\big)$ for $t \geq 1$, we obtain

$$\hat X_{n+1} = \begin{cases} \sum_{j=1}^{n}\theta_{nj}\big(X_{n+1-j} - \hat X_{n+1-j}\big), & 1 \leq n < m, \\ \phi_1 X_n + \dots + \phi_p X_{n+1-p} + \sum_{j=1}^{q}\theta_{nj}\big(X_{n+1-j} - \hat X_{n+1-j}\big), & n \geq m. \end{cases}$$
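The $\theta_{nj}$ and $r_n$ above come from the innovations algorithm applied to $\kappa$. A compact sketch of that algorithm (my own illustration; kappa is any symmetric covariance function with 1-based arguments):

    def innovations(kappa, N):
        """Return theta[n][j-1] = theta_{n,j} and v[n] = r_n for n < N."""
        v = np.zeros(N)
        theta = [np.zeros(n) for n in range(N)]
        v[0] = kappa(1, 1)
        for n in range(1, N):
            for k in range(n):
                s = sum(theta[k][k - 1 - j] * theta[n][n - 1 - j] * v[j]
                        for j in range(k))
                theta[n][n - 1 - k] = (kappa(n + 1, k + 1) - s) / v[k]
            v[n] = kappa(n + 1, n + 1) - sum(theta[n][n - 1 - j] ** 2 * v[j]
                                             for j in range(n))
        return theta, v

    # MA(1) with theta = -0.9, sigma^2 = 1 (here W_t = X_t / sigma for all t):
    kap = lambda i, j: 1.81 if i == j else (-0.9 if abs(i - j) == 1 else 0.0)
    th, v = innovations(kap, 6)
    print(th[5], v[5])   # only theta_{5,1} is nonzero; r_n decreases toward 1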

h-step prediction of an ARMA(p, q) process

Using the innovations algorithm to calculate the h-step predictors recursively, we have

$$P_n W_{n+h} = \sum_{j=h}^{n+h-1}\theta_{n+h-1, j}\big(W_{n+h-j} - \hat W_{n+h-j}\big).$$

Hence

$$P_n X_{n+h} = \begin{cases} \sum_{j=h}^{n+h-1}\theta_{n+h-1, j}\big(X_{n+h-j} - \hat X_{n+h-j}\big), & 1 \leq h \leq m - n, \\ \sum_{i=1}^{p}\phi_i P_n X_{n+h-i} + \sum_{j=h}^{n+h-1}\theta_{n+h-1, j}\big(X_{n+h-j} - \hat X_{n+h-j}\big), & h > m - n. \end{cases}$$

If, as is almost always the case, $n > m = \max(p, q)$, then for all $h \geq 1$,

$$P_n X_{n+h} = \sum_{i=1}^{p}\phi_i P_n X_{n+h-i} + \sum_{j=h}^{n+h-1}\theta_{n+h-1, j}\big(X_{n+h-j} - \hat X_{n+h-j}\big),$$

where $\theta_{n+h-1, j} = 0$ for $j > q$, so the second sum is empty once $h > q$.
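For $h > q$ only the AR recursion remains; here is a sketch under that simplifying assumption (it drops the innovation terms, so it is exact for pure AR(p) models; names are illustrative):

    def h_step_forecasts(phi, last_x, H):
        """P_n X_{n+h}, h = 1..H, using only the AR recursion
        P_n X_{n+h} = phi_1 P_n X_{n+h-1} + ... + phi_p P_n X_{n+h-p};
        last_x holds the most recent values X_{n-p+1}, ..., X_n."""
        hist = list(last_x)
        preds = []
        for _ in range(H):
            nxt = sum(c * hist[-i - 1] for i, c in enumerate(phi))
            preds.append(nxt)
            hist.append(nxt)
        return preds

    # AR(1) with phi = 0.5 and X_n = 2.0: forecasts 1.0, 0.5, 0.25
    print(h_step_forecasts([0.5], [2.0], 3))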
