
STAT 420 Fall 2010

Homework #12
(due Friday, December 10, by 3:00 p.m.)

1. Consider the AR ( 2 ) process


Ẏ t – 0.3 Ẏ t – 1 – 0.1 Ẏ t – 2 = e t

where { e t } is zero-mean white noise ( i.i.d. N ( 0, σ e² ) ) and Ẏ t = Y t – µ .
a) Based on a series of length N = 100, we observe …, y 98 = 152, y 99 = 156,
y 100 = 147, and ȳ = 150. Forecast y 101 and y 102 .

Y N + 1 = µ + φ 1 ( Y N – µ ) + φ 2 ( Y N – 1 – µ ) + e N + 1

ŷ N + 1 = E N ( Y N + 1 ) = µ + φ 1 ( y N – µ ) + φ 2 ( y N – 1 – µ )

Y N + 2 = µ + φ 1 ( Y N + 1 – µ ) + φ 2 ( Y N – µ ) + e N + 2

ŷ N + 2 = E N ( Y N + 2 ) = µ + φ 1 ( ŷ N + 1 – µ ) + φ 2 ( y N – µ )

ŷ 101 = µ̂ + φˆ1 ( y 100 – µ̂ ) + φˆ 2 ( y 99 – µ̂ )


= 150 + 0.3 ( 147 – 150 ) + 0.1 ( 156 – 150 ) = 149.7.

ŷ 102 = µ̂ + φˆ1 ( ŷ 101 – µ̂ ) + φˆ 2 ( y 100 – µ̂ )


= 150 + 0.3 ( 149.7 – 150 ) + 0.1 ( 147 – 150 ) = 149.61.
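These two forecasts are easy to verify numerically. A minimal Python sketch (the course work itself is in R) that just replays the arithmetic:

```python
# AR(2) forecasts for problem 1a: y-hat(101) and y-hat(102).
mu, phi1, phi2 = 150, 0.3, 0.1        # estimates used in the solution
y99, y100 = 156, 147                   # last two observations

# One step ahead: the future noise term has conditional mean zero.
y101 = mu + phi1 * (y100 - mu) + phi2 * (y99 - mu)
# Two steps ahead: plug the lead-1 forecast in for the unobserved y101.
y102 = mu + phi1 * (y101 - mu) + phi2 * (y100 - mu)

print(round(y101, 2), round(y102, 2))  # 149.7 149.61
```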

b) Use the Yule-Walker equations to find ρ 1 and ρ 2 .

The Yule-Walker equations for an AR(2) process:


ρ1 = φ1 + φ2 ρ1
ρ2 = φ1 ρ1 + φ2

ρ 1 = φ 1 + φ 2 ρ 1 ⇒ ρ 1 = φ 1 / ( 1 – φ 2 ) = 0.3 / ( 1 – 0.1 ) = 1/3.

⇒ ρ 2 = φ 1 ρ 1 + φ 2 = 0.3 ⋅ 1/3 + 0.1 = 0.20.
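As a sanity check on the Yule-Walker arithmetic (a Python sketch, not part of the assignment):

```python
# Yule-Walker autocorrelations of the AR(2) with phi1 = 0.3, phi2 = 0.1.
phi1, phi2 = 0.3, 0.1
rho1 = phi1 / (1 - phi2)      # solve rho1 = phi1 + phi2 * rho1
rho2 = phi1 * rho1 + phi2     # then rho2 follows directly
print(round(rho1, 4), round(rho2, 2))  # 0.3333 0.2
```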


c) Is this process stationary?

Φ ( B ) = 1 – 0.3 B – 0.1 B 2 = ( 1 – 0.5 B ) ( 1 + 0.2 B )


The roots of Φ ( z ) = 0 are z 1 = 2 and z 2 = – 5.

Both are outside the unit circle. That is, | z 1 | > 1, | z 2 | > 1.

⇒ The process is stationary.

OR

An AR(2) model is stationary if


–1 < φ 2 < 1, φ 2 + φ 1 < 1, φ 2 – φ 1 < 1.
– 1 < 0.1 < 1, 0.1 + 0.3 < 1, 0.1 – 0.3 < 1.
⇒ This process is stationary.
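Either stationarity check is easy to reproduce numerically; here is a small Python sketch applying the quadratic formula to Φ ( z ):

```python
import math

# Roots of the AR(2) characteristic polynomial 1 - 0.3 z - 0.1 z^2 = 0.
a, b, c = -0.1, -0.3, 1.0              # a*z^2 + b*z + c
disc = b * b - 4 * a * c               # 0.09 + 0.40 = 0.49
z1 = (-b + math.sqrt(disc)) / (2 * a)  # ≈ -5
z2 = (-b - math.sqrt(disc)) / (2 * a)  # ≈ 2
print(z1, z2)
assert abs(z1) > 1 and abs(z2) > 1     # both outside the unit circle: stationary
```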

2. Consider the AR ( 2 ) process

Yt = µ + φ1 (Yt – 1 – µ ) + φ2 (Yt – 2 – µ ) + et

Based on a series of length N = 60, we observe …, y 59 = 190, y 60 = 215, ȳ = 200.

a) Suppose r 1 = 0.40, r 2 = – 0.26. Use the Yule-Walker equations to estimate φ 1 and φ 2.

The Yule-Walker equations for an AR ( 2 ) process are given by:

ρ1 = φ1 + φ2 ρ1
ρ2 = φ1 ρ1 + φ2

0.40 = φ 1 + 0.40 φ 2   ( × 5 )  ⇒   2 = 5 φ 1 + 2 φ 2
– 0.26 = 0.40 φ 1 + φ 2   ( × 2 )  ⇒   – 0.52 = 0.80 φ 1 + 2 φ 2

⇒ 2.52 = 4.2 φ 1 ⇒ φˆ1 = 2.52/4.2 = 0.60

⇒ 0.40 = 0.60 + 0.40 φ 2 – 0.20 = 0.40 φ 2

⇒ φˆ 2 = – 0.50.
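The elimination above can be replayed in a few lines (Python sketch; same estimates as in the solution):

```python
# Solve the Yule-Walker system  r1 = phi1 + r1*phi2,  r2 = r1*phi1 + phi2.
r1, r2 = 0.40, -0.26
phi1 = (r1 - r1 * r2) / (1 - r1 * r1)  # after substituting phi2 = r2 - r1*phi1
phi2 = r2 - r1 * phi1
print(round(phi1, 2), round(phi2, 2))  # 0.6 -0.5
```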
b) If φ 1 and φ 2 are equal to your answers to part (a), is this process stationary?

Ẏ t – 0.60 Ẏ t – 1 + 0.50 Ẏ t – 2 = e t

Φ ( z ) = 1 – 0.60 z + 0.50 z 2 = 0.

Roots:

z 1,2 = ( 0.60 ± √( 0.60² – 4 ⋅ 0.50 ⋅ 1 ) ) / ( 2 ⋅ 0.50 ) = 0.60 ± √( – 1.64 ) = 0.60 ± i √1.64

These are outside the unit circle: | z 1,2 |² = 0.60² + 1.64 = 2 > 1.
⇒ This process is stationary.

OR

An AR(2) model is stationary if


–1 < φ 2 < 1, φ 2 + φ 1 < 1, φ 2 – φ 1 < 1.

– 1 < – 0.50 < 1, – 0.50 + 0.60 < 1, – 0.50 – 0.60 < 1.

⇒ This process is stationary.
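The complex roots and their squared modulus can be checked with Python's cmath (again just a numeric sketch):

```python
import cmath

# Roots of 1 - 0.60 z + 0.50 z^2 = 0.
a, b, c = 0.50, -0.60, 1.0
disc = complex(b * b - 4 * a * c)        # 0.36 - 2.00 = -1.64
z1 = (-b + cmath.sqrt(disc)) / (2 * a)   # 0.6 + i*sqrt(1.64)
z2 = (-b - cmath.sqrt(disc)) / (2 * a)   # 0.6 - i*sqrt(1.64)
print(abs(z1) ** 2, abs(z2) ** 2)        # both ≈ 2 > 1: stationary
```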

c) Use your answers to part (a) to forecast y 61 , y 62 , and y 63 .

y 59 = 190, y 60 = 215, ȳ = 200.

ŷ 61 = µ̂ + φˆ1 ( y 60 – µ̂ ) + φˆ 2 ( y 59 – µ̂ )
= 200 + 0.60 ( 215 – 200 ) – 0.50 ( 190 – 200 ) = 214.

ŷ 62 = µ̂ + φˆ1 ( ŷ 61 – µ̂ ) + φˆ 2 ( y 60 – µ̂ )
= 200 + 0.60 ( 214 – 200 ) – 0.50 ( 215 – 200 ) = 200.9.

ŷ 63 = µ̂ + φˆ1 ( ŷ 62 – µ̂ ) + φˆ 2 ( ŷ 61 – µ̂ )
= 200 + 0.60 ( 200.9 – 200 ) – 0.50 ( 214 – 200 ) = 193.54.
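The same one-through-three-step recursion, replayed numerically (Python sketch):

```python
# AR(2) forecasts for problem 2c, substituting earlier forecasts as needed.
mu, phi1, phi2 = 200, 0.60, -0.50
y59, y60 = 190, 215

y61 = mu + phi1 * (y60 - mu) + phi2 * (y59 - mu)
y62 = mu + phi1 * (y61 - mu) + phi2 * (y60 - mu)
y63 = mu + phi1 * (y62 - mu) + phi2 * (y61 - mu)
print(round(y61, 2), round(y62, 2), round(y63, 2))  # 214.0 200.9 193.54
```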
3. 10.24 (a) Consider the ARMA ( 1, 1 ) model

( Y t – 60 ) + 0.3 ( Y t – 1 – 60 ) = e t – 0.4 e t – 1
which was fitted to a time series where the last 10 values are
60, 57, 52, 59, 62, 59, 63, 67, 61, 58

and the last residual is ê N = – 2.

Calculate the forecasts of the next two observations, and indicate how forecasts can be
calculated for lead times greater than two. Show what happens to the forecasts as the
lead time becomes arbitrarily large.

( Y N + 1 – 60 ) = – 0.3 ( Y N – 60 ) + e N + 1 – 0.4 e N

ŷ N + 1 = E N ( Y N + 1 ) = 60 – 0.3 ( y N – 60 ) – 0.4 ê N
= 60 – 0.3 ( 58 – 60 ) – 0.4 ( – 2 ) = 61.4.

( Y N + 2 – 60 ) = – 0.3 ( Y N + 1 – 60 ) + e N + 2 – 0.4 e N + 1

ŷ N + 2 = E N ( Y N + 2 ) = 60 – 0.3 ( ŷ N + 1 – 60 )
= 60 – 0.3 ( 61.4 – 60 ) = 59.58.

In general, ( Y N + l – 60 ) = – 0.3 ( Y N + l – 1 – 60 ) + e N + l – 0.4 e N + l – 1 , and for l ≥ 2 the moving-average term drops out of the forecast:

ŷ N + l – 60 = – 0.3 ( ŷ N + l – 1 – 60 )

⇒ ŷ N + l – 60 = ( – 0.3 ) l – 1 ( ŷ N + 1 – 60 )

⇒ ŷ N + l = 60 + 1.4 × ( – 0.3 ) l – 1

⇒ ŷ N + l → 60 as l → ∞
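The convergence of the forecasts to µ = 60 is easy to see by iterating the recursion (Python sketch):

```python
# ARMA(1,1) forecasts: lead 1 uses the last residual; later leads decay
# geometrically toward the mean at rate -0.3.
mu, yN, eN = 60, 58, -2
f = [mu - 0.3 * (yN - mu) - 0.4 * eN]   # lead 1: 61.4
for _ in range(9):                       # leads 2 through 10
    f.append(mu - 0.3 * (f[-1] - mu))
print(round(f[0], 2), round(f[1], 2))    # 61.4 59.58
# by lead 10 the forecast is within 3e-5 of the mean 60
```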
For fun:

b)* Given σ̂ e² = 4, calculate 90-percent probability limits for the next two observations.
Interpret these limits.

Y t – 60 = ( 1 – 0.4 B ) / ( 1 + 0.3 B ) ⋅ e t

= ( 1 – 0.3 B + 0.09 B² – 0.027 B³ + 0.0081 B⁴ – 0.00243 B⁵ + … ) ( 1 – 0.4 B ) e t

= ( 1 – 0.70 B + 0.21 B² – 0.063 B³ + … ) e t

= e t – 0.70 e t – 1 + 0.21 e t – 2 – 0.063 e t – 3 + …
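The ψ-weights can be recovered by multiplying the geometric series for 1 / ( 1 + 0.3 B ) by ( 1 – 0.4 B ); a short Python check:

```python
# psi-weights of the ARMA(1,1): (1 - 0.4B) / (1 + 0.3B).
n = 6
inv = [(-0.3) ** k for k in range(n)]   # 1/(1+0.3B) = sum of (-0.3 B)^k
psi = [inv[k] - (0.4 * inv[k - 1] if k > 0 else 0.0) for k in range(n)]
print([round(p, 4) for p in psi])       # [1.0, -0.7, 0.21, -0.063, ...]
```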

Var ( Y N + 1 – ŷ N + 1 ) = σ e² , and σ̂ e² = 4.

61.4 ± 1.645 × √4 , i.e., 61.4 ± 3.29.

Var ( Y N + 2 – ŷ N + 2 ) = ( 1 + ( – 0.70 )² ) σ e² , and ( 1 + ( – 0.70 )² ) σ̂ e² = 5.96.

59.58 ± 1.645 × √5.96 , i.e., 59.58 ± 4.016.

Interpretation: each interval covers the corresponding future observation with probability 0.90 (assuming the fitted model and normal errors are correct).
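The half-widths of the two intervals can be checked in Python (1.645 is the standard normal 95th percentile, giving two-sided 90% limits):

```python
import math

sigma2 = 4.0                                       # sigma_e^2 hat
hw1 = 1.645 * math.sqrt(sigma2)                    # lead 1: psi_0 = 1 only
hw2 = 1.645 * math.sqrt((1 + 0.70 ** 2) * sigma2)  # lead 2 adds psi_1^2
print(round(hw1, 2), round(hw2, 3))                # 3.29 4.016
```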
4. https://netfiles.uiuc.edu/stepanov/www/ur.dat contains the U.S.
unemployment rate series. These are seasonally adjusted quarterly rates from 1948-1978.

ur = scan(" ... /ur.dat")


ur.ts = ts(ur, start=1948, frequency=4) #make a time series object
par(mfrow=c(3,1)) #3 plots per page
plot(ur.ts) #plots time series
title("U.S. Unemployment Rate")

a) Model Identification:

acf(ur.ts)
pacf(ur.ts)

Based on the sample ACF and PACF, what is an appropriate model?

b) Estimation:

Fit an AR ( 2 ) model to the data using MLE.

AR ( 2 ): ( Yt – µ ) – φ1 ( Yt – 1 – µ ) – φ2 ( Yt – 2 – µ ) = e t

E ( e t ) = 0, Var ( e t ) = σ e2 for all t

E ( e t e t' ) = 0, for t ≠ t'

E ( e t Y t' ) = 0, for t' < t

fit = arima(ur.ts,order=c(2,0,0)) #fits an AR(2) with mean included


fit #prints results

c) Diagnostic Checking:

tsdiag(fit) #diagnostic plots

Based on the first two diagnostics plots, is AR(2) an appropriate model?

d) Forecasting:

Now, we will forecast the next 2 quarters ahead using the predict function.

predfit2 = predict(fit,n.ahead=2) #forecasts 2 steps ahead


predfit2 #prints results
> ur <- scan("http://www.stat.uiuc.edu/~stepanov/ur.dat")
Read 121 items
> ur.ts <- ts(ur, start=1948, frequency=4)
> par(mfrow=c(3,1))
> plot(ur.ts)
> title("U.S. Unemployment Rate")

> acf(ur.ts)
> pacf(ur.ts)

ACF smoothly “dies out”.


PACF “cuts off” after lag 2.
This is consistent with an AR(2) model.
> fit <- arima(ur.ts,order=c(2,0,0))
> fit
Call:
arima(x = ur.ts, order = c(2, 0, 0))
Coefficients:
ar1 ar2 intercept
1.5499 -0.6472 5.0815
s.e. 0.0681 0.0686 0.3269
sigma^2 estimated as 0.1276: log likelihood = -48.76, aic = 105.53

> tsdiag(fit)
The Standardized Residuals plot does not appear to have noticeable patterns. The standardized residuals seem “random” and bell-shaped (most are close to zero, with just a few away from zero). Most of the standardized residuals are between – 2 and 2; only one is outside ( – 3, 3 ).

The ACF of the Residuals looks like the ACF of white noise – the only significant
autocorrelation coefficient is ρ 0 = 1.

This suggests that AR(2) is indeed an appropriate model.

> predfit2 <- predict(fit,n.ahead=2)


> predfit2
$pred
Qtr2 Qtr3
1978 5.812919 5.491268

$se
Qtr2 Qtr3
1978 0.3572328 0.6589087
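Though the assignment stops at point forecasts, 90% limits could be formed from this output the same way as in problem 3(b), forecast ± 1.645 × se. A Python sketch using the printed values:

```python
pred = [5.812919, 5.491268]     # point forecasts from predict()
se = [0.3572328, 0.6589087]     # their standard errors
# 90% probability limits: forecast +/- 1.645 * se
limits = [(p - 1.645 * s, p + 1.645 * s) for p, s in zip(pred, se)]
for lo, hi in limits:
    print(round(lo, 3), round(hi, 3))
```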
