
Time Series Analysis: Midterm Exam Solutions
Kaiji Motegi, Waseda University, Spring 2016

Problem 1: Consider the GARCH(1,1) model:

$$y_t = \epsilon_t = \sigma_t \nu_t, \qquad \nu_t \overset{i.i.d.}{\sim} (0, 1), \qquad \sigma_t^2 = \omega + \alpha \epsilon_{t-1}^2 + \beta \sigma_{t-1}^2,$$

$$\omega > 0, \qquad \alpha \geq 0, \qquad \beta \geq 0, \qquad \alpha + \beta < 1.$$

(a) Show that $E_{t-1}[\epsilon_t] = 0$. (Remark: This result indicates that $\{\epsilon_t\}$ is a martingale difference sequence.)

(b) Show that $E_{t-1}[\epsilon_t^2] = \sigma_t^2$. (Remark: This result implies that $\{\epsilon_t\}$ is not an i.i.d. sequence.)

Solution 1: (a) $E_{t-1}[\epsilon_t] = E_{t-1}[\sigma_t \nu_t] = \sigma_t E_{t-1}[\nu_t] = 0$. The second equality holds since $\sigma_t$ is known at time $t-1$ (it is a function of $\epsilon_{t-1}$ and $\sigma_{t-1}$ only), while the last equality holds since $\{\nu_t\}$ is i.i.d. with mean 0.

(b) $E_{t-1}[\epsilon_t^2] = E_{t-1}[\sigma_t^2 \nu_t^2] = \sigma_t^2 E_{t-1}[\nu_t^2] = \sigma_t^2$. The second equality holds since $\sigma_t^2$ is known at time $t-1$, while the last equality holds since $\{\nu_t\}$ is i.i.d. with $E[\nu_t^2] = 1$.
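(A minimal simulation sketch, not part of the original exam, illustrating both results; the parameter values $\omega = 0.1$, $\alpha = 0.1$, $\beta = 0.8$ and the standard normal choice for $\nu_t$ are arbitrary assumptions satisfying the stated restrictions.)

```python
import numpy as np

# Simulate GARCH(1,1): eps_t = sigma_t * nu_t, sigma_t^2 = omega + alpha*eps_{t-1}^2 + beta*sigma_{t-1}^2.
# The parameter values are illustrative only (omega > 0, alpha, beta >= 0, alpha + beta < 1).
rng = np.random.default_rng(0)
omega, alpha, beta = 0.1, 0.1, 0.8
T = 100_000

nu = rng.standard_normal(T)               # nu_t i.i.d. with mean 0 and variance 1
eps = np.zeros(T)
sigma2 = np.zeros(T)
sigma2[0] = omega / (1.0 - alpha - beta)  # start at the unconditional variance

for t in range(T):
    if t > 0:
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * nu[t]

# Part (a): eps_t is a martingale difference sequence, hence serially uncorrelated.
print(np.corrcoef(eps[1:], eps[:-1])[0, 1])            # close to 0
# Part (b): eps_t^2 is nonetheless predictable, since E_{t-1}[eps_t^2] = sigma_t^2.
print(np.mean(eps ** 2 / sigma2))                      # close to 1
print(np.corrcoef(eps[1:] ** 2, eps[:-1] ** 2)[0, 1])  # positive: squares are autocorrelated, so not i.i.d.
```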

Problem 2: Consider the AR(2) model:

$$y_t = \phi_1 y_{t-1} + \phi_2 y_{t-2} + \epsilon_t, \qquad (1)$$

where $\{\epsilon_t\}$ is white noise with $E[\epsilon_t^2] = \sigma^2$. Assume that $\{y_t\}$ is covariance stationary.¹ Define the variance $\gamma_0 = E[y_t^2]$, the autocovariance $\gamma_j = E[y_t y_{t-j}]$, and the autocorrelation $\rho_j = \gamma_j / \gamma_0$ for lag $j \geq 1$. Below we use the Yule-Walker equations in order to compute $\rho_j$.

(a) Show that $\gamma_0 = \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2$.

(b) Show that $\gamma_1 = \phi_1 \gamma_0 + \phi_2 \gamma_1$.

(c) Show that $\gamma_2 = \phi_1 \gamma_1 + \phi_2 \gamma_0$.


¹ Technically, $\{y_t\}$ is covariance stationary if and only if $|\phi_1| < 1 - \phi_2$ and $|\phi_2| < 1$. The covariance stationarity condition of AR(2) will be elaborated in Chapter 5.


(d) Show that $\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2}$ for $j \geq 2$.

(e) Show that

$$\rho_j = \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2}, \qquad j \geq 2, \qquad (2)$$

where $\rho_0 = 1$. (Hint: Use part (d).)

(f) Show that

$$\gamma_0 = \frac{(1 - \phi_2)\sigma^2}{(1 + \phi_2)[(1 - \phi_2)^2 - \phi_1^2]}, \qquad \gamma_1 = \frac{\phi_1 \sigma^2}{(1 + \phi_2)[(1 - \phi_2)^2 - \phi_1^2]},$$

and

$$\rho_1 = \frac{\phi_1}{1 - \phi_2}. \qquad (3)$$

(Hint: Combine parts (a), (b), and (c).) (Remark: Given Eqs. (2) and (3), we can compute $\rho_j$ recursively.)

(g) Suppose that $(\phi_1, \phi_2) = (0.5, -0.3)$. Compute $(\rho_1, \rho_2, \rho_3, \rho_4, \rho_5)$.

Solution 2: (a) Multiply both sides of Eq. (1) by $y_t$ and take expectations to get

$$\gamma_0 = \phi_1 \gamma_1 + \phi_2 \gamma_2 + E[(\phi_1 y_{t-1} + \phi_2 y_{t-2} + \epsilon_t)\epsilon_t] = \phi_1 \gamma_1 + \phi_2 \gamma_2 + \sigma^2, \qquad (4)$$

where the last equality holds because $\epsilon_t$ is uncorrelated with $y_{t-1}$ and $y_{t-2}$ and $E[\epsilon_t^2] = \sigma^2$.

(b) Multiply both sides of Eq. (1) by $y_{t-1}$ and take expectations to get

$$\gamma_1 = \phi_1 \gamma_0 + \phi_2 \gamma_1. \qquad (5)$$

(c) Multiply both sides of Eq. (1) by $y_{t-2}$ and take expectations to get

$$\gamma_2 = \phi_1 \gamma_1 + \phi_2 \gamma_0. \qquad (6)$$


(d) Let $j \geq 2$. Multiply both sides of Eq. (1) by $y_{t-j}$ and take expectations to get

$$\gamma_j = \phi_1 \gamma_{j-1} + \phi_2 \gamma_{j-2}. \qquad (7)$$

(e) Divide both sides of Eq. (7) by $\gamma_0$ to get $\rho_j = \phi_1 \rho_{j-1} + \phi_2 \rho_{j-2}$.


(f) Eq. (5) implies that

$$\gamma_1 = \frac{\phi_1}{1 - \phi_2}\,\gamma_0. \qquad (8)$$

(Note that this equation is well-defined since $|\phi_2| < 1$ under covariance stationarity.) Substitute Eq. (8) into Eq. (6) to get

$$\gamma_2 = \frac{\phi_1^2 + (1 - \phi_2)\phi_2}{1 - \phi_2}\,\gamma_0. \qquad (9)$$

Substitute Eqs. (8) and (9) into Eq. (4) to obtain $\gamma_0 = (1 - \phi_2)\sigma^2 / \{(1 + \phi_2)[(1 - \phi_2)^2 - \phi_1^2]\}$. (Note that this equation is well-defined since $|\phi_1| < 1 - \phi_2$ and $|\phi_2| < 1$ under covariance stationarity.) Substitute this into Eq. (8) to get $\gamma_1 = \phi_1 \sigma^2 / \{(1 + \phi_2)[(1 - \phi_2)^2 - \phi_1^2]\}$. Hence we have that $\rho_1 = \gamma_1 / \gamma_0 = \phi_1 / (1 - \phi_2)$.
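(For completeness, a sketch of the intermediate algebra that the solution skips: substituting Eqs. (8) and (9) into Eq. (4) gives

$$\gamma_0 = \frac{\phi_1^2}{1 - \phi_2}\,\gamma_0 + \phi_2 \cdot \frac{\phi_1^2 + (1 - \phi_2)\phi_2}{1 - \phi_2}\,\gamma_0 + \sigma^2, \quad \text{i.e.} \quad \gamma_0 \cdot \frac{(1 - \phi_2) - \phi_1^2(1 + \phi_2) - \phi_2^2(1 - \phi_2)}{1 - \phi_2} = \sigma^2.$$

Since $(1 - \phi_2) - \phi_2^2(1 - \phi_2) = (1 + \phi_2)(1 - \phi_2)^2$, the numerator equals $(1 + \phi_2)[(1 - \phi_2)^2 - \phi_1^2]$, which yields the stated expression for $\gamma_0$.)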

(g) $(\rho_1, \rho_2, \rho_3, \rho_4, \rho_5) = (0.385, -0.108, -0.169, -0.052, 0.025)$. See Figure 1 for a visual illustration. (Remark: Non-monotone autocorrelations are possible under AR(2), while they are impossible under AR(1).)
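(A short script, not part of the original solution, that reproduces these numbers via the recursion in Eqs. (2) and (3); plain Python, using only the coefficients given in part (g).)

```python
# Recursive computation of AR(2) autocorrelations using Eqs. (2) and (3):
# rho_1 = phi_1 / (1 - phi_2), and rho_j = phi_1 * rho_{j-1} + phi_2 * rho_{j-2} for j >= 2.
phi1, phi2 = 0.5, -0.3

rho = [1.0, phi1 / (1 - phi2)]                           # rho_0 = 1 and rho_1 from Eq. (3)
for j in range(2, 6):
    rho.append(phi1 * rho[j - 1] + phi2 * rho[j - 2])    # Eq. (2)

print([round(r, 3) for r in rho[1:]])                    # [0.385, -0.108, -0.169, -0.052, 0.025]
```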

Problem 3: Consider an i.i.d. time series $\{y_t\}$ with $E[y_t] = 0$ and $E[y_t^2] = \sigma^2 < \infty$ for any $t$. Suppose that you have $T$ observations $\{y_1, y_2, \ldots, y_T\}$ and compute the sample autocovariance at lag $h$:

$$\hat{\gamma}_h = \frac{1}{T} \sum_{t=h+1}^{T} y_t y_{t-h}, \qquad h = 1, 2, \ldots$$


[Figure 1 appears here in the original document: a plot of the population autocorrelations against lag $j = 1, \ldots, 5$, with the vertical axis running from $-0.5$ to $0.5$.]

Figure 1: Population Autocorrelations of AR(2) Processes

and the sample autocorrelation at lag $h$:

$$\hat{\rho}_h = \frac{\hat{\gamma}_h}{\hat{\gamma}_0}, \qquad h = 1, 2, \ldots,$$

where $\hat{\gamma}_0 = \frac{1}{T} \sum_{t=1}^{T} y_t^2$.

In this problem we analyze the asymptotic properties of $\{\hat{\rho}_h\}$.² It is known in the literature that

$$\sqrt{T}\,\hat{\rho}_h \overset{d}{\to} N(0, 1) \qquad (10)$$

for each $h \geq 1$ as $T \to \infty$.³ Eq. (10) is a useful result for hypothesis testing for the i.i.d. property. Consider the two cases below.

(a) Suppose that you observe $\hat{\rho}_1 = 0.2$ from your data with sample size $T = 400$. In this case, do you believe that your dataset is indeed a draw from an i.i.d. process? Briefly explain.

(b) Suppose that the sample size is $T = 49^2 = 2401$. Show that the 95% confidence interval of $\hat{\rho}_1$ is approximately $[-0.04, 0.04]$. (Hint: The upper 2.5% point of $N(0, 1)$ is 1.96.) (Important Remark: Given the large sample size $T = 2401$, observing a seemingly tiny autocorrelation (e.g. $\hat{\rho}_1 = 0.05$) would actually convince you to reject the i.i.d. hypothesis at the 5% level.)

² Asymptotic means large sample (i.e. $T \to \infty$).

³ $\overset{d}{\to}$ means convergence in distribution. Eq. (10) states that the distribution of $\sqrt{T}\,\hat{\rho}_h$ approaches the standard normal distribution as the sample size grows.


Solution 3: (a) Under the i.i.d. hypothesis, $\sqrt{T}\,\hat{\rho}_1$ must approximately follow $N(0, 1)$ in large samples. We however observe $\sqrt{T}\,\hat{\rho}_1 = \sqrt{400} \times 0.2 = 4$, which is too large for us to believe that it is a draw from $N(0, 1)$. (Recall that a draw from $N(0, 1)$ is smaller than 3 in absolute value with probability 0.997.) Hence we reject the i.i.d. hypothesis.

(b) Since $\sqrt{T}\,\hat{\rho}_1$ approximately follows $N(0, 1)$ under the i.i.d. hypothesis, we have that

$$\Pr\!\left[-1.96 < \sqrt{T}\,\hat{\rho}_1 < 1.96\right] = 0.95$$

and hence

$$\Pr\!\left[-1.96/\sqrt{T} < \hat{\rho}_1 < 1.96/\sqrt{T}\right] = 0.95.$$

Since $\sqrt{T} = 49$ and $1.96/49 = 0.04$, we obtain $\Pr[-0.04 < \hat{\rho}_1 < 0.04] = 0.95$. The 95% confidence interval of $\hat{\rho}_1$ is therefore $[-0.04, 0.04]$.
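(A small script, not part of the original solution, sketching both calculations; it relies only on the 1.96 quantile quoted in the hint.)

```python
from math import sqrt

# Part (a): the test statistic sqrt(T) * rho_hat_1 under the i.i.d. hypothesis, Eq. (10).
T_a, rho1_hat = 400, 0.2
print(sqrt(T_a) * rho1_hat)              # 4.0, far outside [-1.96, 1.96]: reject i.i.d. at the 5% level

# Part (b): 95% confidence interval of rho_hat_1 when T = 49**2 = 2401.
T_b, z = 49 ** 2, 1.96                   # 1.96 is the upper 2.5% point of N(0, 1), as in the hint
print((-z / sqrt(T_b), z / sqrt(T_b)))   # (-0.04, 0.04)
```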
