
Stationary models

MA, AR and ARMA


Matthieu Stigler

November 14, 2008

Version 1.1

This document is released under the Creative Commons Attribution-Noncommercial 2.5 India
license.


Lectures list

1. Stationarity
2. ARMA models for stationary variables
3. Seasonality
4. Non-stationarity
5. Non-linearities
6. Multivariate models
7. Structural VAR models
8. Cointegration 1: the Engle and Granger approach
9. Cointegration 2: the Johansen methodology
10. Multivariate Nonlinearities in VAR models
11. Multivariate Nonlinearities in VECM models


Outline

1. Last Lecture
2. AR(p) models
   - Autocorrelation of AR(1)
   - Stationarity Conditions
   - Estimation
3. MA models
4. ARMA(p,q)
5. The Box-Jenkins approach
6. Forecasting


Recall: auto-covariance

Definition (Autocovariance)
Cov(X_t, X_{t-k}) ≡ γ_k(t) ≡ E[(X_t - μ)(X_{t-k} - μ)]

Definition (Autocorrelation)
Corr(X_t, X_{t-k}) ≡ ρ_k(t) = Cov(X_t, X_{t-k}) / Var(X_t)

Proposition
Cov(X_t, X_t) = γ_0 = Var(X_t)
Corr(X_t, X_{t-j}) = ρ_j depends on the lag j: plot its values at each lag.


Recall: stationarity

Stationarity is an essential property in defining a time series process:

Definition
A process is said to be covariance-stationary, or weakly stationary, if
its first and second moments are time invariant:
E(Y_t) = E(Y_{t-1}) = μ     ∀t
Var(Y_t) = γ_0 < ∞          ∀t
Cov(Y_t, Y_{t-k}) = γ_k     ∀t, ∀k

Recall: The AR(1)

The AR(1): Y_t = c + φY_{t-1} + ε_t,   ε_t ~ iid(0, σ²)
With |φ| < 1, it can be written as:
Y_t = c/(1 - φ) + Σ_{i=0}^∞ φ^i ε_{t-i}

Its moments do not depend on time:
E(X_t) = c/(1 - φ)
Var(X_t) = σ²/(1 - φ²)
Cov(X_t, X_{t-j}) = φ^j σ²/(1 - φ²)
Corr(X_t, X_{t-j}) = φ^j
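
As a quick numerical check (an illustration added here, not part of the
original slides), one can simulate a long AR(1) path and compare the sample
moments with these formulas; the parameter values below are arbitrary:

> # AR(1) with c = 2, phi = 0.5, sigma^2 = 1: theory gives mean 4,
> # variance 1/(1 - 0.25) = 1.33 and lag-1 correlation 0.5
> set.seed(123)
> phi <- 0.5; c0 <- 2
> y <- arima.sim(model = list(ar = phi), n = 10000) + c0 / (1 - phi)
> mean(y)                    # should be close to 4
> var(y)                     # should be close to 1.333
> cor(y[-1], y[-length(y)])  # should be close to phi = 0.5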


AR(p) models

Autocorrelation function

A useful plot for understanding the dynamics of a process is the
autocorrelation function: plot the autocorrelation value at different lags.
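
In R this is one call (own illustration, arbitrary coefficient):

> set.seed(1)
> y <- arima.sim(model = list(ar = 0.9), n = 500)
> acf(y, lag.max = 15)  # bars decay roughly like 0.9^k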


[Figure: ACF against lag k of an AR(1) with φ = 0, φ = 0.5, φ = 0.9 and φ = 1.]

AR(1) with -1 < φ < 0

In the AR(1): Y_t = c + φY_{t-1} + ε_t,   ε_t ~ iid(0, σ²)
with -1 < φ < 0 we have negative autocorrelation.

[Figure: simulated paths of an AR(1) with φ = -0.2 and φ = -0.7, and their ACFs against lag k.]

Definition (AR(p))
y_t = c + φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t

Expectation?
Variance?
Auto-covariance?
Stationarity conditions?


Lag operator

Definition (Backshift/Lag operator)
L X_t = X_{t-1}

Proposition
See that: L² X_t = X_{t-2}

Proposition (Generalisation)
L^k X_t = X_{t-k}


Lag polynomial

We can thus rewrite:

Example (AR(2))
X_t = c + φ_1 X_{t-1} + φ_2 X_{t-2} + ε_t
(1 - φ_1 L - φ_2 L²) X_t = c + ε_t

Definition (Lag polynomial)
We call lag polynomial: Φ(L) = 1 - φ_1 L - φ_2 L² - ... - φ_p L^p
So we write compactly:

Example (AR(2))
Φ(L) X_t = c + ε_t


Stationarity Conditions

Definition (Characteristic polynomial)
Φ(z) = 1 - φ_1 z - φ_2 z² - ... - φ_p z^p

Stability condition:

Proposition
The AR(p) process is stable if the roots of the characteristic polynomial lie
outside the unit circle.

Example (AR(1))
The AR(1): X_t = φX_{t-1} + ε_t
can be written as: (1 - φL) X_t = ε_t
Solving 1 - φx = 0 gives x = 1/φ
And finally: |1/φ| > 1 ⟺ |φ| < 1


Proof.
1. Write the AR(p) as an AR(1)
2. Show the stability conditions for the augmented AR(1)
3. Transpose the result to the AR(p)


Proof.
The AR(p):
y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
can be recast as the AR(1) model:
ξ_t = F ξ_{t-1} + v_t

[ y_t       ]   [ φ_1  φ_2  ...  φ_{p-1}  φ_p ] [ y_{t-1} ]   [ ε_t ]
[ y_{t-1}   ]   [  1    0   ...     0      0  ] [ y_{t-2} ]   [  0  ]
[ y_{t-2}   ] = [  0    1   ...     0      0  ] [ y_{t-3} ] + [  0  ]
[ ...       ]   [ ...      ...            ... ] [ ...     ]   [ ... ]
[ y_{t-p+1} ]   [  0    0   ...     1      0  ] [ y_{t-p} ]   [  0  ]

Line by line, this system reads:
y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
y_{t-1} = y_{t-1}
...
y_{t-p+1} = y_{t-p+1}

Proof.
Starting from the augmented AR(1) notation:
ξ_t = F ξ_{t-1} + v_t
As in the simple case, we can write the AR model recursively:
ξ_t = v_t + F v_{t-1} + F² v_{t-2} + ... + F^{t-1} v_1 + F^t ξ_0
Remember the eigenvalue decomposition F = T Λ T^{-1}
and the property that F^j = T Λ^j T^{-1}
with
Λ^j = diag(λ_1^j, λ_2^j, ..., λ_p^j)

So the AR(1) model is stable if |λ_i| < 1 ∀i



Proof.
So the condition on F is that all eigenvalues λ from |F - λI| = 0 are smaller
than 1 in modulus. One can show that the eigenvalues of F solve:

Proposition
λ^p - φ_1 λ^{p-1} - φ_2 λ^{p-2} - ... - φ_{p-1} λ - φ_p = 0

But the λ are the reciprocals of the values z that solve the characteristic
polynomial of the AR(p):
1 - φ_1 z - φ_2 z² - ... - φ_p z^p = 0
So the roots of the polynomial should be greater than 1 in absolute value or,
with complex values, outside the unit circle.


Stationarity conditions

The condition of roots outside the unit circle leads to:

AR(1): |φ| < 1
AR(2):
  - φ_1 + φ_2 < 1
  - φ_2 - φ_1 < 1
  - |φ_2| < 1
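
A small sketch (own illustration; φ_1 = 0.5, φ_2 = 0.3 are arbitrary)
confirming that these conditions agree with the root criterion:

> phi1 <- 0.5; phi2 <- 0.3
> c(phi1 + phi2 < 1, phi2 - phi1 < 1, abs(phi2) < 1)
[1] TRUE TRUE TRUE
> all(Mod(polyroot(c(1, -phi1, -phi2))) > 1)  # roots outside the unit circle
[1] TRUE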


Example
Consider the AR(2) model:
Y_t = 0.8Y_{t-1} + 0.09Y_{t-2} + ε_t
Its AR(1) (companion) representation is:

[ y_t     ]   [ 0.8  0.09 ] [ y_{t-1} ]   [ ε_t ]
[ y_{t-1} ] = [ 1    0    ] [ y_{t-2} ] + [ 0   ]

Hence its eigenvalues are obtained from:

det [ 0.8-λ  0.09 ]
    [ 1      -λ   ]  =  λ² - 0.8λ - 0.09 = 0

And the eigenvalues are smaller than one:
> Re(polyroot(c(-0.09, -0.8, 1)))

[1] -0.1  0.9


Example
Y_t = 0.8Y_{t-1} + 0.09Y_{t-2} + ε_t
Its lag polynomial representation is: (1 - 0.8L - 0.09L²) X_t = ε_t
Its characteristic polynomial is hence: 1 - 0.8x - 0.09x² = 0
whose solutions lie outside the unit circle:
> Re(polyroot(c(1, -0.8, -0.09)))

[1]   1.111111 -10.000000

And they are the reciprocals of the previous solutions:

> all.equal(sort(1/Re(polyroot(c(1, -0.8, -0.09)))), Re(polyroot(c(-0.09,
+     -0.8, 1))))

[1] TRUE


Unit root and integration order

Definition
A process is said to be integrated of order d if it becomes stationary after
being differenced d times.

Proposition
An AR(p) process with k unit roots (or unit eigenvalues) is integrated of
order k.

Example
Take the random walk: X_t = X_{t-1} + ε_t
Its polynomial is (1 - L), and the root is given by 1 - x = 0 ⟺ x = 1.
The eigenvalue of the trivial AR(1) is λ = φ = 1.
So the random walk is integrated of order 1 (or difference stationary).
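
A minimal sketch of this idea (own illustration): differencing a simulated
random walk once removes the unit root.

> set.seed(42)
> x <- cumsum(rnorm(200))  # random walk: x_t = x_{t-1} + e_t
> acf(x)                   # slow, near-linear decay typical of a unit root
> acf(diff(x))             # no significant autocorrelation left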

Integrated process

Take an AR(p):
y_t = φ_1 y_{t-1} + φ_2 y_{t-2} + ... + φ_p y_{t-p} + ε_t
with the lag polynomial notation:
Φ(L) X_t = ε_t
If one of its p (not necessarily distinct) eigenvalues is equal to 1, it can
be rewritten as:
(1 - L) Φ*(L) X_t = ε_t
Equivalently:
Φ*(L) ΔX_t = ε_t


The AR(p) in detail

Moments of a stationary AR(p):
E(X_t) = c / (1 - φ_1 - φ_2 - ... - φ_p)
Var(X_t) = φ_1 γ_1 + φ_2 γ_2 + ... + φ_p γ_p + σ²
Cov(X_t, X_{t-j}) = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p}

Note that γ_j ≡ Cov(X_t, X_{t-j}), so we can rewrite the last two equations as:
γ_0 = φ_1 γ_1 + φ_2 γ_2 + ... + φ_p γ_p + σ²
γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p}
They are known under the name of Yule-Walker equations.


Yule-Walker equations

Dividing by γ_0 gives:
1 = φ_1 ρ_1 + φ_2 ρ_2 + ... + φ_p ρ_p + σ²/γ_0
ρ_j = φ_1 ρ_{j-1} + φ_2 ρ_{j-2} + ... + φ_p ρ_{j-p}

Example (AR(1))
We saw that:
Var(X_t) = σ²/(1 - φ²)
Cov(X_t, X_{t-j}) = φ^j σ²/(1 - φ²)
Corr(X_t, X_{t-j}) = φ^j
And we have effectively: ρ_1 = φρ_0 = φ and ρ_2 = φρ_1 = φ²

Utility:
- Determination of the autocorrelation function (see the sketch below)
- Estimation
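
A sketch (own illustration, arbitrary coefficients) that builds the ACF of an
AR(2) from the Yule-Walker recursion, starting from ρ_0 = 1, and checks it
against stats::ARMAacf:

> phi <- c(0.5, 0.3)
> rho <- numeric(10)
> rho[1] <- phi[1] / (1 - phi[2])     # rho_1 = phi1/(1 - phi2), from j = 1
> rho[2] <- phi[1] * rho[1] + phi[2]  # rho_2 = phi1*rho_1 + phi2*rho_0
> for (j in 3:10) rho[j] <- phi[1] * rho[j - 1] + phi[2] * rho[j - 2]
> all.equal(rho, as.numeric(ARMAacf(ar = phi, lag.max = 10)[-1]))
[1] TRUE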

Estimation

To estimate an AR(p) model from a sample of T observations, we effectively use
t = T - p of them.

- Method of moments: estimate the sample moments (the γ_i) and find the
  corresponding parameters (the φ_i).
- Unconditional ML: assume y_p, ..., y_1 ~ N(0, σ²). Needs numerical
  optimisation methods.
- Conditional maximum likelihood (= OLS): maximise
  f(y_T, y_{T-1}, ..., y_{p+1} | y_p, ..., y_1; θ), assuming ε_t ~ N(0, σ²)
  and that y_p, ..., y_1 are given.

What if the errors are not normally distributed? The quasi-maximum likelihood
estimator is still consistent (in this case), but the standard errors need to
be corrected.
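
A sketch of the last two approaches in R (own illustration, simulated data):
arima() exposes both the conditional-sum-of-squares and the exact ML
estimator through its method argument.

> set.seed(7)
> y <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 500)
> coef(arima(y, order = c(2, 0, 0), method = "CSS"))  # conditional (OLS-like)
> coef(arima(y, order = c(2, 0, 0), method = "ML"))   # exact (unconditional) ML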


MA models

Moving average models

"Moving average" has two meanings:
- a regression model (the MA model discussed here)
- a smoothing technique


MA(1)

Definition (MA(1))
Y_t = c + ε_t + θε_{t-1}

E(Y_t) = c
Var(Y_t) = (1 + θ²)σ²
Cov(X_t, X_{t-j}) = θσ²          if j = 1
                  = 0            if j > 1
Corr(X_t, X_{t-j}) = θ/(1 + θ²)  if j = 1
                   = 0           if j > 1

Proposition
An MA(1) is stationary for every θ.
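
A quick check of the lag-1 formula and the cut-off (own illustration,
θ = 0.5 is an arbitrary choice):

> theta <- 0.5
> theta / (1 + theta^2)             # theoretical rho_1
[1] 0.4
> ARMAacf(ma = theta, lag.max = 5)  # exactly zero from lag 2 onwards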


[Figure: simulated paths of an MA(1) for θ = 0.5, θ = 2, θ = -0.5 and θ = -2.]

[Figure: ACF against lag k of an MA(1) for θ = 0.5, θ = 3, θ = -0.5 and θ = -3.]

MA(q)

The MA(q) is given by:
Y_t = c + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}

E(Y_t) = c
Var(Y_t) = (1 + θ_1² + θ_2² + ... + θ_q²)σ²
Cov(X_t, X_{t-j}) = (θ_j + θ_{j+1}θ_1 + θ_{j+2}θ_2 + ... + θ_q θ_{q-j})σ²  if j ≤ q
                  = 0                                                       if j > q
Corr(X_t, X_{t-j}) = γ_j/γ_0  if j ≤ q
                   = 0        if j > q

Proposition
An MA(q) is stationary for every sequence {θ_1, θ_2, ..., θ_q}

[Figure: ACF against lag k of MA(q) processes with θ = c(0.5, 1.5), θ = c(-0.5, 1.5), θ = c(-0.6, 0.3, -0.5, 0.5) and θ = c(-0.6, 0.3, -0.5, 0.5, 3, 2, 1).]

The MA(∞)

Take now the MA(∞):
Y_t = ε_t + ψ_1 ε_{t-1} + ψ_2 ε_{t-2} + ... = Σ_{j=0}^∞ ψ_j ε_{t-j}

Definition (Absolute summability)
A sequence is absolutely summable if Σ_{i=0}^∞ |ψ_i| < ∞

Proposition
The MA(∞) is stationary if its coefficients are absolutely summable.


Back to AR(p)

Recall:

Proposition
If the characteristic polynomial of an AR(p) has roots equal to 1, it is not
stationary.

See that:
(1 - φ_1 L - φ_2 L² - ... - φ_p L^p) y_t = ε_t
can be factored as:
(1 - λ_1 L)(1 - λ_2 L)...(1 - λ_p L) y_t = ε_t
It has an MA(∞) representation if λ_i ≠ 1:
y_t = 1/[(1 - λ_1 L)(1 - λ_2 L)...(1 - λ_p L)] ε_t
Furthermore, if the λ_i (the eigenvalues of the augmented AR(1)) are smaller
than 1 in modulus, we can write it as:
y_t = Σ_{i=0}^∞ ψ_i ε_{t-i}


Estimation of an MA(1)

We observe neither ε_t nor ε_{t-1}.
But if we know ε_0, we know ε_1 = y_1 - θε_0.
So obtain the ε_t recursively and minimise the conditional SSR:
S(θ) = Σ_{t=1}^T (y_t - θε_{t-1})²
This requires numerical optimisation and works only if |θ| < 1.
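
A minimal sketch of this recursion (own illustration; c = 0, ε_0 = 0, and
θ = 0.6 is an arbitrary true value):

> css <- function(theta, y) {
+     eps <- 0; s <- 0
+     for (t in seq_along(y)) {
+         eps <- y[t] - theta * eps  # eps_t = y_t - theta * eps_{t-1}
+         s <- s + eps^2
+     }
+     s
+ }
> set.seed(5)
> y <- arima.sim(model = list(ma = 0.6), n = 300)
> optimize(css, interval = c(-0.99, 0.99), y = y)$minimum  # close to 0.6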


ARMA(p,q)

ARMA models

The ARMA model is a composite of AR and MA:

Definition (ARMA(p,q))
X_t = c + φ_1 X_{t-1} + φ_2 X_{t-2} + ... + φ_p X_{t-p} + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}
It can be rewritten compactly as:
Φ(L) Y_t = c + Θ(L) ε_t

Theorem
The ARMA(p,q) model is stationary provided the roots of the Φ(L) polynomial
lie outside the unit circle.
So only the AR part is involved!
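
A minimal sketch (own illustration, arbitrary coefficients): simulate an
ARMA(1,1) and verify the stationarity of its AR part through the root of Φ(z):

> set.seed(3)
> y <- arima.sim(model = list(ar = 0.5, ma = 0.4), n = 300)
> Mod(polyroot(c(1, -0.5)))  # AR root 2 > 1, so the process is stationary
[1] 2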


Autocorrelation function of an ARMA(p,q)

Proposition
After q lags, the autocorrelation function follows the pattern of the AR
component.
Remember: it is then given by the Yule-Walker equations.
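
A small check (own illustration): for an ARMA(1,1) with φ = 0.5, successive
ACF ratios equal φ beyond lag q = 1:

> rho <- ARMAacf(ar = 0.5, ma = 0.5, lag.max = 6)[-1]
> rho[-1] / rho[-length(rho)]  # all exactly 0.5 after lag q = 1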


[Figure: ACF against lag k of ARMA processes, four panels: φ_1 = 0.5 with θ_1 = 0.5, and φ_1 = 0.5 with θ_{1:3} = c(0.5, 0.9, 0.3).]

ARIMA(p,d,q)

Now we add a parameter d representing the order of integration (the I in
ARIMA).

Definition (ARIMA(p,d,q))
Φ(L) Δ^d Y_t = Θ(L) ε_t

Example (Special cases)
White noise, ARIMA(0,0,0): X_t = ε_t
Random walk, ARIMA(0,1,0): ΔX_t = ε_t ⟺ X_t = X_{t-1} + ε_t


Estimation and inference

The MLE estimator has to be found numerically.
Provided the errors are normally distributed, the estimator has the usual
asymptotic properties:
- Consistent
- Asymptotically efficient
- Normally distributed
If we take into account that the variance had to be estimated, one may rather
use the t distribution in small samples.


The Box-Jenkins approach

The Box-Jenkins approach

1. Transform the data to achieve stationarity
2. Identify the model, i.e. the orders (p, d, q) of the ARIMA model
3. Estimate the model
4. Diagnostic analysis: test the residuals


Step 1

Transformations:
- Log
- Square root
- Differencing
- Box-Cox transformation:
  Y_t^(λ) = (Y_t^λ - 1)/λ  for λ ≠ 0
  Y_t^(λ) = log(Y_t)       for λ = 0

Is the log legitimate?
- If the process is y_t = e^{μt + ε_t}, then z_t = log(y_t) = μt + ε_t, and the
  (now linear) trend can be removed.
- By log(1 + x) ≈ x, the log-difference approximates the growth rate:
  Δlog(y_t) = log(y_t / y_{t-1}) ≈ (y_t - y_{t-1}) / y_{t-1}
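
A tiny numerical check of this approximation (own illustration, made-up
values):

> y <- c(100, 102, 105, 103)
> diff(log(y))           # log-differences
> diff(y) / head(y, -1)  # discrete growth rates: very close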

Step 2

Identification of p and q (d should now be 0 after a convenient
transformation).

- Principle of parsimony: prefer small models. Recall that incorporating
  variables increases the fit (R²) but reduces the degrees of freedom and
  hence the precision of estimation and tests.
- An AR(1) has an MA(∞) representation.
- If the MA(q) and AR(p) polynomials have a common root, the ARMA(p,q) is
  equivalent to an ARMA(p-1, q-1).
- The usual techniques require that the MA polynomial has its roots outside
  the unit circle (i.e. is invertible).


Step 2: identification

How can we determine the orders p and q?
- Look at the ACF and PACF with confidence intervals
- Use information criteria:
  - Akaike criterion (AIC)
  - Schwarz criterion (BIC)

Definition (IC)
AIC(p) = n log σ̂² + 2p
BIC(p) = n log σ̂² + p log n
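
A sketch of the criterion-based choice (own illustration; the data are
simulated from an AR(2), so p = 2 should usually minimise the AIC):

> set.seed(11)
> y <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 500)
> sapply(0:4, function(p) AIC(arima(y, order = c(p, 0, 0))))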


Step 3: estimation

Estimate the model...
R function: arima(), with argument order = c(p, d, q)


Step 4: diagnostic checks

Test whether the residuals are white noise:
1. Autocorrelation
2. Heteroscedasticity
3. Normality


[Figure: the CPI series, 1985-2005.]

[Figure: the original CPI, diff(CPI), log(CPI) and diff(log(CPI)) series.]

[Figure: CPI with a linear trend, a smooth trend and a quadratic trend, and the corresponding detrended series.]

[Figure: the twice-differenced series (diff2) and the twice-differenced log series (diff2 of log).]

[Figure: the transformed series CPI2 with its ACF and PACF.]

> library(forecast)

This is forecast 1.17

> fit <- auto.arima(CPI2, start.p = 1, start.q = 1)
> fit

Series: CPI2
ARIMA(2,0,1)(2,0,2)[12] with zero mean

Coefficients:
         ar1      ar2      ma1    sar1    sar2     sma1     sma2
      0.2953  -0.2658  -0.9011  0.6021  0.3516  -0.5400  -0.2850
s.e.  0.0630   0.0578   0.0304  0.1067  0.1051   0.1286   0.1212

sigma^2 estimated as 4.031e-05: log likelihood = 1146.75
AIC = -2277.46   AICc = -2276.99   BIC = -2247.44

> res <- residuals(fit)


[Figure: the residual series res with its ACF and PACF.]

> Box.test(res)

Box-Pierce test
data: res
X-squared = 0.0736, df = 1, p-value = 0.7862


[Figure: normal QQ plot and kernel density of the residuals, density.default(x = res), N = 315, bandwidth = 0.001481.]

Forecasting

Notation (Forecast)
ŷ_{t+j} ≡ E_t(y_{t+j}) ≡ E(y_{t+j} | y_t, y_{t-1}, ..., ε_t, ε_{t-1}, ...) is
the conditional expectation of y_{t+j} given the information available at t.

Definition (j-step-ahead forecast error)
e_t(j) ≡ y_{t+j} - ŷ_{t+j}

Definition (Mean square prediction error)
MSPE ≡ (1/H) Σ_{i=1}^H e_i²
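
A sketch of these definitions in R (own illustration, simulated data): fit an
AR(1), forecast j = 1, ..., 20 steps ahead, and compute the MSPE on a
hold-out sample.

> set.seed(9)
> y <- arima.sim(model = list(ar = 0.8), n = 220)
> fit <- arima(y[1:200], order = c(1, 0, 0))
> fc <- predict(fit, n.ahead = 20)$pred  # the yhat_{t+j}
> e <- y[201:220] - fc                   # forecast errors e_t(j)
> mean(e^2)                              # MSPE with H = 20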


R implementation

To run this file you will need:
- the R package forecast
- the R package TSA
- the data file AjaySeries2.csv, put in a folder called Datasets at the same
  level as your .Rnw file
- (optional) the file Sweave.sty, which changes the output style: results are
  shown in blue and R commands are smaller. Also in the same folder as the
  .Rnw file.

