PowerPoint Slides for Undergraduate Econometrics
by Lawrence C. Marsh
To accompany: Undergraduate Econometrics
by R. Carter Hill, William E. Griffiths and George G. Judge
Publisher: John Wiley & Sons, 1997
Chapter 1
Copyright 1996 Lawrence C. Marsh
1.1 The Role of Econometrics in Economic Analysis
Copyright 1997 John Wiley & Sons, Inc. All rights reserved. Reproduction or translation of this work beyond
that permitted in Section 117 of the 1976 United States Copyright Act without the express written permission of the
copyright owner is unlawful. Request for further information should be addressed to the Permissions Department,
John Wiley & Sons, Inc. The purchaser may make back-up copies for his/her own use only and not for distribution
or resale. The Publisher assumes no responsibility for errors, omissions, or damages, caused by the use of these
programs or from the use of the information contained herein.
Econometrics deals with questions about economic variables such as: inflation, the money supply, the Dow-Jones Stock Index, short-term Treasury bills, the trade deficit, unemployment, the power of labor unions, the Federal Reserve discount rate, and rent control laws.
Economic Decisions
To use information effectively, economic theory and economic data are combined to make economic decisions.
c = f(i)
Consumption, c, is a function of income, i. For applied econometric analysis this consumption function must be specified more precisely.
demand: qd = f(p, pc, ps, i)
supply: qs = f(p, pc, pf)
How much?
Listing the variables in an economic relationship is not enough. For effective policy we must know the amount of change needed for a policy instrument to bring about the desired effect:
By how much should the Federal Reserve raise interest rates to prevent inflation?
By how much can the price of football tickets be increased and still fill the stadium?
c = f(i) + e
The systematic part, f(i), provides the prediction; the actual value will miss it by the random error, e.
Statistical Models
Controlled (experimental) vs. uncontrolled (observational)
Controlled experiment (pure science) explaining mass, y: pressure, x2, is held constant while temperature, x3, is varied, and vice versa.
Uncontrolled experiment (econometrics) explaining consumption, y: price, x2, and income, x3, vary at the same time.
The econometric model consists of:
economic model: economic variables and parameters;
statistical model: the sampling process with its parameters;
data: observed values of the variables.
Chapter 2
2.1 Some Basic Probability Concepts
Random Variable
random variable: a variable whose value is unknown until it is observed. The value of a random variable results from an experiment. The term random variable implies the existence of some known or unknown probability distribution defined over the set of all possible values of that variable. In contrast, an arbitrary variable does not have a probability distribution associated with its values.
Dummy Variable
The pdf of a fair die:
x:    1    2    3    4    5    6
f(x): 1/6  1/6  1/6  1/6  1/6  1/6
The pdf gives the probability that X takes the value x: f(x) = P(X = x).
[Figure: a discrete pdf f(x), with probability values 0, 0.1, 0.2, 0.3 on the vertical axis.]
[Figure: for a continuous pdf, the probability that income falls between $34,000 and $55,000 is the shaded (green) area, 0.8676.]
Since a continuous random variable has an uncountably infinite number of values, the probability of any one value occurring is zero:
P[X = a] = P[a ≤ X ≤ a] = 0
Probability is represented by area, and height alone has no area. An interval for X is needed to get an area under the curve.
P[a < X < b] = ∫_a^b f(x) dx
Rules of Summation
Rule 1: Σi=1..n xi = x1 + x2 + . . . + xn
Rule 2: Σi=1..n a·xi = a·Σi=1..n xi
Rule 3: Σi=1..n (xi + yi) = Σi=1..n xi + Σi=1..n yi
Rule 4: Σi=1..n (a·xi + b·yi) = a·Σi=1..n xi + b·Σi=1..n yi
Rule 5: x̄ = (1/n)·Σi=1..n xi = (x1 + x2 + . . . + xn)/n
Σi=1..n (xi − x̄) = 0
Rule 6:
Notation: double sums run over i = 1, . . . , n and j = 1, . . . , m.
Rule 7: Σi=1..n Σj=1..m f(xi, yj) = Σj=1..m Σi=1..n f(xi, yj)
(the order of summation does not matter)
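The summation rules can be checked numerically. A minimal sketch in Python; the data values below are made-up illustrative numbers, not from the slides:

```python
# Numerical check of the summation rules on a small, arbitrary sample.
n = 5
x = [2.0, 4.0, 6.0, 8.0, 10.0]
y = [1.0, 3.0, 5.0, 7.0, 9.0]
a, b = 3.0, 2.0

# Rule 2: sum of a*x_i equals a times the sum of x_i
assert sum(a * xi for xi in x) == a * sum(x)
# Rule 3: the sum distributes over addition
assert sum(xi + yi for xi, yi in zip(x, y)) == sum(x) + sum(y)
# Rule 4: combination of Rules 2 and 3
assert sum(a * xi + b * yi for xi, yi in zip(x, y)) == a * sum(x) + b * sum(y)
# Rule 5: deviations from the sample mean sum to zero
xbar = sum(x) / n
print(sum(xi - xbar for xi in x))  # 0.0
```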
Expected Value
1. Empirically: the expected value of a random variable, X, is the average value of the random variable in an infinite number of repetitions of the experiment. In other words, draw an infinite number of samples, and average the values of X that you get.
Expected Value
2. Analytically: the expected value of a discrete random variable, X, is determined by weighting all the possible values of X by the corresponding probability density function values, f(x), and summing them up. In other words:
E[X] = Σi xi·f(xi)
Empirical (sample) mean: x̄ = (1/n)·Σi=1..n xi, where n is the number of sample observations.
Analytical mean: E[X] = Σi=1..n xi·f(xi), where n is the number of possible values of xi.
Notice how the meaning of n changes.
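The two notions of the mean can be compared in a short simulation. A sketch using the pdf that appears in the variance example later in this chapter (x = 2, . . . , 6 with probabilities .1, .3, .1, .2, .3):

```python
import random

values = [2, 3, 4, 5, 6]
probs = [.1, .3, .1, .2, .3]

# Analytical mean: weight each possible value by f(x) and sum
EX = sum(x * f for x, f in zip(values, probs))
print(round(EX, 6))  # 4.3

# Empirical mean: average of many sample observations; it approaches E[X]
random.seed(1)
draws = random.choices(values, weights=probs, k=100_000)
print(sum(draws) / len(draws))  # close to 4.3
```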
E[X] = Σi xi·f(xi) = 1.9
E[X²] = Σi xi²·f(xi) = 4.9
E[X³] = Σi xi³·f(xi) = 14.5
E[g(X)] = Σi g(xi)·f(xi)
If g(X) = g1(X) + g2(X), then
E[g(X)] = Σi [g1(xi) + g2(xi)]·f(xi) = E[g1(X)] + E[g2(X)]
Adding a constant to a random variable adds that constant to its expected value:
E(X + a) = E(X) + a
Multiplying by a constant multiplies its expected value by that constant:
E(bX) = b·E(X)
Variance
var(X) = E[(X − EX)²]
= E[X² − 2X·EX + (EX)²]
= E(X²) − 2(EX)² + (EX)²
= E(X²) − (EX)²
variance of a discrete random variable, X:
var(X) = Σi (xi − EX)²·f(xi)
xi   f(xi)   xi − EX          (xi − EX)²·f(xi)
2    .1      2 − 4.3 = −2.3   5.29(.1) = .529
3    .3      3 − 4.3 = −1.3   1.69(.3) = .507
4    .1      4 − 4.3 = −.3    .09(.1) = .009
5    .2      5 − 4.3 = .7     .49(.2) = .098
6    .3      6 − 4.3 = 1.7    2.89(.3) = .867
var(X) = Σi (xi − EX)²·f(xi) = 2.01
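The table's arithmetic can be reproduced directly from the pdf, along with the E(X²) − (EX)² shortcut derived above; a small Python check:

```python
values = [2, 3, 4, 5, 6]
probs = [.1, .3, .1, .2, .3]

EX = sum(x * f for x, f in zip(values, probs))               # 4.3
var = sum((x - EX) ** 2 * f for x, f in zip(values, probs))  # definition
EX2 = sum(x ** 2 * f for x, f in zip(values, probs))         # E[X^2]
print(round(var, 6), round(EX2 - EX ** 2, 6))  # 2.01 2.01
```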
Z = a + cX
var(Z) = var(a + cX) = E[(a + cX) − E(a + cX)]² = c²·var(X)
Joint pdf
joint pdf f(x,y), where X = number of vacation homes owned and Y = number of college grads in the household:
          Y = 1          Y = 2
X = 0     f(0,1) = .45   f(0,2) = .15
X = 1     f(1,1) = .05   f(1,2) = .35
E(XY) = Σi Σj xi·yj·f(xi, yj)
E(XY) = (0)(1)(.45) + (0)(2)(.15) + (1)(1)(.05) + (1)(2)(.35) = .75
Marginal pdf
f(xi) = Σj f(xi, yj)
f(yj) = Σi f(xi, yj)
          Y = 1   Y = 2   marginal pdf for X:
X = 0     .45     .15     f(X = 0) = .60
X = 1     .05     .35     f(X = 1) = .40
marginal pdf for Y:
f(Y = 1) = .50,  f(Y = 2) = .50
Conditional pdf
f(x|y) = f(x,y) / f(y)
f(y|x) = f(x,y) / f(x)
conditional pdf values:
f(Y=1|X=0) = .45/.60 = .75     f(Y=2|X=0) = .15/.60 = .25
f(Y=1|X=1) = .05/.40 = .125    f(Y=2|X=1) = .35/.40 = .875
f(X=0|Y=1) = .45/.50 = .90     f(X=1|Y=1) = .05/.50 = .10
f(X=0|Y=2) = .15/.50 = .30     f(X=1|Y=2) = .35/.50 = .70
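The marginal and conditional calculations can be mechanized; a sketch using the joint pdf from these slides:

```python
# Joint pdf from the slides
f = {(0, 1): .45, (0, 2): .15, (1, 1): .05, (1, 2): .35}

# Marginals: sum the joint pdf over the other variable
fx = {x: sum(p for (xi, _), p in f.items() if xi == x) for x in (0, 1)}
fy = {y: sum(p for (_, yj), p in f.items() if yj == y) for y in (1, 2)}
print(round(fx[0], 3), round(fx[1], 3), round(fy[1], 3), round(fy[2], 3))  # 0.6 0.4 0.5 0.5

# Conditionals: f(y|x) = f(x,y) / f(x)
print(round(f[(0, 1)] / fx[0], 3))  # 0.75
print(round(f[(1, 2)] / fx[1], 3))  # 0.875
```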
Independence
X and Y are independent if and only if f(x,y) = f(x)·f(y) for all x and y.
not independent:
          Y = 1   Y = 2   marginal pdf for X:
X = 0     .45     .15     f(X = 0) = .60
X = 1     .05     .35     f(X = 1) = .40
marginal pdf for Y: f(Y = 1) = .50, f(Y = 2) = .50
Independence would require f(0,1) = .50×.60 = .30, f(0,2) = .50×.60 = .30, f(1,1) = .50×.40 = .20, and f(1,2) = .50×.40 = .20. The calculations in the boxes show the numbers required to have independence.
Covariance
EX = 0(.60) + 1(.40) = .40
EY = 1(.50) + 2(.50) = 1.50
EX·EY = (.40)(1.50) = .60
E(XY) = (0)(1)(.45) + (0)(2)(.15) + (1)(1)(.05) + (1)(2)(.35) = .75
covariance: cov(X,Y) = E(XY) − EX·EY = .75 − (.40)(1.50) = .75 − .60 = .15
Correlation
ρ(X,Y) = cov(X,Y) / √[var(X)·var(Y)]
EX = 0(.60) + 1(.40) = .40
EX² = 0²(.60) + 1²(.40) = .40, so var(X) = .40 − (.40)² = .24
EY = 1.50
EY² = 1²(.50) + 2²(.50) = .50 + 2.0 = 2.50, so var(Y) = E(Y²) − (EY)² = 2.50 − (1.50)² = .25
cov(X,Y) = .15
correlation: ρ(X,Y) = .15 / √(.24 × .25) = .61
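Covariance and correlation for this joint pdf can be checked in a few lines:

```python
# Joint pdf from the slides
f = {(0, 1): .45, (0, 2): .15, (1, 1): .05, (1, 2): .35}

EX = sum(x * p for (x, y), p in f.items())
EY = sum(y * p for (x, y), p in f.items())
EXY = sum(x * y * p for (x, y), p in f.items())
EX2 = sum(x * x * p for (x, y), p in f.items())
EY2 = sum(y * y * p for (x, y), p in f.items())

cov = EXY - EX * EY       # .75 - .60 = .15
varX = EX2 - EX ** 2      # .24
varY = EY2 - EY ** 2      # .25
rho = cov / (varX * varY) ** 0.5
print(round(cov, 4), round(rho, 2))  # 0.15 0.61
```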
Since expectation is a linear operator, it can be applied term by term.
Y ~ N(μ, σ²)
f(y) = [1/√(2πσ²)]·exp[−(y − μ)² / (2σ²)]
Z = (Y − μ)/σ
Z ~ N(0, 1)
f(z) = (1/√(2π))·exp(−z²/2)
Y ~ N(μ, σ²)
P[Y > a] = P[(Y − μ)/σ > (a − μ)/σ] = P[Z > (a − μ)/σ]
Y ~ N(μ, σ²)
P[a < Y < b] = P[(a − μ)/σ < (Y − μ)/σ < (b − μ)/σ] = P[(a − μ)/σ < Z < (b − μ)/σ]
A linear combination, W, of normal random variables is also normally distributed:
W ~ N[E(W), var(W)]
Chi-Square
If V ~ χ²(m), then
mean: E[V] = E[χ²(m)] = m
variance: var[V] = var[χ²(m)] = 2m
Student-t
t = Z / √(V/m) ~ t(m)
mean: E[t] = E[t(m)] = 0
variance: var[t] = var[t(m)] = m / (m − 2)
F Statistic
If V1 ~ χ²(m1) and V2 ~ χ²(m2), and if V1 and V2 are independent, then
F = (V1/m1) / (V2/m2) ~ F(m1, m2)
Chapter 3
[Figures: conditional pdfs of food expenditure, f(y|x = 480) and f(y|x = 800); average expenditure E(y|x) rises with income x.]
E(y|x) = β1 + β2x
intercept: β1; slope: ΔE(y|x)/Δx = β2
Figure 3.2 The Economic Model: a linear relationship between average expenditure on food and income.
[Figure: Homoskedastic case — the conditional pdfs f(yt) of expenditure at incomes x1 = 480 and x2 = 800 have the same variance.]
[Figure: Heteroskedastic case — the spread of the conditional pdfs f(yt) of expenditure increases with income across x1, x2, x3.]
E(y) = β1 + β2x
e = y − E(y) = y − β1 − β2x
This is called the random error.
[Figure: observations y1, . . . , y4 at x1, . . . , x4, with random errors e1, . . . , e4 measured as deviations from the line E(y) = β1 + β2x.]
[Figure: fitted line ŷ = b1 + b2x, with fitted values ŷ1, . . . , ŷ4 and least squares residuals ê1, . . . , ê4.]
[Figure: an alternative fitted line ŷ* = b1* + b2*x with residuals ê1*, . . . , ê4*, compared with the least squares line ŷ = b1 + b2x.]
[Figure: the pdf of the error, f(e), and the pdf of y, f(y), have the same shape; f(y) is centered at β1 + β2x.]
Unobservable Nature
of the Error Term
yt = β1 + β2xt + et
et = yt − β1 − β2xt
Minimize the error sum of squared deviations:
S(β1, β2) = Σt=1..T (yt − β1 − β2xt)²   (3.3.4)
Minimize w.r.t. β1 and β2:
S(β1, β2) = Σt=1..T (yt − β1 − β2xt)²   (3.3.4)
∂S/∂β1 = −2·Σ (yt − β1 − β2xt)
∂S/∂β2 = −2·Σ xt(yt − β1 − β2xt)
Minimize w.r.t. β1 and β2:
S(β) = Σt=1..T (yt − β1 − β2xt)²
[Figure: S(·) plotted against βi — the slope ∂S(·)/∂βi is negative to the left of the minimizing value bi, zero at bi, and positive to the right.]
∂S/∂β1 = −2·Σ (yt − b1 − b2xt) = 0
∂S/∂β2 = −2·Σ xt(yt − b1 − b2xt) = 0
−2·Σ (yt − b1 − b2xt) = 0
−2·Σ xt(yt − b1 − b2xt) = 0
Σ yt − T·b1 − b2·Σ xt = 0
Σ xtyt − b1·Σ xt − b2·Σ xt² = 0
The normal equations:
T·b1 + b2·Σ xt = Σ yt
b1·Σ xt + b2·Σ xt² = Σ xtyt
T·b1 + b2·Σ xt = Σ yt
b1·Σ xt + b2·Σ xt² = Σ xtyt
Solving for b1 and b2:
b2 = [T·Σ xtyt − Σ xt·Σ yt] / [T·Σ xt² − (Σ xt)²]
and b1 = ȳ − b2x̄
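The b1 and b2 formulas translate directly into code. A sketch with made-up (x, y) data, not from the slides:

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.5, 5.5, 8.0, 10.0]
T = len(x)

Sx, Sy = sum(x), sum(y)
Sxy = sum(xt * yt for xt, yt in zip(x, y))
Sxx = sum(xt * xt for xt in x)

# b2 = [T*sum(x*y) - sum(x)*sum(y)] / [T*sum(x^2) - (sum(x))^2]
b2 = (T * Sxy - Sx * Sy) / (T * Sxx - Sx ** 2)
# b1 = ybar - b2*xbar
b1 = Sy / T - b2 * Sx / T
print(round(b2, 4), round(b1, 4))  # 1.95 0.15
```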
elasticities
η = (Δy/y) / (Δx/x) = (Δy/Δx)·(x/y) = percentage change in y / percentage change in x
Using calculus, we can get the elasticity at a point:
η = lim(Δx→0) (Δy/Δx)·(x/y) = (∂y/∂x)·(x/y)
applying elasticities
E(y) = β1 + β2x, so ∂E(y)/∂x = β2
η = [∂E(y)/∂x]·[x/E(y)] = β2·x/E(y)
estimating elasticities
η̂ = (Δy/Δx)·(x/y) = b2·x̄/ȳ
ŷt = b1 + b2xt = 4 + 1.5xt
η̂ = b2·x̄/ȳ = 1.5 × (8/10) = 1.2
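The elasticity estimate on this slide is simple arithmetic; for completeness:

```python
# Elasticity at the means for the fitted line yhat = 4 + 1.5*x,
# using xbar = 8 and ybar = 10 as on the slide.
b2, xbar, ybar = 1.5, 8.0, 10.0
eta = b2 * xbar / ybar
print(eta)  # 1.2
```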
Prediction
ŷt = 4 + 1.5xt
xt = years of experience; ŷt = predicted wage rate
For any given value of xt, the predicted wage rate ŷt is obtained by substituting xt into the fitted equation.
log-log models
ln(y) = β1 + β2·ln(x)
∂ln(y)/∂ln(x) = β2
Since ∂ln(y)/∂x = (1/y)·∂y/∂x and ∂ln(x)/∂x = 1/x:
(1/y)·(∂y/∂x) = β2·(1/x)
(∂y/∂x)·(x/y) = β2
so in a log-log model the elasticity is the constant β2.
Chapter 4
4.1 Properties of Least Squares Estimators
yt = β1 + β2xt + et
yt = household weekly food expenditures
xt = household weekly income
For a given level of xt, the expected level of food expenditures is:
E(yt|xt) = β1 + β2xt
Assumptions of the simple linear regression model:
1. yt = β1 + β2xt + et
2. E(et) = 0, equivalently E(yt) = β1 + β2xt
3. var(et) = σ² = var(yt)
4. cov(ei, ej) = cov(yi, yj) = 0
5. xt is not constant: it takes at least two different values
6. et ~ N(0, σ²)  (optional)
The least squares estimators are:
b2 = [T·Σ xtyt − Σ xt·Σ yt] / [T·Σ xt² − (Σ xt)²]   (3.3.8a)
b1 = ȳ − b2x̄   (3.3.8b)
where ȳ = Σ yt / T and x̄ = Σ xt / T.
Substitute yt = β1 + β2xt + et into (3.3.8a) to get:
b2 = β2 + [T·Σ xtet − Σ xt·Σ et] / [T·Σ xt² − (Σ xt)²]
Taking expectations:
E(b2) = β2 + [T·Σ xt·E(et) − Σ xt·Σ E(et)] / [T·Σ xt² − (Σ xt)²]
Since E(et) = 0, then E(b2) = β2.
An Unbiased Estimator
b2 is an unbiased estimator of β2. By a similar argument, E(b1) = β1, so b1 is an unbiased estimator of β1.
b2 = [T·Σ xtyt − Σ xt·Σ yt] / [T·Σ xt² − (Σ xt)²]   (3.3.8a), (4.2.6)
Variance of b2
var(b2) = σ² / Σ (xt − x̄)²
Variance of b1
Given b1 = ȳ − b2x̄:
var(b1) = σ²·Σ xt² / [T·Σ (xt − x̄)²]
Covariance of b1 and b2
cov(b1, b2) = −σ²·x̄ / Σ (xt − x̄)²
Gauss-Markov Theorem
Under the first five assumptions of the simple linear regression model, the least squares estimators b1 and b2 are the best linear unbiased estimators (BLUE) of β1 and β2.
implications of Gauss-Markov
1. b1 and b2 are best within the class of linear and unbiased estimators.
2. "Best" means smallest variance within the class of linear unbiased estimators.
3. All of the first five assumptions must hold to satisfy Gauss-Markov.
4. Gauss-Markov does not require assumption six: normality.
5. Gauss-Markov is a property of the estimators b1 and b2, not of the least squares principle itself.
Probability Distribution of Least Squares Estimators
b1 ~ N( β1 , σ²·Σ xt² / [T·Σ (xt − x̄)²] )
b2 ~ N( β2 , σ² / Σ (xt − x̄)² )
b2 is a linear estimator:
b2 = Σ wtyt, where wt = (xt − x̄) / Σ (xt − x̄)²
b1 = ȳ − b2x̄
Consistency
êt = yt − b1 − b2xt
σ̂² = Σt=1..T êt² / (T − 2)
σ̂² is an unbiased estimator of σ²
The least squares predictor at x = xo:
ŷo = b1 + b2xo   (4.7.2)
Chapter 5
5.1 Inference in the Simple Regression Model
Review — assumptions of the simple linear regression model:
1. yt = β1 + β2xt + et
2. E(et) = 0
3. var(et) = σ² = var(yt)
4. cov(ei, ej) = cov(yi, yj) = 0
5. xt takes at least two different values
6. et ~ N(0, σ²)
Probability Distribution of Least Squares Estimators
b1 ~ N( β1 , σ²·Σ xt² / [T·Σ (xt − x̄)²] )
b2 ~ N( β2 , σ² / Σ (xt − x̄)² )
Unbiased estimator of the error variance:
σ̂² = Σ êt² / (T − 2)
Since b2 ~ N( β2 , σ² / Σ (xt − x̄)² ),
z = (b2 − β2) / √var(b2) ~ N(0, 1)
Create a Chi-Square
et ~ N(0, σ²), but we want a standard normal: et/σ ~ N(0, 1).
Squaring a standard normal gives a chi-square: (et/σ)² ~ χ²(1).
Sum of Chi-Squares
Σt=1..T (et/σ)² = (e1/σ)² + (e2/σ)² + . . . + (eT/σ)² ~ χ²(T)
Therefore, Σt=1..T (et/σ)² ~ χ²(T)
5.10
Copyright 1996
Lawrence C. Marsh
5.11
Student-t Distribution
t=
~ t(m)
V/m
where Z ~ N(0,1)
and V ~
(m)
2
t = Z / √[V/(T − 2)] ~ t(T−2)
where Z = (b2 − β2) / √var(b2) and var(b2) = σ² / Σ (xi − x̄)²
with V = (T − 2)·σ̂² / σ² ~ χ²(T−2):
t = [ (b2 − β2) / √var(b2) ] / √[ ((T − 2)·σ̂²/σ²) / (T − 2) ]
Substituting var(b2) = σ² / Σ (xi − x̄)² and cancelling (notice the cancellations):
t = [ (b2 − β2) / √(σ² / Σ (xi − x̄)²) ] / √(σ̂²/σ²)
  = (b2 − β2) / √[ σ̂² / Σ (xi − x̄)² ]
  = (b2 − β2) / √v̂ar(b2)
  = (b2 − β2) / se(b2)
Student's t-statistic:
t = (b2 − β2) / se(b2) ~ t(T−2)
[Figure: t-distribution with rejection regions of area α/2 in each tail, beyond −tc and tc.]
probability statements
P(t < −tc) = P(t > tc) = α/2
P(−tc ≤ t ≤ tc) = 1 − α
P(−tc ≤ (b2 − β2)/se(b2) ≤ tc) = 1 − α
Confidence Intervals
Two-sided (1−α)×100% C.I. for β1: [ b1 − tα/2·se(b1), b1 + tα/2·se(b1) ]
Two-sided (1−α)×100% C.I. for β2: [ b2 − tα/2·se(b2), b2 + tα/2·se(b2) ]
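A confidence interval is simple arithmetic once the critical value is known. A sketch using b2 = 0.1283 and se(b2) = 0.0305 from the food-expenditure example reported in Chapter 6 of these slides; the critical value tc = 2.024 is an assumed number for illustration only (look up a t table for your own degrees of freedom):

```python
b2, se_b2 = 0.1283, 0.0305
t_c = 2.024  # assumed critical value, for illustration only

lower = b2 - t_c * se_b2
upper = b2 + t_c * se_b2
print(round(lower, 4), round(upper, 4))  # 0.0666 0.19
```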
Hypothesis Tests
Components of a hypothesis test:
1. A null hypothesis, H0.
2. An alternative hypothesis, H1.
3. A test statistic.
4. A rejection region.
Rejection Rules
1. Two-Sided Test: if the value of the test statistic falls in the critical region in either tail of the t-distribution, we reject the null hypothesis in favor of the alternative.
2. Left-Tail Test: if the value of the test statistic falls in the critical region in the left tail of the t-distribution, we reject the null hypothesis in favor of the alternative.
3. Right-Tail Test: if the value of the test statistic falls in the critical region in the right tail of the t-distribution, we reject the null hypothesis in favor of the alternative.
Type I error: rejecting the null hypothesis when it is true.
α = P(rejecting H0 when it is true).
Type II error: failing to reject the null hypothesis when it is false.
β = P(failing to reject H0 when it is false).
Prediction Intervals
ŷo ± tc·se(f), where the forecast error is f = ŷo − yo
se(f) = √v̂ar(f)
v̂ar(f) = σ̂²·[ 1 + 1/T + (xo − x̄)² / Σ (xt − x̄)² ]
Chapter 6
Explaining Variation in yt
With no explanatory variable: yt = β1 + et.
Least squares gives Σt=1..T (yt − b1) = 0, so Σ yt − T·b1 = 0, hence b1 = ȳ.
The best guess for yt using no x is simply the sample mean ȳ.
Lawrence C. Marsh
Explaining Variation in yt
6.3
^
y t = b 1 + b2x t + e t
^
Explained variation: yt = b1 + b2xt
Unexplained variation:
^e = y ^y = y b b x
t
t
t
t
1
2 t
Explaining Variation in yt
yt = ŷt + êt
Using ȳ as the baseline: yt − ȳ = (ŷt − ȳ) + êt
Σt=1..T (yt − ȳ)² = Σt=1..T (ŷt − ȳ)² + Σt=1..T êt²
(the cross-product term drops out)
Total Variation in yt
SST = total sum of squares = Σt=1..T (yt − ȳ)²
Explained Variation in yt
Fitted values: ŷt = b1 + b2xt
SSR = regression sum of squares = Σt=1..T (ŷt − ȳ)²
Unexplained Variation in yt
SSE = error sum of squares
êt = yt − ŷt = yt − b1 − b2xt
SSE = Σt=1..T (yt − ŷt)² = Σt=1..T êt²
Lawrence C. Marsh
Coefficient of Determination
What proportion of the variation
in yt is explained?
2
0 R 1
2
R =
SSR
SST
6.9
Coefficient of Determination
SST = SSR + SSE
Dividing by SST: 1 = SSR/SST + SSE/SST
R² = SSR/SST = 1 − SSE/SST
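The decomposition and both forms of R² can be confirmed numerically. A sketch with made-up data (the same illustrative series used with the least squares formulas earlier, not from the slides):

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.5, 5.5, 8.0, 10.0]
T = len(x)

Sx, Sy = sum(x), sum(y)
b2 = (T * sum(a * b for a, b in zip(x, y)) - Sx * Sy) / (T * sum(a * a for a in x) - Sx ** 2)
b1 = Sy / T - b2 * Sx / T
ybar = Sy / T
yhat = [b1 + b2 * xt for xt in x]

SST = sum((yt - ybar) ** 2 for yt in y)
SSR = sum((yh - ybar) ** 2 for yh in yhat)
SSE = sum((yt - yh) ** 2 for yt, yh in zip(y, yhat))

assert abs(SST - (SSR + SSE)) < 1e-9  # cross-product term drops out
print(round(SSR / SST, 4), round(1 - SSE / SST, 4))  # 0.9877 0.9877
```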
Correlation Analysis
Population: ρ = cov(X,Y) / √[var(X)·var(Y)]
Sample: r = ĉov(X,Y) / √[v̂ar(X)·v̂ar(Y)]
Correlation Analysis
v̂ar(X) = Σt=1..T (xt − x̄)² / (T − 1)
v̂ar(Y) = Σt=1..T (yt − ȳ)² / (T − 1)
ĉov(X,Y) = Σt=1..T (xt − x̄)(yt − ȳ) / (T − 1)
Correlation Analysis
r = Σt=1..T (xt − x̄)(yt − ȳ) / √[ Σt=1..T (xt − x̄)² · Σt=1..T (yt − ȳ)² ]
In simple regression, r² = R².
Example calculations:
b2 = 0.1283
se(b1) = √v̂ar(b1) = √490.12 = 22.1387
se(b2) = √v̂ar(b2) = √0.0009326 = 0.0305
t = b1/se(b1) = 40.7676/22.1387 = 1.84
t = b2/se(b2) = 0.1283/0.0305 = 4.20
R² = SSR/SST = 1 − SSE/SST = 0.317
ŷt = 40.7676 + 0.1283xt
(s.e.)  (22.1387)   (0.0305)
ŷt = 40.7676 + 0.1283xt
(t)     (1.84)      (4.20)
R² = 0.317
Scaling x by a constant c:
yt = β1 + β2xt + et
yt = β1 + (cβ2)(xt/c) + et
yt = β1 + β2*·xt* + et, where β2* = cβ2 and xt* = xt/c
Scaling y by a constant c:
yt* = β1* + β2*·xt + et*, where yt* = yt/c, β1* = β1/c, β2* = β2/c, and et* = et/c
Scaling both y and x by c:
yt* = β1* + β2·xt* + et*, where yt* = yt/c, β1* = β1/c, et* = et/c, and xt* = xt/c
(the slope β2 is unchanged)
Functional Forms
The simple linear model can be applied to transformed variables, for example:
yt = β1 + β2xt + et
yt = β1 + β2·ln(xt) + et
ln(yt) = β1 + β2xt + et
yt = β1 + β2xt² + et
ln(yt) = β1 + β2·ln(xt) + et
yt = β1 + β2·(1/xt) + et
[Figure: a nonlinear relationship between food expenditure and income.]
Useful functional forms:
1. Linear
2. Reciprocal
3. Log-Log
4. Log-Linear
5. Linear-Log
6. Log-Inverse
Linear
yt = β1 + β2xt + et
slope: β2;  elasticity: β2·xt/yt
Reciprocal
yt = β1 + β2·(1/xt) + et
slope: −β2·(1/xt²);  elasticity: −β2·1/(xt·yt)
Log-Log
ln(yt) = β1 + β2·ln(xt) + et
slope: β2·yt/xt;  elasticity: β2
Log-Linear
ln(yt) = β1 + β2xt + et
slope: β2·yt;  elasticity: β2·xt
Linear-Log
yt = β1 + β2·ln(xt) + et
slope: β2·(1/xt);  elasticity: β2·(1/yt)
Log-Inverse
ln(yt) = β1 − β2·(1/xt) + et
slope: β2·yt/xt²;  elasticity: β2·(1/xt)
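The log-log claim — constant elasticity β2 at every x — can be verified numerically with a central-difference derivative. The values of β1 and β2 below are arbitrary illustrative numbers:

```python
import math

beta1, beta2 = 0.5, 0.8
y = lambda x: math.exp(beta1 + beta2 * math.log(x))  # ln(y) = b1 + b2*ln(x)

for x in (2.0, 10.0, 50.0):
    h = 1e-6
    slope = (y(x + h) - y(x - h)) / (2 * h)  # numerical dy/dx
    print(round(slope * x / y(x), 4))        # elasticity; 0.8 at every x
```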
Error term assumptions:
1. E(et) = 0
2. var(et) = σ²
3. cov(ei, ej) = 0
4. et ~ N(0, σ²)
Economic Models
1. Demand Models
2. Supply Models
3. Production Functions
4. Cost Functions
5. Phillips Curve
Economic Models
1. Demand Models
* quantity demanded (yd) and price (x)
* constant elasticity: ln(ytd) = β1 + β2·ln(xt) + et
Economic Models
2. Supply Models
* quantity supplied (ys) and price (x)
* constant elasticity: ln(yts) = β1 + β2·ln(xt) + et
Economic Models
3. Production Functions
* output (y) and input (x)
* constant elasticity: ln(yt) = β1 + β2·ln(xt) + et
Economic Models
4. Cost Functions
yt = β1 + β2xt² + et
Economic Models
5. Phillips Curve
Nonlinear in both variables and parameters:
%Δwt = (wt − wt−1)/wt−1, the percentage change in the wage rate, is modeled as a function of 1/ut, where ut is the unemployment rate.
Chapter 7
7.1 The Multiple Regression Model
In the multiple regression model, each coefficient is a partial derivative:
∂yt/∂xt2 = β2
∂yt/∂xt3 = β3
Correlated Variables
yt = β1 + β2xt2 + β3xt3 + et
yt = output; xt2 = capital; xt3 = labor
Lawrence C. Marsh
Statistical Properties of et
1. E(et) = 0
2. var(et) = 2
7.5
Statistical Properties of yt
1. E(yt) = β1 + β2xt2 + . . . + βKxtK
2. var(yt) = σ²
3. cov(yt, ys) = 0 for t ≠ s
Assumptions
1. yt = β1 + β2xt2 + . . . + βKxtK + et
2. E(yt) = β1 + β2xt2 + . . . + βKxtK
3. var(yt) = var(et) = σ²
4. cov(yt, ys) = cov(et, es) = 0 for t ≠ s
Minimize S(β1, β2, β3) = Σt=1..T (yt − β1 − β2xt2 − β3xt3)²
Define the variables in deviation-from-mean form:
yt* = yt − ȳ
xt2* = xt2 − x̄2
xt3* = xt3 − x̄3
The least squares estimators in deviation form:
b2 = [Σ yt*xt2*·Σ xt3*² − Σ yt*xt3*·Σ xt2*xt3*] / [Σ xt2*²·Σ xt3*² − (Σ xt2*xt3*)²]
b3 = [Σ yt*xt3*·Σ xt2*² − Σ yt*xt2*·Σ xt2*xt3*] / [Σ xt2*²·Σ xt3*² − (Σ xt2*xt3*)²]
b1 = ȳ − b2x̄2 − b3x̄3
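The deviation-form formulas can be checked by construction: build y exactly as 1 + 2·x2 + 3·x3 with no error term, and confirm the formulas recover those coefficients. The data are illustrative:

```python
x2 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x3 = [2.0, 1.0, 4.0, 3.0, 6.0, 5.0]
y = [1 + 2 * a + 3 * b for a, b in zip(x2, x3)]
T = len(y)

ybar, x2bar, x3bar = sum(y) / T, sum(x2) / T, sum(x3) / T
ys = [v - ybar for v in y]
x2s = [v - x2bar for v in x2]
x3s = [v - x3bar for v in x3]

S22 = sum(a * a for a in x2s)
S33 = sum(a * a for a in x3s)
S23 = sum(a * b for a, b in zip(x2s, x3s))
Sy2 = sum(a * b for a, b in zip(ys, x2s))
Sy3 = sum(a * b for a, b in zip(ys, x3s))

den = S22 * S33 - S23 ** 2
b2 = (Sy2 * S33 - Sy3 * S23) / den
b3 = (Sy3 * S22 - Sy2 * S23) / den
b1 = ybar - b2 * x2bar - b3 * x3bar
print(round(b1, 6), round(b2, 6), round(b3, 6))  # 1.0 2.0 3.0
```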
Dangers of Extrapolation
Unbiased estimator of the error variance:
σ̂² = Σ êt² / (T − K)
Gauss-Markov Theorem
Variances
yt = β1 + β2xt2 + β3xt3 + et
var(b2) = σ² / [(1 − r23²)·Σ (xt2 − x̄2)²]
var(b3) = σ² / [(1 − r23²)·Σ (xt3 − x̄3)²]
where r23 is the sample correlation between xt2 and xt3:
r23 = Σ (xt2 − x̄2)(xt3 − x̄3) / √[Σ (xt2 − x̄2)²·Σ (xt3 − x̄3)²]
When r23 = 0 these reduce to the simple regression formulas.
Variance Decomposition
Covariances
yt = β1 + β2xt2 + β3xt3 + et
cov(b2, b3) = −r23·σ² / [(1 − r23²)·√(Σ (xt2 − x̄2)²·Σ (xt3 − x̄3)²)]
Covariance Decomposition
Var-Cov Matrix
yt = β1 + β2xt2 + β3xt3 + et
The least squares estimators b1, b2, and b3 have the covariance matrix:
                 [ var(b1)      cov(b1,b2)   cov(b1,b3) ]
cov(b1,b2,b3) =  [ cov(b1,b2)   var(b2)      cov(b2,b3) ]
                 [ cov(b1,b3)   cov(b2,b3)   var(b3)    ]
Normal
yt ~ N(β1 + β2xt2 + . . . + βKxtK, σ²); this implies and is implied by et ~ N(0, σ²).
Since bk is a linear function of the yt's:
bk ~ N( βk , var(bk) )
z = (bk − βk) / √var(bk) ~ N(0, 1), for k = 1, 2, . . . , K
Student-t
Since σ̂² is used instead of σ²:
t = (bk − βk) / √v̂ar(bk) = (bk − βk) / se(bk) ~ t(T−K)
Interval Estimation
P( −tc ≤ (bk − βk)/se(bk) ≤ tc ) = 1 − α
where tc is chosen such that P(t > tc) = α/2.
P( bk − tc·se(bk) ≤ βk ≤ bk + tc·se(bk) ) = 1 − α
Interval endpoints: [ bk − tc·se(bk), bk + tc·se(bk) ]
Chapter 8
8.1 Hypothesis Testing and Nonsample Information
Chapter 8: Overview
1. Student-t Tests
2. Goodness-of-Fit
3. F-Tests
4. ANOVA Table
5. Nonsample Information
6. Collinearity
7. Prediction
Student-t Test
Examples of hypotheses testable with a t-test:
H0: β1 = 0
H0: β2 + β3 + β4 = 1
H0: 3β2 − 7β3 = 21
One-tail test:
t = b3 / se(b3) ~ t(T−K), df = T−K = T−4
[Figure: rejection region in the right tail beyond tc.]
Two-tail test:
t = b2 / se(b2) ~ t(T−K), df = T−K = T−4
[Figure: rejection regions in both tails beyond −tc and tc.]
Goodness-of-Fit
Coefficient of Determination
R² = SSR/SST = Σt=1..T (ŷt − ȳ)² / Σt=1..T (yt − ȳ)², with 0 ≤ R² ≤ 1
Adjusted R-Squared
Original: R² = SSR/SST = 1 − SSE/SST
Adjusted: R̄² = 1 − [SSE/(T−K)] / [SST/(T−1)]
Computer Output
t = b2 / se(b2) = −6.642 / 3.191 = −2.081
Reporting standard errors:
ŷt = b1 + b2Xt2 + b3Xt3
(s.e.)  (6.48)  (3.191)  (0.167)
Reporting t-statistics:
ŷt = b1 + b2Xt2 + b3Xt3
(t)  (16.17)  (−2.081)  (17.868)
Single-restriction F-test:
H0: β2 = 0; H1: β2 ≠ 0
dfn = J = 1; dfd = T−K = 49
F = [(1964.758 − 1805.168)/1] / [1805.168/(52 − 3)] = 4.33
F = [(SSER − SSEU)/J] / [SSEU/(T−K)]
dfn = J = 2; dfd = T−K = 49
First run the restricted regression by dropping Xt2 and Xt4 to get SSER.
Next run the unrestricted regression to get SSEU.
F-Tests
[Figure: F distribution with rejection region beyond Fc.]
yt = β1 + β2Xt2 + β3Xt3 + et
We ignore β1. Why?
F = [(SSER − SSEU)/J] / [SSEU/(T−K)]
H0: β2 = β3 = 0; H1: H0 not true
dfn = J = 2; dfd = T−K = 49; α = 0.05
F = [(13581.35 − 1805.168)/2] / [1805.168/(52 − 3)] = 159.828
Fc = 3.187: reject H0!
ANOVA Table
R² = SSR/SST = 11776.18/13581.35 = 0.867
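Both F statistics on the preceding slides follow from the reported sums of squares; a quick arithmetic check:

```python
SSE_U = 1805.168  # unrestricted SSE, from the slides
T, K = 52, 3

# Single restriction H0: beta2 = 0 (J = 1, SSE_R = 1964.758)
F1 = ((1964.758 - SSE_U) / 1) / (SSE_U / (T - K))
# Joint test H0: beta2 = beta3 = 0 (J = 2, SSE_R = SST = 13581.35)
F2 = ((13581.35 - SSE_U) / 2) / (SSE_U / (T - K))
print(round(F1, 2), round(F2, 3))  # 4.33 159.828
```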
Nonsample Information
Collinear Variables
Effects of Collinearity
Identifying Collinearity
Mitigating Collinearity
Prediction
yt = β1 + β2Xt2 + β3Xt3 + et
Given a set of values for the explanatory variables, (1, X02, X03), the best linear unbiased predictor of y is:
ŷ0 = b1 + b2X02 + b3X03
Chapter 9
9.1 Extensions of the Multiple Regression Model
An intercept dummy variable:
yt = β1 + β2Xt + β3Dt + et
yt = speed of car in miles per hour
Xt = age of car in years
Dt = 1 if red car, Dt = 0 otherwise
Police claim: red cars travel faster.
H0: β3 = 0 vs. H1: β3 > 0
[Figure: miles per hour versus age in years — red cars on a line with intercept β1 + β3, other cars with intercept β1, both with the same slope.]
A slope dummy variable:
yt = β1 + β2Xt + β3DtXt + et
Stock portfolio: Dt = 1; bond portfolio: Dt = 0; β1 = initial investment
stocks: yt = β1 + (β2 + β3)Xt + et
bonds: yt = β1 + β2Xt + et
[Figure: value of portfolio versus years — the stock line has slope β2 + β3, the bond line slope β2, with common intercept β1.]
Intercept and slope dummies:
regular seed (Dt = 0): yt = β1 + β2Xt + et
miracle seed (Dt = 1): yt = (β1 + β3) + (β2 + β4)Xt + et
[Figure: yield versus rainfall — the miracle-seed line has intercept β1 + β3 and slope β2 + β4; the regular-seed line has intercept β1 and slope β2.]
Copyright 1996
Lawrence C. Marsh
yt = β1 + β2xt + β3Dt + et
For men Dt = 1. For women Dt = 0.
Testing for discrimination in starting wage:
H0: β3 = 0
H1: β3 > 0
[Figure: wage rate vs. years of experience (xt) — men: yt = (β1 + β3) + β2xt + et, intercept β1 + β3; women: yt = β1 + β2xt + et, intercept β1; common slope β2]
9.7
yt = β1 + β5xt + β6Dtxt + et
9.8
For men Dt = 1. For women Dt = 0.
[Figure: wage rate vs. years of experience (xt) — men: yt = β1 + (β5 + β6)xt + et, slope β5 + β6; women: yt = β1 + β5xt + et, slope β5]
9.9
yt = β1 + β2xt + β3Dt + β4Dtxt + et
Women are started at a higher wage (note: β3 < 0).
[Figure: wage rate vs. years of experience (xt) — men: yt = (β1 + β3) + (β2 + β4)xt + et, intercept β1 + β3 and slope β2 + β4; women: yt = β1 + β2xt + et, intercept β1 and slope β2]
9.10
9.11
men: Dt = 1; women: Dt = 0
yt = β1 + β2xt + β3Dt + β4Dtxt + et
Testing for discrimination in starting wage (intercept):
H0: β3 = 0 vs. H1: β3 > 0, using t = b3 / √(est. var(b3)) ~ t(T−4)
Testing for discrimination in wage increases (slope):
H0: β4 = 0 vs. H1: β4 > 0, using t = b4 / √(est. var(b4)) ~ t(T−4)
9.12
Testing the intercept and slope dummies jointly:
H0: β3 = 0 and β4 = 0; H1: otherwise.
F = [(SSER − SSEU) / 2] / [SSEU / (T − 4)]
SSEU = Σt=1..T (yt − b1 − b2xt − b3Dt − b4Dtxt)²
SSER = Σt=1..T (yt − b1 − b2xt)²
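The restricted/unrestricted F-statistic above can be computed directly; the wage data below are hypothetical:

```python
import numpy as np

def joint_f(y, X_unrestricted, X_restricted, J):
    """F = [(SSE_R - SSE_U)/J] / [SSE_U/(T-K)] for J joint zero restrictions."""
    bu, *_ = np.linalg.lstsq(X_unrestricted, y, rcond=None)
    br, *_ = np.linalg.lstsq(X_restricted, y, rcond=None)
    sse_u = np.sum((y - X_unrestricted @ bu) ** 2)
    sse_r = np.sum((y - X_restricted @ br) ** 2)
    T, K = X_unrestricted.shape
    return ((sse_r - sse_u) / J) / (sse_u / (T - K))

# Hypothetical wage data: y = wage, x = experience, D = 1 for men.
y = np.array([10.0, 11.0, 13.0, 12.0, 15.0, 9.0, 10.0, 11.0, 11.5, 12.0])
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 1.0, 2.0, 3.0, 4.0, 5.0])
D = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])

XU = np.column_stack([np.ones_like(x), x, D, D * x])  # unrestricted model
XR = np.column_stack([np.ones_like(x), x])            # beta3 = beta4 = 0 imposed
F = joint_f(y, XU, XR, J=2)                           # compare with F(2, T-4)
```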
9.13
women: Dt = 0
yt = β1 + β2xt + β3Dt + β4Dtxt + et
H0: β3 = β4 = 0
yt = wage rate
xt = years of experience
9.14
II. Allowing for unequal variances (running three regressions):
Forcing men and women to have the same β1, β2 —
Everyone: yt = β1 + β2xt + et  →  SSER
9.15
Interaction Variables
1. Interaction Dummies
2. Polynomial Terms
(special case of continuous interaction)
3. Interaction Among Continuous Variables
1. Interaction Dummies
9.16
9.17
2. Polynomial Terms
Polynomial Regression
yt = income; xt = age
[Figure: inverted-U profile of income over ages 20 through 90]
Polynomial Regression
yt = income; xt = age
yt = β1 + β2xt + β3xt² + β4xt³ + et
9.18
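A quick sketch of fitting the cubic-in-age model by least squares; the income figures are invented:

```python
import numpy as np

# Hypothetical income-age data with an inverted-U shape (illustrative numbers only).
age = np.array([20.0, 30.0, 40.0, 50.0, 60.0, 70.0, 80.0])
income = np.array([20.0, 35.0, 48.0, 55.0, 52.0, 40.0, 25.0])

# yt = b1 + b2*x + b3*x^2 + b4*x^3 + et -- powers of age enter as regressors,
# so the model stays linear in the parameters.
X = np.column_stack([np.ones_like(age), age, age**2, age**3])
b, *_ = np.linalg.lstsq(X, income, rcond=None)
fitted = X @ b
```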
3. Continuous Interaction
9.19
continuous interaction
9.20
Exam grade as a function of study time (Bt) and sleep (Zt), with an interaction term:
yt = β1 + β2Bt + β3Zt + β4BtZt + et
∂yt/∂Bt = β2 + β4Zt
∂yt/∂Zt = β3 + β4Bt
Your mind sorts things out while you sleep (when you have things to sort out).
9.21
yt = β1 + β2Zt + β3Zt² + et
Sleep needed to maximize your exam grade:
∂yt/∂Zt = β2 + 2β3Zt = 0  ⇒  Zt = −β2 / (2β3)
where β2 > 0 and β3 < 0
9.22
yi = 1 if quits job; yi = 0 if does not quit
9.23
9.24
[Figure: probability of quitting plotted against xi2]
9.25
Probit Model
latent variable zi:  zi = β1 + β2xi2 + ei
9.26
f(zi) = (1/√(2π)) e^{−0.5zi²}
pi = ∫ from −∞ to β1 + β2xi2 of (1/√(2π)) e^{−0.5u²} du
9.27
Probit Model
Since zi = β1 + β2xi2 + ei, we can substitute in to get pi = Φ(β1 + β2xi2).
[Figure: pi as an S-shaped function of xi2]
9.28
Logit Model
Define pi:
pi = 1 / (1 + e^{−(β1 + β2xi2)})
For β2 > 0, pi approaches 1 as xi2 → +∞.
For β2 > 0, pi approaches 0 as xi2 → −∞.
Logit Model
9.29
pi = 1 / (1 + e^{−(β1 + β2xi2)})
[Figure: S-shaped pi curve against xi2 = total hours of work each week, with observations at yt = 1 and yt = 0]
Maximum Likelihood
9.30
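The logit model above can be estimated by maximum likelihood with a short Newton-Raphson loop — a sketch on hypothetical data (the slides do not spell out this algorithm):

```python
import numpy as np

# Hypothetical data: y = 1 with probability rising in x (hours of work).
x = np.array([10.0, 15.0, 20.0, 25.0, 30.0, 35.0, 40.0, 45.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 1.0])
X = np.column_stack([np.ones_like(x), x])

b = np.zeros(2)
for _ in range(50):                       # Newton-Raphson on the log-likelihood
    p = 1.0 / (1.0 + np.exp(-(X @ b)))    # logit probabilities
    grad = X.T @ (y - p)                  # score vector
    W = p * (1.0 - p)
    H = -(X * W[:, None]).T @ X           # Hessian (negative definite)
    b = b - np.linalg.solve(H, grad)

p_hat = 1.0 / (1.0 + np.exp(-(X @ b)))
```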
Chapter 10
10.1
Heteroskedasticity
10.2
Regression Model
10.3
yt = β1 + β2xt + et
zero mean: E(et) = 0
homoskedasticity: var(et) = σ²
nonautocorrelation: cov(et, es) = 0, t ≠ s
heteroskedasticity: var(et) = σt²
[Figure 10.4: scatter of consumption (yt) against income (xt); the spread of the points increases with income]
10.4
10.5
[Figure: conditional distributions of consumption (yt) at incomes x1, x2, x3, x4; the variance grows with income xt]
[Figure 10.6: consumption (yt) against income (xt); the scatter widens as income rises]
10.6
f(yt)
10.7
[Figure: consumption densities at incomes x1 < x2 < x3; poor people's consumption varies less than rich people's]
10.8
10.9
yt = β1 + β2xt + et
heteroskedasticity: var(et) = σt²
var(b2) = Σ(xt − x̄)² σt² / [Σ(xt − x̄)²]²
10.10
est. var(b2) = Σ(xt − x̄)² êt² / [Σ(xt − x̄)²]²
In large samples, White's standard error (the square root of the estimated variance) is a consistent measure.
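White's variance estimate can be computed directly from the OLS residuals; the data below are hypothetical:

```python
import numpy as np

# Hypothetical data for yt = b1 + b2*xt + et with possibly heteroskedastic errors.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.0, 4.5, 5.5, 9.0, 9.5, 14.0, 13.0, 18.0])

X = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e_hat = y - X @ b                          # OLS residuals

xbar = x.mean()
# est.var(b2) = sum((xt - xbar)^2 * e_t^2) / [sum((xt - xbar)^2)]^2
white_var_b2 = np.sum((x - xbar) ** 2 * e_hat ** 2) / np.sum((x - xbar) ** 2) ** 2
white_se_b2 = np.sqrt(white_var_b2)        # White's (robust) standard error
```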
10.11
10.12
Proportional Heteroskedasticity
yt = β1 + β2xt + et
E(et) = 0
var(et) = σt², where σt² = σ²xt
cov(et, es) = 0, t ≠ s
The variance is assumed to be proportional to the value of xt.
10.13
std. dev. proportional to √xt
yt = β1 + β2xt + et
variance: var(et) = σt² = σ²xt
standard deviation: σt = σ√xt
Divide every term by √xt:
yt/√xt = β1(1/√xt) + β2(xt/√xt) + et/√xt
10.14
yt/√xt = β1(1/√xt) + β2(xt/√xt) + et/√xt
The transformed error e*t = et/√xt satisfies var(e*t) = var(et)/xt = σ²xt/xt = σ².
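The √xt transformation amounts to weighted least squares; a sketch on invented data:

```python
import numpy as np

# Proportional heteroskedasticity, var(et) = sigma^2 * xt: dividing every term
# by sqrt(xt) yields a transformed model with constant error variance.
x = np.array([1.0, 2.0, 4.0, 5.0, 8.0, 10.0])
y = np.array([3.0, 5.5, 9.0, 12.0, 17.0, 22.0])

w = np.sqrt(x)
y_star = y / w
X_star = np.column_stack([1.0 / w, x / w])   # transformed regressors (1/sqrt(x), x/sqrt(x))

b_gls, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)  # GLS estimates (b1, b2)
```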
10.15
Partitioned Heteroskedasticity
10.16
yt = β1 + β2xt + et,  t = 1, …, 100
yt = bushels per acre of corn
xt = gallons of water per acre (rain or other)
10.17
Field corn (var(et) = σ1², t = 1, …, 80):
yt/σ1 = β1(1/σ1) + β2(xt/σ1) + et/σ1
Sweet corn (var(et) = σ2², t = 81, …, 100):
yt/σ2 = β1(1/σ2) + β2(xt/σ2) + et/σ2
10.18
Estimate σ1² from the 80 observations on field corn, and σ2² from the 20 observations on sweet corn.
10.19
Detecting Heteroskedasticity
Determine the existence and nature of the heteroskedasticity:
Residual Plots
10.20
[Figure: plot of the residuals êt against xt; a systematic change in the spread of the residuals suggests heteroskedasticity]
Goldfeld-Quandt Test
10.21
10.22
In the proportional case, order the observations by xt and drop the middle r observations, where r ≈ T/6; then run separate least squares regressions on the first T1 observations and the last T2 observations.
H0: σ1² = σ2²
H1: σ1² > σ2²
Goldfeld-Quandt test statistic: GQ = σ̂1² / σ̂2² ~ F(T1−K1, T2−K2). Use the F table.
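A sketch of the Goldfeld-Quandt procedure; the sorting step and subsample sizes follow the slide, while the data are invented (the spread of y grows with x):

```python
import numpy as np

def goldfeld_quandt(y, x, drop_frac=1.0 / 6.0):
    """GQ statistic: ratio of error-variance estimates from two subsamples
    after ordering by x and dropping the middle r ~ T/6 observations."""
    order = np.argsort(x)
    y, x = y[order], x[order]
    T = len(y)
    r = int(round(T * drop_frac))
    t1 = (T - r) // 2                          # size of each subsample
    def sigma2(ys, xs):
        X = np.column_stack([np.ones_like(xs), xs])
        b, *_ = np.linalg.lstsq(X, ys, rcond=None)
        e = ys - X @ b
        return e @ e / (len(ys) - 2)           # error variance estimate
    s_low = sigma2(y[:t1], x[:t1])
    s_high = sigma2(y[T - t1:], x[T - t1:])
    return max(s_low, s_high) / min(s_low, s_high)  # larger variance on top

# Hypothetical data with spread increasing in x.
x = np.linspace(1, 12, 12)
y = 2 + 3 * x + np.array([0.1, -0.1, 0.2, -0.2, 0.5, -0.5,
                          1.0, -1.0, 2.0, -2.0, 3.0, -3.0])
GQ = goldfeld_quandt(y, x)
```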
10.23
10.24
ln êt² = α + α1zt1 + α2zt2 + νt
Chapter 11
11.1
Autocorrelation
11.2
[Figure: error patterns over time — positive autocorrelation (et drifts in runs above and below 0), no autocorrelation, negative autocorrelation (et flips sign frequently)]
11.3
[Figure: residuals plotted against time t — positive autocorrelation: long runs on the same side of the line; negative autocorrelation: crosses the line too much (repelling)]
Regression Model
11.4
yt = β1 + β2xt + et
zero mean: E(et) = 0
homoskedasticity: var(et) = σ²
nonautocorrelation: cov(et, es) = 0, t ≠ s
autocorrelation: cov(et, es) ≠ 0, t ≠ s
Order of Autocorrelation
11.5
yt = β1 + β2xt + et
1st order: et = ρet−1 + νt
2nd order: et = ρ1et−1 + ρ2et−2 + νt
3rd order: et = ρ1et−1 + ρ2et−2 + ρ3et−3 + νt
We will assume first order autocorrelation, AR(1):
et = ρet−1 + νt
11.6
yt = β1 + β2xt + et
et = ρet−1 + νt
E(νt) = 0, var(νt) = σν², cov(νt, νs) = 0 for t ≠ s
E(et) = 0
var(et) = σe² = σν² / (1 − ρ²)
cov(et, et−k) = ρᵏσe² for k > 0
corr(et, et−k) = ρᵏ for k > 0
11.7
11.8
et = ρet−1 + νt
yt = β1 + β2xt + et  — substitute in for et:
yt = β1 + β2xt + ρet−1 + νt
Now we need to get rid of et−1.
(continued)
11.9
yt = β1 + β2xt + ρet−1 + νt
From yt = β1 + β2xt + et:  et = yt − β1 − β2xt
Lag the errors once:  et−1 = yt−1 − β1 − β2xt−1
(continued)
11.10
y*t = yt − ρyt−1
x*t2 = xt − ρxt−1
β1* = β1(1 − ρ)
11.11
y*t = β1* + β2x*t2 + νt
11.12
11.13
y1 = β1 + β2x1 + e1, with error variance var(e1) = σe² = σν²/(1 − ρ²).
We could include this as the 1st observation in our estimation procedure, but we must first transform it so that it has the same error variance as the other observations.
Note: the other observations all have error variance σν².
y1 = β1 + β2x1 + e1
11.14
11.15
y1 = β1 + β2x1 + e1
Multiply through by √(1 − ρ²) to get:
√(1 − ρ²) y1 = √(1 − ρ²) β1 + √(1 − ρ²) β2x1 + √(1 − ρ²) e1
11.16
Estimating ρ in et = ρet−1 + νt:
First, use least squares to estimate the model yt = β1 + β2xt + et.
The residuals from this estimation are:
êt = yt − b1 − b2xt
êt = yt − b1 − b2xt
11.17
Next, regress the residuals on their own lag: êt = ρêt−1 + ν̂t
The least squares solution is:
ρ̂ = Σt=2..T êt êt−1 / Σt=2..T êt−1²
Durbin-Watson Test
H0: ρ = 0
11.18
d = Σt=2..T (êt − êt−1)² / Σt=1..T êt²
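The Durbin-Watson statistic and the residual-based estimate of ρ from slide 11.17 can both be computed from OLS residuals; the data are hypothetical:

```python
import numpy as np

# Hypothetical data for yt = b1 + b2*xt + et.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9, 14.2, 15.8])

X = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b                                   # OLS residuals

# d = sum_{t=2..T} (e_t - e_{t-1})^2 / sum_t e_t^2 ; d near 2 suggests no AR(1) error.
d = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
rho_hat = e[1:] @ e[:-1] / (e[:-1] @ e[:-1])    # least squares estimate of rho
```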
11.19
d ≈ 2(1 − ρ̂)
When ρ̂ = 0, the Durbin-Watson statistic is d ≈ 2.
When ρ̂ = 1, the Durbin-Watson statistic is d ≈ 0.
Tables of critical values for d are not always readily available, so it is easier to use the p-value that most computer programs provide for d.
Reject H0 if p-value < α, the significance level.
11.20
ŷT+1 = β̂1 + β̂2xT+1 + ρ̂ẽT
where β̂1 and β̂2 are generalized least squares estimates and ẽT is given by:
ẽT = yT − β̂1 − β̂2xT
11.21
Chapter 12
12.1
Pooling
Time-Series and
Cross-Sectional Data
12.2
12.3
β1it = β1i
β2it = β2i
β3it = β3i
12.4
t = 1, . . . , 20
12.5
var(eWt) = σW²
cov(eGt, eGs) = 0
cov(eWt, eWs) = 0
cov(eGt, eWs) = 0
12.6
homoskedasticity assumption: σG² = σW²
β3W = β3G + δ3
β2W = β2G + δ2
12.7
12.8
12.9
cov(eGt, eWt) = σGW
12.10
12.11
12.12
12.13
Start with the residuals êGt and êWt from each equation estimated separately.
rGW² = σ̂GW² / (σ̂G² σ̂W²)
λ = T·rGW² ~ χ²(1) asymptotically
12.14
σ̂GW = (1/T) Σt êGt êWt
σ̂G² = (1/T) Σt êGt²
σ̂W² = (1/T) Σt êWt²
λ = T·rGW² = T σ̂GW² / (σ̂G² σ̂W²) ~ χ²(1) asymptotically
12.15
β1it = β1i
β2it = β2
β3it = β3
12.16
D2i = 1 if East, D2i = 0 otherwise
D3i = 1 if South, D3i = 0 otherwise
D4i = 1 if West, D4i = 0 otherwise
12.17
H0: β11 = β12 = β13 = β14
H1: H0 not true
The joint null hypothesis may be tested with the F-statistic:
F = [(SSER − SSEU) / J] / [SSEU / (NT − K)] ~ F(J, NT − K)
12.18
12.19
β1i = β̄1 + μi, where i = 1, …, N
E(μi) = 0, var(μi) = σμ²
E(β1i) = β̄1
var(β1i) = σμ²
12.20
12.21
combined error νit = μi + eit:
E(νit) = 0
var(νit) = σμ² + σe²
cov(νit, νis) = σμ² for t ≠ s
cov(νit, νjs) = 0 for i ≠ j
Chapter 13
13.1
Simultaneous
Equations
Models
13.2
The consumption function: c = β1 + β2y
Income is either consumed or invested: y = c + i
13.3
The consumption function: ct = β1 + β2yt + et
The income identity: yt = ct + it
13.4
13.5
ct = β1 + β2yt + et
yt = ct + it
Since yt contains et, they are correlated.
13.6
13.7
Simultaneous Equations:
[Flow diagram: it and et feed into the system; ct and yt are jointly (simultaneously) determined]
yt = ct + it
ct = β1 + β2(ct + it) + et
(1 − β2)ct = β1 + β2it + et
13.8
13.9
(1 − β2)ct = β1 + β2it + et
ct = β1/(1 − β2) + [β2/(1 − β2)] it + et/(1 − β2)
ct = π11 + π21 it + νt
The Reduced Form Equation
13.10
ct = π11 + π21 it + νt
π11 = β1/(1 − β2)
π21 = β2/(1 − β2)
νt = et/(1 − β2)
yt = ct + it, where ct = π11 + π21 it + νt
yt = π11 + (1 + π21) it + νt
It is sometimes useful to give this equation its own reduced form parameters as follows:
yt = π12 + π22 it + νt
13.11
ct = π11 + π21 it + νt
yt = π12 + π22 it + νt
13.12
π12 = π11 = β1/(1 − β2)
π22 = (1 + π21) = 1/(1 − β2)
13.13
Identification
The structural parameters are β1 and β2. The reduced form parameters are π11 and π21. Solving back from the reduced form:
β̂1 = π̂11 / (1 + π̂21)
β̂2 = π̂21 / (1 + π̂21)
Identification
13.14
13.15
A system of M equations containing M endogenous variables must exclude at least M − 1 variables from a given equation in order for the parameters of that equation to be identified and able to be consistently estimated.
13.16
2SLS: Stage I
13.18
ŷt1 = π̂11 + π̂21xt1 + π̂31xt2
ŷt2 = π̂12 + π̂22xt1 + π̂32xt2
yt1 = ŷt1 + ν̂t1
2SLS: Stage II
13.19
yt1 = ŷt1 + ν̂t1
Substitute in for yt1, yt2 and apply least squares.
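The two stages can be sketched explicitly. The system below is hypothetical (one right-hand-side endogenous variable, two exogenous instruments):

```python
import numpy as np

def two_sls(y1, Y_rhs, X_exog):
    """Two-stage least squares: Stage I regresses each right-hand-side endogenous
    variable on all exogenous variables; Stage II replaces it with its fitted values."""
    pi, *_ = np.linalg.lstsq(X_exog, Y_rhs, rcond=None)   # Stage I
    Y_hat = X_exog @ pi
    Z = np.column_stack([np.ones(len(y1)), Y_hat])        # Stage II regressors
    b, *_ = np.linalg.lstsq(Z, y1, rcond=None)
    return b

# Hypothetical data: y1 depends on endogenous y2; x1, x2 are exogenous instruments.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0, 8.0, 7.0])
y2 = 1.0 + 0.5 * x1 + 0.3 * x2 + np.array([0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2])
y1 = 2.0 + 1.5 * y2 + np.array([-0.1, 0.1, -0.2, 0.2, -0.1, 0.1, -0.2, 0.2])

X_exog = np.column_stack([np.ones_like(x1), x1, x2])
b_2sls = two_sls(y1, y2.reshape(-1, 1), X_exog)   # ~ (2.0, 1.5) with this small noise
```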
13.20
The second stage yields estimates of the structural parameters β1, β2, β3 and α1, α2 and α3.
Chapter 14
14.1
Nonlinear
Least
Squares
14.2
yt = β + et
et = yt − β
Σet² = Σ(yt − β)²
SSE = Σ(yt − β)²
∂SSE/∂β = −2Σ(yt − β̂) = 0  ⇒  Σyt − Tβ̂ = 0
Yields an exact analytical solution:
β̂ = (1/T)Σyt = ȳ
14.3
yt = βxt + et
et = yt − βxt
Σet² = Σ(yt − βxt)²
∂SSE/∂β = −2Σxt(yt − β̂xt) = 0
Σxtyt − β̂Σxt² = 0  ⇒  β̂Σxt² = Σxtyt
This yields an exact analytical solution:
β̂ = Σxtyt / Σxt²
14.4
yt = α + βxt + et
∂SSE/∂α = −2Σ(yt − α̂ − β̂xt) = 0
∂SSE/∂β = −2Σxt(yt − α̂ − β̂xt) = 0
Exact analytical solutions:
α̂ = ȳ − β̂x̄
β̂ = Σ(xt − x̄)(yt − ȳ) / Σ(xt − x̄)²
14.5
yt = xt^β + et
SSE = Σ(yt − xt^β)²
∂SSE/∂β = 0 requires  Σ[xt^β̂ ln(xt) yt] − Σ[xt^2β̂ ln(xt)] = 0
PROBLEM: An exact analytical solution to this does not exist.
14.6
SSE = Σ(yt − xt^β)² must be minimized numerically.
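Since no analytical solution exists, SSE(β) has to be minimized numerically; here is a sketch using naive interval refinement on invented data (real software uses Gauss-Newton or similar):

```python
import numpy as np

# Hypothetical data generated near yt = xt**1.7 + small noise.
x = np.array([1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = x ** 1.7 + np.array([0.05, -0.05, 0.1, -0.1, 0.05, -0.05])

def sse(beta):
    return np.sum((y - x ** beta) ** 2)

# Repeatedly evaluate SSE on a grid and shrink the search interval around the minimum.
lo, hi = 0.0, 3.0
for _ in range(60):
    grid = np.linspace(lo, hi, 41)
    best = grid[np.argmin([sse(b) for b in grid])]
    step = (hi - lo) / 40
    lo, hi = best - step, best + step

beta_hat = best    # should land very close to the true 1.7
```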
Conclusion
The least squares principle
is still appropriate when the
model is nonlinear, but it is
harder to find the solution.
14.7
Optional Appendix
Nonlinear least squares
optimization methods:
14.8
14.9
14.10
yt = f(Xt, b) + εt for t = 1, …, n.
Expand f(Xt, b) in a first-order Taylor series around b = b(o):
yt ≅ f(Xt, b(o)) + f′(Xt, b(o))(b − b(o)) + εt
Move the known pieces to the left side:
yt − f(Xt, b(o)) + f′(Xt, b(o)) b(o) = f′(Xt, b(o)) b + εt
y*t = f′(Xt, b(o)) b + εt  — this is linear in b.
In matrix terms: y* = F b + ε, where F = ∂f(X, b)/∂b evaluated at b(o).
14.13
Apply least squares to y* = F b + ε to get:
b̂ = [Fᵀ F]⁻¹ Fᵀ y*
equivalently:  b̂ = b(o) + [Fᵀ F]⁻¹ Fᵀ [y − f(X, b(o))]
14.14
14.15
14.16
for t = 1, . . . , n, with AR(1) errors et = ρet−1 + ut.
E(ut) = 0, E(ut²) = σu², E(ut us) = 0 for s ≠ t.
Therefore, ut is nonautocorrelated and homoskedastic.
Durbin's Method: set aside a copy of the equation, lag it once, multiply by ρ and subtract the new equation from the original equation, then move the ρyt−1 term to the right side and estimate ρ along with the b's by OLS.
14.17
yt = f(Xt, b) + εt for t = 1, . . . , n,
where εt = ρεt−1 + ut
14.18
γ1 = b1(1 − ρ)
γ2 = b2
γ3 = −ρb2
γ4 = b3
γ5 = −ρb3
γ6 = ρ
14.19
Given OLS estimates γ̂1, γ̂2, γ̂3, γ̂4, γ̂5, γ̂6, we can get three separate and distinct estimates for ρ:
ρ̂ = −γ̂3/γ̂2,  ρ̂ = −γ̂5/γ̂4,  ρ̂ = γ̂6
14.20
14.21
∂f(Xt, b)/∂b = [∂yt/∂b1, ∂yt/∂b2, ∂yt/∂b3, ∂yt/∂ρ]
∂yt/∂b1 = 1 − ρ
∂yt/∂b2 = xt,2 − ρxt−1,2
∂yt/∂b3 = xt,3 − ρxt−1,3
∂yt/∂ρ = −b1 − b2xt−1,2 − b3xt−1,3 + yt−1
14.22
Iterate, evaluating the gradient at the current estimates b(m) = (b1(m), b2(m), b3(m), ρ(m))ᵀ:
∂f(Xt, b(m))/∂b = [∂yt/∂b1(m), ∂yt/∂b2(m), ∂yt/∂b3(m), ∂yt/∂ρ(m)]
to obtain b(m+1), and continue until convergence.
Chapter 15
15.1
Distributed
Lag Models
15.2
Economic action
at time t
Effect
at time t+1
Effect
at time t+2
15.3
Unstructured Lags
n unstructured lags
no systematic structure imposed on the βi's
the βi's are unrestricted
15.4
15.5
β0 = (n+1)γ
β1 = nγ
β2 = (n−1)γ
β3 = (n−2)γ
…
βn−2 = 3γ
βn−1 = 2γ
βn = γ
15.6
βi = (n − i + 1)γ
15.7
For n = 4:
yt = α + γzt + et
15.8
β1 = nγ
β2 = (n−1)γ
…
βn = γ
[Figure: linear (arithmetic) lag structure — the weights decline linearly to 0 at lag n+1]
15.9
βi = γ0 + γ1i + γ2i² + … + γp iᵖ, where i = 0, 1, …, n
For example, a quadratic polynomial:
βi = γ0 + γ1i + γ2i², where i = 0, 1, …, n
With p = 2 and n = 4:
β0 = γ0
β1 = γ0 + γ1 + γ2
β2 = γ0 + 2γ1 + 4γ2
β3 = γ0 + 3γ1 + 9γ2
β4 = γ0 + 4γ1 + 16γ2
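The polynomial (Almon) lag can be sketched end to end — build the constructed regressors zt_j = Σi iʲ·x_{t−i}, regress, then recover the β̂i's. The data below are invented:

```python
import numpy as np

# Quadratic (p = 2) polynomial lag with n = 4: beta_i = g0 + g1*i + g2*i^2.
n, p = 4, 2
x = np.array([1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5, 5.0, 4.5, 6.0])
y = np.array([2.0, 3.0, 3.5, 5.0, 5.5, 7.0, 7.5, 9.0, 9.5, 11.0])

T = len(x)
rows = range(n, T)                       # usable observations t = n .. T-1
Z = np.zeros((T - n, p + 1))
for j in range(p + 1):
    for k, t in enumerate(rows):
        # z_tj = sum over lags i of i^j * x_{t-i}  (note 0**0 == 1 in Python)
        Z[k, j] = sum((i ** j) * x[t - i] for i in range(n + 1))

X = np.column_stack([np.ones(T - n), Z])
coef, *_ = np.linalg.lstsq(X, y[n:], rcond=None)
alpha, g = coef[0], coef[1:]

# Recover the lag weights from the polynomial coefficients.
beta = np.array([g[0] + g[1] * i + g[2] * i ** 2 for i in range(n + 1)])
```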
15.10
15.11
15.12
yt = α + γ0zt0 + γ1zt1 + γ2zt2 + et
Step 5: Express the β̂i's in terms of γ̂0, γ̂1, and γ̂2:
β̂0 = γ̂0
β̂1 = γ̂0 + γ̂1 + γ̂2
β̂2 = γ̂0 + 2γ̂1 + 4γ̂2
β̂3 = γ̂0 + 3γ̂1 + 9γ̂2
β̂4 = γ̂0 + 4γ̂1 + 16γ̂2
15.13
[Figure 15.3: estimated polynomial lag weights β̂i plotted against the lag i]
15.14
Geometric Lag:
yt = α + β0xt + β1xt−1 + β2xt−2 + … + et
yt = α + Σi=0..∞ βi xt−i + et   (15.3.1)
βi = βφⁱ
15.15
Substitute βi = βφⁱ:
β0 = β
β1 = βφ
β2 = βφ²
β3 = βφ³
…
15.16
impact multiplier: β
long-run multiplier: β(1 + φ + φ² + φ³ + …) = β/(1 − φ)
15.17
β0 = β
β1 = βφ
β2 = βφ²
β3 = βφ³
β4 = βφ⁴
[Figure 15.5: geometrically declining weights]
15.18
15.19
15.20
yt = δ + φyt−1 + βxt + νt
15.21
yt = δ + φyt−1 + βxt + νt
The original structural parameters can now be estimated in terms of these reduced form parameter estimates:
φ̂ = the estimated coefficient on yt−1
β̂ = the estimated coefficient on xt
α̂ = δ̂ / (1 − φ̂)
15.22
ŷt = α̂ + β̂(xt + φ̂xt−1 + φ̂²xt−2 + φ̂³xt−3 + …) + êt
β̂0 = β̂
β̂1 = β̂φ̂
β̂2 = β̂φ̂²
β̂3 = β̂φ̂³
…
ŷt = α̂ + β̂0xt + β̂1xt−1 + β̂2xt−2 + β̂3xt−3 + … + êt
15.23
Durbin's h-test for autocorrelation:
h = ρ̂ √[ (T − 1) / (1 − (T − 1)[se(b2)]²) ]
T = sample size
15.24
Adaptive Expectations
yt = α + βx*t + et
where x*t is the expected (anticipated) value of xt, which is not directly observed.
15.25
Adaptive Expectations
adjust expectations
based on past realization:
15.26
Adaptive Expectations
x*t − x*t−1 = λ(xt−1 − x*t−1)
rearrange to get:
x*t = λxt−1 + (1 − λ)x*t−1
15.27
Adaptive Expectations
yt = α + βx*t + et
15.28
Adaptive Expectations
yt = αλ + (1 − λ)yt−1 + βλxt−1 + ut
where ut = et − (1 − λ)et−1
15.29
Adaptive Expectations
yt = αλ + (1 − λ)yt−1 + βλxt−1 + ut
Use ordinary least squares regression on:
yt = β1 + β2yt−1 + β3xt−1 + ut
and we get:
λ̂ = (1 − b2)
α̂ = b1 / (1 − b2)
β̂ = b3 / (1 − b2)
15.30
Partial Adjustment
The optimal or desired level is y*t = α + βxt + et. Actual yt adjusts only partially, with 0 < γ < 1, towards the optimal or desired level y*t:
yt − yt−1 = γ(y*t − yt−1)
15.31
Partial Adjustment
yt = γα + (1 − γ)yt−1 + γβxt + γet
15.32
Partial Adjustment
yt = γα + (1 − γ)yt−1 + γβxt + γet
yt = β1 + β2yt−1 + β3xt + νt
Use ordinary least squares regression to get:
γ̂ = (1 − b2)
α̂ = b1 / (1 − b2)
β̂ = b3 / (1 − b2)
Chapter 16
16.1
Time
Series
Analysis
16.2
16.3
16.4
16.5
16.6
16.7
yt = δ + θ1yt−1 + et,  t = 1, 2, …, T.   (16.1.1)
δ is the intercept.
θ1 is a parameter generally between −1 and +1.
et is an uncorrelated random error with mean zero and variance σe².
16.8
yt = δ + θ1yt−1 + … + θpyt−p + et   (16.1.2)
δ is the intercept.
The θi's are parameters generally between −1 and +1.
et is an uncorrelated random error with mean zero and variance σe².
16.9
16.10
ŷt = 0.5051 + 1.5537 yt−1 − 0.6515 yt−2
              (0.0707)      (0.0708)
The coefficient on yt−1 is positive; the coefficient on yt−2 is negative.
Note: Q1-1948 through Q1-1978 from J.D. Cryer (1986); see unempl.dat
16.11
16.12
Partial Autocorrelation Function
θ̂kk is the last (kth) coefficient when an AR(k) model is fit; bounds at ±2/√T.
Data simulated from this model: yt = 0.5 yt−1 + 0.3 yt−2 + et
This sample PACF suggests a second order process, AR(2), which is correct.
16.13
ŷT+1 = δ̂ + θ̂1 yT + θ̂2 yT−1
      = 0.5051 + (1.5537)(6.2) − (0.6515)(6.63)
      = 5.8186
ŷT+2 = δ̂ + θ̂1 ŷT+1 + θ̂2 yT
      = 0.5051 + (1.5537)(5.8186) − (0.6515)(6.2)
      = 5.5062
ŷT+3 = δ̂ + θ̂1 ŷT+2 + θ̂2 ŷT+1
      = 0.5051 + (1.5537)(5.5062) − (0.6515)(5.8186)
      = 5.2693
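The recursion above can be reproduced directly in a few lines of pure Python; each forecast feeds the next, exactly as on the slide:

```python
# AR(2) estimates from the slides, with y_T = 6.2 and y_{T-1} = 6.63.
delta, theta1, theta2 = 0.5051, 1.5537, -0.6515

history = [6.63, 6.2]              # y_{T-1}, y_T
forecasts = []
for _ in range(3):
    y_next = delta + theta1 * history[-1] + theta2 * history[-2]
    forecasts.append(y_next)
    history.append(y_next)         # the forecast becomes the next "observation"
# forecasts match the slide's 5.8186, 5.5062, 5.2693 up to rounding
```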
16.14
(16.2.1)
16.15
An MA(1) process:
yt = μ + et + α1et−1   (16.2.2)
S(μ, α1) = Σt=1..T êt² = Σt=1..T (yt − μ − α1êt−1)²   (16.2.3)
16.16
nonstationary:
A nonstationary time series is one whose mean, variance or autocorrelation function changes over time.
16.17
yt = zt − zt−1
where zt is the original nonstationary series and yt is the new stationary series.
16.18
16.19
16.19
Autocorrelation Function
Data simulated from this model: yt = et − 0.9 et−1
rkk is the last (kth) coefficient; bounds at ±2/√T.
This sample ACF suggests a first order process, MA(1), which is correct.
16.20
16.21
Integrated Processes
16.22
16.23
Unit Root
zt = θ1zt−1 + μ + et + α1et−1   (16.3.2)
−1 < θ1 < 1  ⇒  stationary ARMA(1,1)
θ1 = 1  ⇒  nonstationary process (unit root)
16.24
Δzt = θ1*zt−1 + μ + et + α1et−1   (16.3.3)
where Δzt = zt − zt−1 and θ1* = θ1 − 1
16.25
H0: θ1* = 0  vs.  H1: θ1* < 0   (16.3.4)
16.26
16.27
2. Estimation
linear or nonlinear least squares.
3. Diagnostic Checking
model fits well with no autocorrelation?
4. Forecasting
short-term forecasts of future yt values.
16.28
16.29
1. extension of AR model.
2. all variables endogenous.
3. no structural (behavioral) economic model.
4. all variables jointly determined (over time).
5. no simultaneous equations (same time).
16.30
16.31
16.32
16.33
Spurious Regressions
yt = β1 + β2xt + εt, where εt = ρ1εt−1 + νt
−1 < ρ1 < 1: stationary errors
ρ1 = 1: nonstationary errors — danger of a spurious regression
16.34
Cointegration
yt = β1 + β2xt + εt
However, if xt and yt are both nonstationary I(1) series and the errors εt are stationary, then xt and yt are cointegrated.
16.35
yt = θ0 + θ1yt−1 + θ2xt−1 + et
xt = φ0 + φ1yt−1 + φ2xt−1 + ut
If xt and yt are both I(1) and are cointegrated,
use an Error Correction Model instead of a VAR(1) in levels.
16.36
16.37
Δyt = α0* + α1(yt−1 − β1 − β2xt−1) + et
Δxt = φ0* + φ2(yt−1 − β1 − β2xt−1) + ut
where the intercepts α0*, φ0* and the cointegrating parameters β1, β2 are functions of the VAR(1) parameters.
16.38
The error correction term: yt−1 − β1 − β2xt−1
16.39
Δyt = α0 + α1ε̂t−1 + et
Δxt = φ0 + φ2ε̂t−1 + ut
16.40
Chapter 17
17.1
Guidelines for
Research Project
17.2
Formulation
economic ====> econometric.
Estimation
selecting appropriate method.
Interpretation
how the xt's impact the yt.
Inference
testing, intervals, prediction.
17.3
17.4
Experimental Data
from controlled experiments.
17.5
17.6
Micro Data:
data collected on individual economic
decision making units such as individuals,
households or firms.
Macro Data:
data resulting from a pooling or aggregating
over individuals, households or firms at the
local, state or national levels.
17.7
Flow Data:
outcome measured over a period of time,
such as the consumption of gasoline during
the last quarter of 1997.
Stock Data:
outcome measured at a particular point in
time, such as crude oil held by Chevron in
US storage tanks on April 1, 1997.
17.8
17.9
International Data
17.10
17.11
17.12
Citibase on CD-ROM
17.13
Citibase on CD-ROM
(continued)
17.14
17.15
17.16
http://seamonkey.ed.asu.edu/~behrens/teach/WWW_data.html
http://www.sims.berkeley.edu/~hal/pages/interesting.html
http://www.stls.frb.org FED RESERVE BK - ST. LOUIS
http://www.bls.gov
http://nber.harvard.edu
http://www.inform.umd.edu:8080/EdRes/Topic/EconData/.www/econdata.html UNIVERSITY OF MARYLAND
http://www.bog.frb.fed.us FED BOARD OF GOVERNORS
http://www.webcom.com/~yardeni/economic.html
17.17
17.18
Controlled Experiments
17.19
17.20
Selecting a Topic
17.21
Writing an Abstract
17.22