Y = 2 + 3X2 + X3
X3 = 2X2 − 1

X2    X3     Y
10    19    51
11    21    56
12    23    61
13    25    66
14    27    71
15    29    76
Suppose that Y = 2 + 3X2 + X3 and that X3 = 2X2 − 1. There is no disturbance term in the
equation for Y, but that is not important. Suppose that we have the six observations shown.
MULTICOLLINEARITY
[Figure: Y, X2, and X3 for the six observations, plotted as line graphs on a scale from 0 to 80.]
The three variables are plotted as line graphs above. Looking at the data, it is impossible to
tell whether the changes in Y are caused by changes in X2, by changes in X3, or jointly by
changes in both X2 and X3.
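The ambiguity can be checked directly. A minimal sketch using the six observations above (the code is only for illustration; the slides themselves use none):

```python
# A minimal sketch (values from the table above): with X3 = 2*X2 - 1, the
# relationship Y = 2 + 3*X2 + X3 collapses to Y = 1 + 5*X2, so the data
# cannot discriminate between the two explanations.
X2 = [10, 11, 12, 13, 14, 15]
X3 = [2 * x - 1 for x in X2]
Y = [2 + 3 * x2 + x3 for x2, x3 in zip(X2, X3)]

print(Y)                                              # [51, 56, 61, 66, 71, 76]
print(all(y == 1 + 5 * x2 for x2, y in zip(X2, Y)))   # True
```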
Y = 2 + 3X2 + X3
X3 = 2X2 − 1

                          Change from previous observation
X2    X3     Y                  X2      X3
10    19    51                  –       –
11    21    56                  1       2
12    23    61                  1       2
13    25    66                  1       2
14    27    71                  1       2
15    29    76                  1       2
[Figure: the same line graphs, annotated with one relationship consistent with the data: Y = 1 + 5X2 ?]
[Figure: the same line graphs, annotated with another relationship consistent with the data: Y = 3.5 + 2.5X3 ?]
These two possibilities are special cases of Y = 3.5 − 2.5p + 5pX2 + 2.5(1 − p)X3, which would
fit the relationship for any value of p.
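This family of fits can be verified numerically. A sketch, again using the six observations:

```python
# Sketch verifying the family above: for ANY p, the fitted values of
# Y = (3.5 - 2.5p) + 5p*X2 + 2.5(1 - p)*X3 coincide with Y = 2 + 3*X2 + X3,
# because X3 = 2*X2 - 1 makes the two expressions algebraically identical.
X2 = [10, 11, 12, 13, 14, 15]
X3 = [2 * x - 1 for x in X2]
Y = [2 + 3 * x2 + x3 for x2, x3 in zip(X2, X3)]

for p in (-1.0, 0.0, 0.5, 1.0, 7.3):          # arbitrary choices of p
    fitted = [(3.5 - 2.5 * p) + 5 * p * x2 + 2.5 * (1 - p) * x3
              for x2, x3 in zip(X2, X3)]
    assert all(abs(f - y) < 1e-9 for f, y in zip(fitted, Y))
print("exact fit for every p tested")
```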
There is no way that regression analysis, or any other technique, could determine the true
relationship from this infinite set of possibilities, given the sample data.
Y = β1 + β2X2 + β3X3 + u
X3 = λ + μX2
What would happen if you tried to run a regression when there is an exact linear
relationship among the explanatory variables?
We will investigate, using the model with two explanatory variables shown above. [Note: A
disturbance term has now been included in the true model, but it makes no difference to the
analysis.]
The expression for the multiple regression coefficient b2 is shown below. We will substitute
for X3 using its relationship with X2.

b2 = [Σ(X2i − X̄2)(Yi − Ȳ) Σ(X3i − X̄3)² − Σ(X3i − X̄3)(Yi − Ȳ) Σ(X2i − X̄2)(X3i − X̄3)]
       / [Σ(X2i − X̄2)² Σ(X3i − X̄3)² − (Σ(X2i − X̄2)(X3i − X̄3))²]

Since X3i = λ + μX2i and X̄3 = λ + μX̄2, the deviations of X3 are proportional to those of X2:
X3i − X̄3 = μ(X2i − X̄2). Hence

Σ(X3i − X̄3)² = μ² Σ(X2i − X̄2)²
Σ(X2i − X̄2)(X3i − X̄3) = μ Σ(X2i − X̄2)²
Σ(X3i − X̄3)(Yi − Ȳ) = μ Σ(X2i − X̄2)(Yi − Ȳ)

Making these replacements term by term,

b2 = [μ² Σ(X2i − X̄2)(Yi − Ȳ) Σ(X2i − X̄2)² − μ² Σ(X2i − X̄2)(Yi − Ȳ) Σ(X2i − X̄2)²]
       / [μ² (Σ(X2i − X̄2)²)² − μ² (Σ(X2i − X̄2)²)²]

It turns out that the numerator and the denominator are both equal to zero. The regression
coefficient is not defined.
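The 0/0 result can be confirmed numerically with the six observations from earlier, a sketch for illustration only:

```python
# Numerical sketch: with X3 an exact linear function of X2, the numerator
# and the denominator of the b2 formula both evaluate to zero.
import numpy as np

X2 = np.array([10.0, 11.0, 12.0, 13.0, 14.0, 15.0])
X3 = 2 * X2 - 1
Y = 2 + 3 * X2 + X3

d2, d3, dY = X2 - X2.mean(), X3 - X3.mean(), Y - Y.mean()
num = (d2 @ dY) * (d3 @ d3) - (d3 @ dY) * (d2 @ d3)
den = (d2 @ d2) * (d3 @ d3) - (d2 @ d3) ** 2
print(num, den)                                # both are zero
```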
. reg EARNINGS S EXP EXPSQ

      Source |       SS       df       MS              Number of obs =     540
-------------+------------------------------           F(  3,   536) =   45.57
       Model |  22762.4472     3  7587.48241           Prob > F      =  0.0000
    Residual |  89247.7839   536  166.507059           R-squared     =  0.2032
-------------+------------------------------           Adj R-squared =  0.1988
       Total |  112010.231   539  207.811189           Root MSE      =  12.904

------------------------------------------------------------------------------
    EARNINGS |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           S |   2.754372   .2417286    11.39   0.000     2.279521    3.229224
         EXP |  -.2353907    .665197    -0.35   0.724    -1.542103    1.071322
       EXPSQ |   .0267843   .0219115     1.22   0.222    -.0162586    .0698272
       _cons |  -22.21964   5.514827    -4.03   0.000    -33.05297   -11.38632
------------------------------------------------------------------------------
. reg EARNINGS S EXP

      Source |       SS       df       MS              Number of obs =     540
-------------+------------------------------           F(  2,   537) =   67.54
       Model |  22513.6473     2  11256.8237           Prob > F      =  0.0000
    Residual |  89496.5838   537  166.660305           R-squared     =  0.2010
-------------+------------------------------           Adj R-squared =  0.1980
       Total |  112010.231   539  207.811189           Root MSE      =   12.91

------------------------------------------------------------------------------
    EARNINGS |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           S |   2.678125   .2336497    11.46   0.000     2.219146    3.137105
         EXP |   .5624326   .1285136     4.38   0.000     .3099816    .8148837
       _cons |  -26.48501    4.27251    -6.20   0.000    -34.87789   -18.09213
------------------------------------------------------------------------------
. reg EARNINGS S EXP EXPSQ

------------------------------------------------------------------------------
    EARNINGS |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           S |   2.754372   .2417286    11.39   0.000     2.279521    3.229224
         EXP |  -.2353907    .665197    -0.35   0.724    -1.542103    1.071322
       EXPSQ |   .0267843   .0219115     1.22   0.222    -.0162586    .0698272
       _cons |  -22.21964   5.514827    -4.03   0.000    -33.05297   -11.38632
------------------------------------------------------------------------------

. reg EARNINGS S EXP

------------------------------------------------------------------------------
    EARNINGS |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           S |   2.678125   .2336497    11.46   0.000     2.219146    3.137105
         EXP |   .5624326   .1285136     4.38   0.000     .3099816    .8148837
       _cons |  -26.48501    4.27251    -6.20   0.000    -34.87789   -18.09213
------------------------------------------------------------------------------
The high correlation causes the standard error of EXP to be larger than it would have been
if EXP and EXPSQ had been less highly correlated, warning us that the point estimate is
unreliable.
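The source of the problem is easy to demonstrate. A sketch with synthetic values (the 0 to 20 year range is an assumption, not the actual data set):

```python
# Illustrative calculation with synthetic values: experience and its square
# are very highly correlated over a typical range, which is what inflates
# the standard errors of EXP and EXPSQ.
import numpy as np

rng = np.random.default_rng(0)
EXP = rng.uniform(0, 20, 540)                  # hypothetical experience values
EXPSQ = EXP ** 2
r = np.corrcoef(EXP, EXPSQ)[0, 1]
print(round(r, 2))                             # roughly 0.97
```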
When high correlations among the explanatory variables lead to erratic point estimates of
the coefficients, large standard errors, and unsatisfactorily low t statistics, the regression is
said to be suffering from multicollinearity.
Note that the coefficients remain unbiased and the standard errors remain valid.
σb2² = [σu² / (n · MSD(X2))] × [1 / (1 − rX2,X3²)],   where MSD(X2) = (1/n) Σ(X2i − X̄2)²
What can you do about multicollinearity if you encounter it? We will discuss some possible
measures, looking at the model with two explanatory variables.
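The variance expression σu²/(n·MSD(X2)) × 1/(1 − r²) can be checked by simulation. A sketch with assumed parameter values and synthetic data:

```python
# Simulation sketch (synthetic data, assumed parameter values) checking
# var(b2) = sigma_u^2 / (n * MSD(X2)) * 1 / (1 - r^2),
# where r is the correlation between X2 and X3.
import numpy as np

rng = np.random.default_rng(1)
n, sigma_u = 30, 2.0
X2 = np.linspace(1, 10, n)
X3 = 0.8 * X2 + rng.normal(0, 1, n)            # correlated but not collinear
X = np.column_stack([np.ones(n), X2, X3])

r2 = np.corrcoef(X2, X3)[0, 1] ** 2
msd = ((X2 - X2.mean()) ** 2).mean()
formula = sigma_u ** 2 / (n * msd) / (1 - r2)

draws = []
for _ in range(20000):                         # repeated samples, X held fixed
    y = 1 + 2 * X2 + 3 * X3 + rng.normal(0, sigma_u, n)
    draws.append(np.linalg.lstsq(X, y, rcond=None)[0][1])
print(formula, np.var(draws))                  # the two agree closely
```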
Before doing this, two important points should be emphasized. First, multicollinearity does
not cause the regression coefficients to be biased.
Second, the standard errors and t tests remain valid. The standard errors are larger than
they would have been in the absence of multicollinearity, warning us that the regression
estimates are erratic.
Since the problem of multicollinearity is caused by the variances of the coefficients being
unsatisfactorily large, we will seek ways of reducing them.
(1) Reduce σu².
We will focus on the slope coefficient and look at the various components of its variance.
We might be able to reduce it by bringing more variables into the model, thereby reducing σu²,
the variance of the disturbance term.
The estimator of the variance of the disturbance term is the residual sum of squares divided
by n − k, where n is the number of observations (540) and k is the number of parameters (4).
Here it is 166.5.
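The arithmetic, using the figures from the regression output:

```python
# s2 = RSS / (n - k) estimates the variance of the disturbance term,
# and Root MSE is its square root.
rss, n, k = 89247.7839, 540, 4
s2 = rss / (n - k)
print(round(s2, 3))                            # 166.507, the Residual MS
print(round(s2 ** 0.5, 3))                     # 12.904, the Root MSE
```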
Number of obs =     540
F(  5,   534) =   37.24
Prob > F      =  0.0000
R-squared     =  0.2585
Adj R-squared =  0.2516
Root MSE      =  12.471

------------------------------------------------------------------------------
    EARNINGS |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           S |   2.031419    .296218     6.86   0.000     1.449524    2.613315
         EXP |  -.0816828   .6441767    -0.13   0.899    -1.347114    1.183748
       EXPSQ |   .0130223    .021334     0.61   0.542    -.0288866    .0549311
        MALE |   5.762358   1.104734     5.22   0.000     3.592201    7.932515
      ASVABC |   .2447687   .0714294     3.43   0.001     .1044516    .3850858
       _cons |  -26.18541   5.452032    -4.80   0.000    -36.89547   -15.47535
------------------------------------------------------------------------------
We now add two new variables that are often found to be determinants of earnings: MALE,
sex of respondent, and ASVABC, the composite score on the cognitive tests in the Armed
Services Vocational Aptitude Battery.
MALE is a qualitative variable and the treatment of such variables will be explained in
Chapter 5.
Both MALE and ASVABC have coefficients significant at the 0.1% level.
Without MALE and ASVABC:            With MALE and ASVABC:

Number of obs =     540             Number of obs =     540
F(  3,   536) =   45.57             F(  5,   534) =   37.24
Prob > F      =  0.0000             Prob > F      =  0.0000
R-squared     =  0.2032             R-squared     =  0.2585
Adj R-squared =  0.1988             Adj R-squared =  0.2516
Root MSE      =  12.904             Root MSE      =  12.471
However, they account for only a small proportion of the variance in earnings, and the
reduction in the estimate of the variance of the disturbance term is likewise small.
As a consequence, the impact on the standard errors of EXP and EXPSQ is negligible.
Note how unstable the coefficients are. This is often a sign of multicollinearity.
Note also that the standard error of the coefficient of S has actually increased. This is
attributable to the correlation of 0.58 between S and ASVABC.
This is a common problem with this approach to attempting to reduce the problem of
multicollinearity. If the new variables are linearly related to one or more of the variables
already in the equation, their inclusion may make the problem of multicollinearity worse.
(2) Increase n.
The next factor to look at is n, the number of observations. If you are working with
cross-section data (individuals, households, enterprises, etc.) and you are undertaking a
survey, you could increase the size of the sample by negotiating a bigger budget.
Alternatively, you could make a fixed budget go further by using a technique known as
clustering. You divide the country geographically by zip code or postal area.
You select a number of these randomly, perhaps using stratified random sampling to make
sure that metropolitan, other urban, and rural areas are properly represented.
You then confine the survey to the areas selected. This reduces the travel time and cost of
the fieldworkers, allowing them to interview a greater number of respondents.
If you are working with time series data, you may be able to increase the sample by working
with shorter time intervals for the data, for example quarterly or even monthly data instead
of annual data.
Number of obs =    2714
F(  5, 2708)  =  183.99
Prob > F      =  0.0000
R-squared     =  0.2536
Adj R-squared =  0.2522
Root MSE      =  13.262

------------------------------------------------------------------------------
    EARNINGS |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           S |   2.312461    .135428    17.08   0.000     2.046909    2.578014
         EXP |  -.3270651    .308231    -1.06   0.289    -.9314569    .2773268
       EXPSQ |    .023743   .0101558     2.34   0.019     .0038291    .0436569
        MALE |   5.947206   .5221755    11.39   0.000     4.923303    6.971108
      ASVABC |   .2086846   .0336869     6.19   0.000     .1426301    .2747392
       _cons |  -27.40462   2.579435   -10.62   0.000    -32.46248   -22.34676
------------------------------------------------------------------------------
Here is the result of running the regression with all 2,714 observations in the EAEF data set.
Comparing this result with that using Data Set 21, we see that the standard errors are much
smaller, as expected.
As a consequence, the t statistics of the variables are higher. However, the correlation
between EXP and EXPSQ is as high as in the smaller sample, and the increase in the sample
size has not been large enough to have much impact on the problem of multicollinearity.
The coefficients of EXP and EXPSQ both still have unexpected signs, since we expect the
coefficient of EXP to be positive and that of EXPSQ to be negative, reflecting diminishing
returns.
It is possible that this has occurred as a matter of chance. Alternatively, it might be an
indication that the model is misspecified.
As we will see in the next and subsequent chapters, there are good reasons for supposing
that the dependent variable in an earnings function should be the logarithm of earnings,
rather than earnings in linear form.
(3) Increase MSD(X2).
A third possible way of reducing the problem of multicollinearity might be to increase the
variation in the explanatory variables. This is possible only at the design stage of a survey.
For example, if you were planning a household survey with the aim of investigating how
expenditure patterns vary with income, you should make sure that the sample included
relatively rich and relatively poor households as well as middle-income households.
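A quick numerical sketch of this point (illustrative numbers, not survey data): MSD(X2) sits in the denominator of the variance expression, so a wider designed spread of X2 shrinks the standard deviation of b2 proportionally.

```python
# Sketch: the variance of b2 has MSD(X2) in its denominator, so designing
# the sample with a wider spread of X2 reduces the standard deviation of b2.
import numpy as np

def sd_b2(X2, sigma_u=1.0, r2=0.0):
    # square root of sigma_u^2 / (n * MSD(X2)) * 1 / (1 - r2)
    msd = ((X2 - X2.mean()) ** 2).mean()
    return np.sqrt(sigma_u ** 2 / (len(X2) * msd) / (1 - r2))

narrow = np.linspace(9, 11, 100)               # middle-income households only
wide = np.linspace(0, 20, 100)                 # poor to rich households
print(round(sd_b2(narrow) / sd_b2(wide), 3))   # 10.0: ten times the spread,
                                               # one tenth the standard deviation
```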
(4) Reduce rX2,X3.
Another possibility might be to reduce the correlation between the explanatory variables.
This is possible only at the design stage of a survey and even then it is not easy.
(5) Combine the correlated variables.
If the correlated variables are similar conceptually, it may be reasonable to combine them
into some overall index.
That is precisely what has been done with the three cognitive ASVAB variables. ASVABC
has been calculated as a weighted average of ASVAB02 (arithmetic reasoning), ASVAB03
(word knowledge), and ASVAB04 (paragraph comprehension).
The three components are highly correlated and by combining them as a weighted average,
rather than using them individually, one avoids a potential problem of multicollinearity.
(6) Drop some of the correlated variables.
Dropping some of the correlated variables, if they have insignificant coefficients, may
alleviate multicollinearity.
However, the correlated variables with insignificant coefficients may in fact belong in the
model.
If that is the case, their omission may cause omitted variable bias, to be discussed in
Chapter 6.
(7) Empirical restriction

Y = β1 + β2X + β3P + u
For example, suppose that Y in the equation above is the demand for a category of
consumer expenditure, X is aggregate disposable personal income, and P is a price index
for the category.
To fit a model of this type you would use time series data. If X and P are highly correlated,
which is often the case with time series variables, the problem of multicollinearity might be
eliminated in the following way.
Obtain data on income and expenditure on the category from a household survey and
regress Y' on X'. (The ' marks are to indicate that the data are household data, not
aggregate data.)
This is a simple regression because there will be relatively little variation in the price paid
by the households.
Y = β1 + β2X + β3P + u
Y = β1 + b2'X + β3P + u
Z = Y − b2'X = β1 + β3P + u
Now substitute b2' for β2 in the time series model. Subtract b2'X from both sides, and regress
Z = Y − b2'X on price. This is a simple regression, so multicollinearity has been eliminated.
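The two-step idea can be sketched with synthetic data. All numbers and variable names below are hypothetical, not from any actual survey:

```python
# Hedged sketch of the empirical-restriction idea with synthetic data.
import numpy as np

rng = np.random.default_rng(2)

# Step 1 (cross-section): household income varies a lot while price is fixed,
# so a simple regression of Y' on X' cleanly estimates beta2 (true value 0.3).
Xh = rng.uniform(10, 100, 500)
Yh = 4 + 0.3 * Xh + rng.normal(0, 1, 500)
b2_cs = np.polyfit(Xh, Yh, 1)[0]

# Step 2 (time series): X and P move together over time. Form Z = Y - b2'*X
# and regress Z on P alone, a simple regression free of multicollinearity.
t = np.arange(40.0)
X = 50 + 2 * t + rng.normal(0, 1, 40)
P = 100 + 4 * t + rng.normal(0, 1, 40)         # highly correlated with X
Y = 4 + 0.3 * X - 0.1 * P + rng.normal(0, 1, 40)
Z = Y - b2_cs * X
b3 = np.polyfit(P, Z, 1)[0]
print(round(b2_cs, 2), round(b3, 2))           # near the true values 0.3, -0.1
```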
There are some problems with this technique. First, the β2 coefficients may be conceptually
different in time series and cross-section contexts.
Second, since we subtract the estimated income component b2'X, not the true income
component β2X, from Y when constructing Z, we have introduced an element of
measurement error into the dependent variable.
(8) Theoretical restriction
Last, but by no means least, is the use of a theoretical restriction, which is defined as a
hypothetical relationship among the parameters of a regression model.
S = β1 + β2ASVABC + β3SM + β4SF + u
Number of obs =     540
F(  3,   536) =  104.30
Prob > F      =  0.0000
R-squared     =  0.3686
Adj R-squared =  0.3651
Root MSE      =   1.943

------------------------------------------------------------------------------
           S |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      ASVABC |   .1257087   .0098533    12.76   0.000     .1063528    .1450646
          SM |   .0492424   .0390901     1.26   0.208     -.027546    .1260309
          SF |   .1076825   .0309522     3.48   0.001      .04688     .1684851
       _cons |   5.370631   .4882155    11.00   0.000      4.41158    6.329681
------------------------------------------------------------------------------
S increases by 0.05 years for every extra year of schooling of the mother and 0.11 years for
every extra year of schooling of the father.
Mother's education is generally held to be at least as important as, if not more important
than, father's education for educational attainment, so this outcome is unexpected.
It is also surprising that the coefficient of SM is not significant, even at the 5% level, using a
one-sided test.
. cor SM SF
(obs=540)

        |       SM       SF
--------+------------------
     SM |   1.0000
     SF |   0.6241   1.0000
However, assortative mating leads to correlation between SM and SF, and the regression
appears to be suffering from multicollinearity.
S = β1 + β2ASVABC + β3SM + β4SF + u
β3 = β4
Suppose that we hypothesize that mother's and father's education are equally important. We
can then impose the restriction β3 = β4.
S = β1 + β2ASVABC + β3(SM + SF) + u
  = β1 + β2ASVABC + β3SP + u
Defining SP to be the sum of SM and SF, the equation may be rewritten as shown. The
problem caused by the correlation between SM and SF has been eliminated.
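The restriction can be sketched with synthetic data (hypothetical numbers, not the actual data set):

```python
# Sketch of the restriction beta3 = beta4: when it holds, regressing on
# SP = SM + SF replaces two highly correlated regressors with a single
# well-determined one.
import numpy as np

rng = np.random.default_rng(3)
n = 540
SM = rng.normal(12, 2, n)
SF = SM + rng.normal(0, 1.5, n)                # assortative mating: SM ~ SF
S = 5 + 0.08 * SM + 0.08 * SF + rng.normal(0, 2, n)   # restriction is true

SP = SM + SF
X = np.column_stack([np.ones(n), SP])
coef, *_ = np.linalg.lstsq(X, S, rcond=None)
print(round(coef[1], 2))                       # close to the common value 0.08
```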
Number of obs =     540
F(  2,   537) =  156.04
Prob > F      =  0.0000
R-squared     =  0.3675
Adj R-squared =  0.3652
Root MSE      =  1.9429

------------------------------------------------------------------------------
           S |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      ASVABC |   .1253106   .0098434    12.73   0.000     .1059743    .1446469
          SP |   .0828368   .0164247     5.04   0.000     .0505722    .1151014
       _cons |    5.29617   .4817972    10.99   0.000     4.349731    6.242608
------------------------------------------------------------------------------
The standard error of SP is much smaller than those of SM and SF. The use of the
restriction has led to a large gain in efficiency and the problem of multicollinearity has been
eliminated.
The t statistic is very high. Thus it would appear that imposing the restriction has improved
the regression results. However, the restriction may not be valid. We should test it. Testing
theoretical restrictions is one of the topics in Chapter 6.