
1)

Imagine you regressed earnings of individuals on a constant, a binary variable (Male)


which takes on the value 1 for males and is 0 otherwise, and another binary variable
(Female) which takes on the value 1 for females and is 0 otherwise. Because females
typically earn less than males, you would expect
a. the coefficient for Male to have a positive sign, and for Female a negative sign.
b. both coefficients to be the same distance from the constant, one above and the
other below.
c. none of the OLS estimators to exist because there is perfect multicollinearity.
d. this to yield a difference in means statistic.
Answer: c

2)

When you have an omitted variable problem, the assumption that E(ui | Xi) = 0 is violated.
This implies that
a. the sum of the residuals is no longer zero.
b. there is another estimator called weighted least squares, which is BLUE.
c. the sum of the residuals times any of the explanatory variables is no longer zero.
d. the OLS estimator is no longer consistent.

Answer: d
3)

If you had a two regressor regression model, then omitting one variable which is relevant
a. will have no effect on the coefficient of the included variable if the correlation
between the excluded and the included variable is negative.
b. will always bias the coefficient of the included variable upwards.
c. can result in a negative value for the coefficient of the included variable, even
though the coefficient would be significantly positive if the omitted variable
were included.
d. makes the sum of the product between the included variable and the residuals
different from 0.
Answer: c

4)

(Requires Calculus) In the multiple regression model you estimate the effect on Yi of a
unit change in one of the Xi while holding all other regressors constant. This
a. makes little sense, because in the real world all other variables change.
b. corresponds to the economic principle of mutatis mutandis.
c. leaves the formula for the coefficient in the single explanatory variable case

unaffected.
d. corresponds to taking a partial derivative in mathematics.
5)

Answer: d
In a two regressor regression model, if you exclude one of the relevant variables then
a. it is no longer reasonable to assume that the errors are homoskedastic.
b. OLS is no longer unbiased, but still consistent.
c. you are no longer controlling for the influence of the other variable.
d. the OLS estimator no longer exists.

Answer: c
6)

Under imperfect multicollinearity


a. the OLS estimator cannot be computed.
b. two or more of the regressors are highly correlated.
c. the OLS estimator is biased even in samples of n > 100.
d. the error terms are highly, but not perfectly, correlated.
Answer: b

7)

The OLS residuals in the multiple regression model


a. cannot be calculated because there is more than one explanatory variable.
b. can be calculated by subtracting the fitted values from the actual values.
c. are zero because the predicted values are another name for forecasted values.
d. are typically the same as the population regression function errors.

Answer: b

8)

Omitted variable bias


a. will always be present as long as the regression R2 < 1
b. is always there but is negligible in almost all economic examples
c. exists if the omitted variable is correlated with the included regressor but is not a
determinant of the dependent variable
d. exists if the omitted variable is correlated with the included regressor and is a
determinant of the dependent variable
Answer: d

9)

The following OLS assumption is most likely violated by omitted variables bias:
a. E(ui | Xi) = 0.
b. (Xi, Yi), i = 1, ..., n are i.i.d. draws from their joint distribution.
c. there are no outliers for Xi, ui.
d. there is heteroskedasticity.

Answer: a
10)

In the multiple regression model Yi = β0 + β1X1i + β2X2i + ... + βkXki + ui, i = 1, ..., n, the
OLS estimators are obtained by minimizing the sum of
a. squared mistakes in Σ_{i=1}^{n} (Yi − b0 − b1X1i − ... − bkXki)².
b. squared mistakes in Σ_{i=1}^{n} (Yi − b0 − b1X1i − ... − bkXki − ui)².
c. absolute mistakes in Σ_{i=1}^{n} |Yi − b0 − b1X1i − ... − bkXki|.
d. squared mistakes in Σ_{i=1}^{n} (Yi − b0 − b1Xi)².

Answer: a
11)

In multiple regression, the R² increases whenever a regressor is
a. added unless the coefficient on the added regressor is exactly zero.
b. added.
c. added unless there is heteroskedasticity.
d. greater than 1.96 in absolute value.

Answer: a
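The point behind answer a can be illustrated directly: R² never falls when a regressor is added, because OLS could always set the new coefficient to zero, and it increases unless the fitted coefficient is exactly zero. A minimal sketch with simulated data (all numbers illustrative):

```python
import numpy as np

# Simulate y depending only on x1; "noise" is an irrelevant regressor.
rng = np.random.default_rng(3)
n = 100
x1 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)
noise = rng.normal(size=n)

def r_squared(y, X):
    """R-squared from an OLS fit of y on the columns of X."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_small = r_squared(y, np.column_stack([np.ones(n), x1]))
r2_big = r_squared(y, np.column_stack([np.ones(n), x1, noise]))
print(r2_big >= r2_small)  # True: adding a regressor cannot lower R-squared
```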
12)

All of the following are examples of joint hypotheses on multiple regression coefficients,
with the exception of
a. H0: β1 + β2 = 1.
b. H0: β3/β2 = 1 and β4 = 0.
c. H0: β2 = 0 and β3 = 0.
d. H0: β1 = β2 and β1 + β2 = 1.

Answer: a
13)

Let R²unrestricted and R²restricted be 0.4366 and 0.4149, respectively. The difference between the
unrestricted and the restricted model is that you have imposed two restrictions. There are 420
observations. The F-statistic in this case is
a. 4.61.
b. 8.01.
c. 10.34.
d. 7.71.

Answer: b
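The arithmetic behind answer b uses the homoskedasticity-only F-statistic based on the two R² values. The question does not state the number of regressors in the unrestricted model; k = 3 below is an assumption, chosen because it is the value consistent with the 8.01 answer:

```python
# Homoskedasticity-only F-statistic from restricted and unrestricted R-squared:
# F = ((R2_u - R2_r) / q) / ((1 - R2_u) / (n - k - 1))
# k = 3 unrestricted regressors is assumed (not given in the question).
R2_u, R2_r = 0.4366, 0.4149  # unrestricted and restricted R-squared
q = 2                        # number of restrictions imposed
n, k = 420, 3                # observations, unrestricted regressors

F = ((R2_u - R2_r) / q) / ((1 - R2_u) / (n - k - 1))
print(round(F, 2))           # 8.01
```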
14)

If you wanted to test, using a 5% significance level, whether or not a specific slope
coefficient is equal to one, then you should
a. subtract 1 from the estimated coefficient, divide the difference by the standard
error, and check if the resulting ratio is larger than 1.96 in absolute value.
b. add and subtract 1.96 from the slope and check if that interval includes 1.
c. see if the slope coefficient is between 0.95 and 1.05.
d. check if the adjusted R2 is close to 1.

15)

Answer: a
A 95% confidence set for two or more coefficients is a set that contains
a. the sample values of these coefficients in 95% of randomly drawn samples.
b. integer values only.
c. the same values as the 95% confidence intervals constructed for the coefficients.
d. the population values of these coefficients in 95% of randomly drawn samples.

Answer: d
16)

When testing the null hypothesis that two regression slopes are zero simultaneously, then
you cannot reject the null hypothesis at the 5% level, if the ellipse contains the point
a. (−1.96, 1.96).
b. |(0, 1.96)|.
c. (0, 0).
d. (1.96², 1.96²).

Answer: c
17)

At a mathematical level, if the two conditions for omitted variable bias are satisfied, then
a. E(ui | X1i, X2i, ..., Xki) ≠ 0.
b. there is perfect multicollinearity.
c. large outliers are likely: X1i, X2i, ..., Xki and Yi have infinite fourth moments.
d. (X1i, X2i, ..., Xki, Yi), i = 1, ..., n are not i.i.d. draws from their joint distribution.
Answer: a
18)

All of the following are true, with the exception of one condition:
a. a high R² or adjusted R² does not mean that the regressors are a true cause of the
dependent variable.
b. a high R² or adjusted R² does not mean that there is no omitted variable bias.
c. a high R² or adjusted R² always means that an added variable is statistically significant.
d. a high R² or adjusted R² does not necessarily mean that you have the most appropriate set
of regressors.
Answer: c
19)

Including an interaction term between two independent variables, X1 and X2, allows for
the following, except that:
a. the interaction term lets the effect on Y of a change in X1 depend on the value of X2.
b. the interaction term coefficient is the effect of a unit increase in X1 and X2 above
and beyond the sum of the individual effects of a unit increase in the two
variables alone.
c. the interaction term coefficient is the effect of a unit increase in (X1 × X2).
d. the interaction term lets the effect on Y of a change in X2 depend on the value of X1.
Answer: c

20)

In the model Yi = β0 + β1X1 + β2X2 + β3(X1 × X2) + ui, the expected effect ΔY/ΔX1 is
a. β1 + β3X2.
b. β1.
c. β1 + β3.
d. β1 + β3X1.

Answer: a
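The ΔY/ΔX1 = β1 + β3X2 result from question 20 can be verified numerically. The coefficient values below are purely illustrative, not taken from the question:

```python
# Sketch: in Y = b0 + b1*X1 + b2*X2 + b3*(X1*X2), the effect on Y of a
# unit change in X1 (holding X2 fixed) equals b1 + b3*X2.
# Coefficient values are illustrative assumptions.
b0, b1, b2, b3 = 1.0, 2.0, 0.5, 0.3

def Y(x1, x2):
    return b0 + b1 * x1 + b2 * x2 + b3 * x1 * x2

x2 = 4.0
effect = Y(6.0, x2) - Y(5.0, x2)  # unit increase in X1 at X2 = 4
print(effect, b1 + b3 * x2)       # both approximately 3.2 = 2.0 + 0.3 * 4
```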

1)

You have obtained data on test scores and student-teacher ratios in region A and region B
of your state. Region B, on average, has lower student-teacher ratios than region A. You
decide to run the following regression

Yi = β0 + β1X1i + β2X2i + β3X3i + ui
where X1 is the class size in region A, X2 is the difference in class size between region A
and B, and X3 is the class size in region B. Your regression package shows a message
indicating that it cannot estimate the above equation. What is the problem here and how
can it be fixed?
Answer: There is perfect multicollinearity present since one of the three explanatory
variables can always be expressed linearly in terms of the other two. Hence
there are not really three pieces of independent information contained in the
three explanatory variables. Dropping one of the three will solve the problem.
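The error message can be reproduced in a few lines: since X2 is the difference between the class sizes in regions A and B, X3 = X1 − X2 exactly, so the regressor matrix is rank-deficient. A minimal sketch with simulated class-size data (all numbers illustrative):

```python
import numpy as np

# Simulated data; X3 is an exact linear combination of X1 and X2, which is
# the perfect multicollinearity the regression package complains about.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(20, 2, size=n)  # class size in region A (illustrative)
x2 = rng.normal(0, 1, size=n)   # difference in class size, A minus B
x3 = x1 - x2                    # class size in region B: exact combination

X = np.column_stack([np.ones(n), x1, x2, x3])
print(np.linalg.matrix_rank(X))  # 3, not 4: (X'X) cannot be inverted
```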
2) Consider the following multiple linear regression model:
Yi = β0 + β1X1i + β2X2i + ui
Suppose you have data on the relevant variables but your econometrics software is only
capable of running simple linear regressions (a dependent variable on one independent variable)
and not multiple linear regressions (a dependent variable on multiple independent variables).
Explain the sequence of regressions that could be run to estimate β1.
Answer: 1) regress X1 on X2 and a constant, capture residuals R1
2) regress Y on X2 and a constant, capture residuals R2
3) regress R2 on R1 and a constant; the slope coefficient will be the unbiased
OLS estimator for β1
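The three-step procedure (the Frisch-Waugh result) can be sketched with simulated data; the data-generating values below are illustrative, with true β1 = 2:

```python
import numpy as np

# Simulated data: X1 is correlated with X2, and the true coefficient on X1 is 2.
rng = np.random.default_rng(1)
n = 200
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(size=n)

def slope_and_resid(dep, reg):
    """Simple regression of dep on reg and a constant; return slope, residuals."""
    X = np.column_stack([np.ones(len(reg)), reg])
    b, *_ = np.linalg.lstsq(X, dep, rcond=None)
    return b[1], dep - X @ b

_, r1 = slope_and_resid(x1, x2)         # step 1: residuals of X1 on X2
_, r2 = slope_and_resid(y, x2)          # step 2: residuals of Y on X2
beta1_fwl, _ = slope_and_resid(r2, r1)  # step 3: slope is the beta1 estimate

# The same coefficient comes out of the full multiple regression:
Xfull = np.column_stack([np.ones(n), x1, x2])
beta_full, *_ = np.linalg.lstsq(Xfull, y, rcond=None)
print(np.allclose(beta1_fwl, beta_full[1]))  # True
```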
3) Set up the null hypothesis and alternative hypothesis carefully for the following cases:
(a)

k = 4, test for all coefficients other than the intercept to be zero


Answer: H0: β1 = 0, β2 = 0, β3 = 0, β4 = 0
H1: at least one βj ≠ 0

(b)

k = 3, test for the slope coefficient of X 1 to be unity, and the coefficients on the other
explanatory variables to be zero
Answer: H0: β1 = 1, β2 = 0, β3 = 0
H1: at least one restriction does not hold

(c)

k = 10, test for the slope coefficient of X 1 to be zero, and for the slope coefficients of X 2
and X 3 to be the same but of opposite sign.
Answer: H0: β1 = 0, β2 + β3 = 0
H1: at least one restriction does not hold

(d)

k = 4, test for the slope coefficients to add up to unity


Answer: H0: β1 + β2 + β3 + β4 = 1
H1: β1 + β2 + β3 + β4 ≠ 1

4)

Suggest a transformation in the variables that will linearize the deterministic part of the
population regression functions below. Write the resulting regression function in a form
that can be estimated by using OLS.

(a)

Yi = β0 X1i^β1 X2i^β2

Answer: ln(Yi) = ln(β0) + β1 ln(X1i) + β2 ln(X2i)


(b)

Yi = Xi / (β0 + β1Xi)

Answer: 1/Yi = β0 (1/Xi) + β1
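A quick check of the transformation in part (a): generate data from the deterministic function Y = β0 X1^β1 X2^β2 with illustrative parameter values (β0 = 2, β1 = 0.7, β2 = 0.3, chosen arbitrarily) and recover them by OLS on the logged equation:

```python
import numpy as np

# Deterministic part only, as in the question: no error term.
# Parameter values below are illustrative assumptions.
rng = np.random.default_rng(2)
n = 100
x1 = rng.uniform(1, 10, size=n)
x2 = rng.uniform(1, 10, size=n)
y = 2.0 * x1**0.7 * x2**0.3

# OLS on ln(Y) = ln(b0) + b1*ln(X1) + b2*ln(X2) recovers the parameters.
X = np.column_stack([np.ones(n), np.log(x1), np.log(x2)])
b, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
print(np.exp(b[0]), b[1], b[2])  # approximately 2.0, 0.7, 0.3
```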
