DIAGNOSING MODEL PROBLEMS

Multicollinearity - a problem in which two independent variables in a model are highly related. Multicollinearity is a sample phenomenon, not an estimation problem.
NOTE: If all of the independent variables are checked, and the t-tests are significant, then multicollinearity should not be a problem.

PROBLEMS caused by Multicollinearity:
1) The estimated coefficient (β) may no longer be an unbiased estimator.
2) Because of the strong interrelationships between the independent variables, it is difficult to disentangle their separate effects on the dependent variable without violating the ceteris paribus assumption.
3) Significant variables may appear to be insignificant (i.e., a Type II error).

CHECKING for Multicollinearity:
1) High R² with few or no significant t-ratios.
2) Variance Inflation Factor (VIF). If VIF > 10, then suspect a problem. VIF = 1 is IDEAL!
3A) and 3B) should be performed jointly.
3A) Analysis of Structure (SAS procedure).
3B) Condition Index (CI). If CI > 30, then suspect a problem.
4) Variance Proportions. If two or more variance proportions > 0.5 in a single row, then suspect a problem.
5) High pairwise correlations among independent variables.

POSSIBLE SOLUTIONS to Multicollinearity:
1) Dropping the redundant independent variable; however, be careful not to omit a relevant variable. <See Autocorrelation.>
2) Combining highly correlated variables (NOTE: use of interaction variables).
3) Using ratios or a first difference; however, this may introduce a heteroscedasticity problem.
4) Using ridge regression.
5) Using principal components analysis.
6) Getting more data.
7) Do nothing! Live with the lesser of two evils. <See the Autocorrelation definition for a warning.>

Heteroscedasticity - a problem in which the variance of the residuals along a trend line is not constant.
NOTE: the coefficient (β) remains an unbiased estimator nonetheless.

PROBLEMS caused by Heteroscedasticity:
1) The coefficient (β) is not "efficient" or the "best" estimator.
2) The F-test is no longer reliable.
3) The t-tests are no longer reliable.

CHECKING for Heteroscedasticity:
1) Check the scatterplot of residuals for fan-shaped patterns.
2) Goldfeld-Quandt test.
3) Breusch-Pagan test.
4) White test.

POSSIBLE SOLUTIONS to Heteroscedasticity:
1) Generalized Least Squares.
2) Weighted Least Squares.
3) Use weighting via generalized differencing if the true variance is known.
4) Perform a transformation of the model from linear to log-linear form.

Autocorrelation - a problem in which the residuals are not independent. This problem is often seen in time-series data. It may be the result of an omitted relevant variable or an incorrect functional form.
NOTE: autocorrelation may be positive or negative.
NOTE: the coefficient (β) remains an unbiased estimator nonetheless.

PROBLEMS caused by Autocorrelation:
1) The variance of the coefficient (β) may be understated.
2) The F-test is no longer reliable.
3) The t-tests are no longer reliable.

CHECKING for Autocorrelation:
1) Check the scatterplot of residuals for signs of patterns (positive or negative autocorrelation).
2) Durbin-Watson statistic.

POSSIBLE SOLUTIONS to Autocorrelation:
1) Generalized differencing (if ρ is known).
2) The Cochrane-Orcutt procedure.
3) The Hildreth-Lu procedure.
4) Durbin's h test - USE only when there is a lagged dependent variable.

© Roger E. Wehr 1999
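The VIF rule of thumb for multicollinearity (suspect a problem if VIF > 10) can be sketched as follows; this is a minimal numpy illustration, and the `vif` function and the simulated data are assumptions of this sketch, not part of the sheet:

```python
import numpy as np

def vif(X):
    """Variance Inflation Factors for the columns of predictor matrix X.

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
    on the remaining predictors (plus an intercept).
    """
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(others, y, rcond=None)
        resid = y - others @ beta
        r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Two nearly collinear predictors plus one independent predictor:
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.01 * rng.normal(size=200)   # almost a copy of x1
x3 = rng.normal(size=200)
print(vif(np.column_stack([x1, x2, x3])))  # VIFs for x1, x2 far above 10
```

Note how the independent predictor's VIF stays near the ideal value of 1 while the two collinear columns blow well past the threshold.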

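The Durbin-Watson statistic listed under CHECKING for Autocorrelation has a simple closed form; this numpy sketch (function name and simulated series are illustrative assumptions) shows it separating independent from positively autocorrelated residuals:

```python
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson statistic: d = sum((e_t - e_{t-1})^2) / sum(e_t^2).

    d is near 2 for uncorrelated residuals, near 0 under strong positive
    autocorrelation, and near 4 under strong negative autocorrelation.
    """
    e = np.asarray(resid, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

rng = np.random.default_rng(1)
white = rng.normal(size=500)      # independent residuals
ar1 = np.zeros(500)               # strongly positively autocorrelated
for t in range(1, 500):
    ar1[t] = 0.9 * ar1[t - 1] + rng.normal()
print(durbin_watson(white))  # close to 2
print(durbin_watson(ar1))    # well below 2
```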
Das könnte Ihnen auch gefallen
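Weighted Least Squares, listed among the solutions to heteroscedasticity, amounts to OLS on data rescaled by the square root of the weights; this is a hedged sketch under the assumption that the error variance grows with x (the `wls` helper and the simulated data are illustrative, not from the sheet):

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: minimize sum_i w_i * (y_i - x_i @ beta)^2.

    Equivalent to OLS on sqrt(w)-scaled data; with w_i = 1 / Var(e_i)
    this is the classic remedy for heteroscedasticity.
    """
    sw = np.sqrt(np.asarray(w, dtype=float))
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

# Error standard deviation grows with x, so weight each point by 1/x^2:
rng = np.random.default_rng(2)
x = np.linspace(1.0, 10.0, 300)
y = 3.0 + 2.0 * x + x * rng.normal(size=300)   # sd of error = x
X = np.column_stack([np.ones_like(x), x])
print(wls(X, y, 1.0 / x**2))  # estimates roughly [3, 2]
```

Weighting by the inverse error variance downweights the noisy high-x observations, which is why WLS is "efficient" where plain OLS under heteroscedasticity is not.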