Monday 28 November 2011

Heteroskedasticity

Heteroskedasticity occurs when the variance of the errors differs across observations, i.e. the error variances are not constant.

Heteroskedasticity can be of two types:
1. Unconditional Heteroskedasticity: When the error variance does not systematically increase or decrease with changes in the value of the independent variable. It is a violation of assumption 4 but does not pose any serious problems for the regression.
2. Conditional Heteroskedasticity: It exists when the error variance changes with the value of the independent variable; it is the more problematic type (a short simulation sketch follows).
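
A minimal simulation sketch of conditional heteroskedasticity (the data-generating process, sample size, and coefficients below are assumptions chosen purely for illustration): the standard deviation of the error is made proportional to x, so the spread of the errors grows with the independent variable.

import numpy as np

rng = np.random.default_rng(42)
n = 500
x = rng.uniform(1, 10, size=n)

# Error standard deviation is proportional to x, so Var(error | x) is not constant.
errors = rng.normal(0, 0.5 * x)
y = 2.0 + 3.0 * x + errors

# The sample variance of the errors is much larger where x is large.
print("Var(error) for small x:", errors[x < 4].var())
print("Var(error) for large x:", errors[x > 7].var())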

Consequences of (conditional) Heteroskedasticity:
  • It does not affect consistency but it can lead to wrong inferences.
  • Coefficient estimates are not affected.
  • It causes the F-test for the overall significance to be unreliable.
  • It introduces bias into estimators of the standard error of regression coefficients; thus t-tests for the significance of individual regression coefficients are unreliable.
    • When heteroskedasticity results in underestimated standard errors, t-statistics are inflated and the probability of a Type I error increases.
    • When heteroskedasticity results in overestimated standard errors, t-statistics are deflated and the probability of a Type II error increases.
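
To see the mechanics of this, here is a small Monte Carlo sketch (the data-generating process is an assumption for illustration): when the error variance grows with x, the conventional OLS standard error of the slope tends to understate the true sampling variability of the slope estimate, which is what inflates the t-statistics.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, n_sims = 200, 2000
slopes, reported_ses = [], []

for _ in range(n_sims):
    x = rng.uniform(1, 10, size=n)
    e = rng.normal(0, 0.5 * x)          # error variance increases with x
    y = 1.0 + 2.0 * x + e
    res = sm.OLS(y, sm.add_constant(x)).fit()
    slopes.append(res.params[1])        # slope estimate
    reported_ses.append(res.bse[1])     # conventional (non-robust) standard error

# Compare the true sampling variability with the average reported standard error.
print("Empirical std of slope estimates:", np.std(slopes))
print("Average conventional OLS SE:     ", np.mean(reported_ses))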
Testing for Heteroskedasticity:
Heteroskedasticity can be detected by plotting the residuals against the independent variable (or the fitted values) and looking for a systematic pattern in their spread. A more formal test is the Breusch-Pagan test, which involves regressing the squared residuals from the estimated regression equation on the independent variables in the regression.
  • Null Hypothesis = No conditional Heteroskedasticity exists.
  • Alternative Hypothesis = Conditional Heteroskedasticity exists.
Test statistic = n × R², where R² is the coefficient of determination from the auxiliary regression of the squared residuals on the independent variables.

The critical value is taken from the chi-square distribution with degrees of freedom equal to the number of independent variables (k).

If the test statistic exceeds the critical value, reject the null hypothesis and conclude that conditional heteroskedasticity exists in the regression model.
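
A sketch of the test in Python (the simulated data are an assumption for illustration; with a single independent variable, k = 1):

import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 300
x = rng.uniform(1, 10, size=n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * x)   # conditionally heteroskedastic errors

X = sm.add_constant(x)
ols_res = sm.OLS(y, X).fit()

# Step 1: regress the squared residuals on the same independent variables.
aux_res = sm.OLS(ols_res.resid ** 2, X).fit()

# Step 2: test statistic = n * R-squared from the auxiliary regression,
# compared with a chi-square critical value with k = 1 degree of freedom.
bp_stat = n * aux_res.rsquared
critical_value = stats.chi2.ppf(0.95, df=1)
print("BP statistic:", bp_stat, "critical value:", critical_value)
print("Reject H0 (no conditional heteroskedasticity)?", bp_stat > critical_value)

statsmodels also ships a ready-made version of this test, statsmodels.stats.diagnostic.het_breuschpagan, which returns the LM statistic and its p-value directly.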

Correcting for Heteroskedasticity: 
Two different methods can be used to correct for heteroskedasticity.

1. Computing robust standard errors corrects the standard errors of the linear regression model's estimated coefficients to deal with conditional heteroskedasticity.
2. The generalized least squares (GLS) method modifies the original regression equation in order to eliminate the heteroskedasticity. (A sketch of both corrections follows.)
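
A minimal sketch of both corrections in Python (the simulated data and the assumed variance structure Var(error | x) proportional to x² are illustrations, not part of the original notes):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
x = rng.uniform(1, 10, size=n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * x)
X = sm.add_constant(x)

# 1. Robust standard errors: same coefficient estimates, corrected standard errors.
robust_res = sm.OLS(y, X).fit(cov_type="HC1")
print("Robust SEs:", robust_res.bse)

# 2. GLS via weighted least squares: weight each observation by 1 / Var(error | x),
#    here assumed proportional to x**2, to restore a constant error variance.
wls_res = sm.WLS(y, X, weights=1.0 / x**2).fit()
print("WLS SEs:  ", wls_res.bse)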
