Regression assumptions
Posted: Tue Mar 05, 2013 7:42 am
Hello everybody,
Berry (1993) states a number of assumptions to be satisfied in regression analysis (listed below). My concern is assumption 8: that the residuals are normally distributed. In many cases they are not.
My questions are:
A) What is the remedy for non-normally distributed residuals in EViews?
B) If a remedy is not necessary, what is the theoretical justification?
Regression assumptions:
1. “All independent variables (X1, X2… Xk) are quantitative or dichotomous and the dependent variable, Y, is quantitative, continuous, and unbounded. Moreover, all variables are measured without error.”
2. “All independent variables have nonzero variance (i.e., each independent variable has some variation in value).”
3. “There is not perfect multicollinearity (i.e., there is no exact linear relationship between two or more of the independent variables).”
4. “At each set of values for the k independent variables, (X1j, X2j... Xkj), E (ɛj|X1j, X2j… Xkj) = 0 (i.e. the mean value of the error term is zero).”
5. “For each Xi, COV(Xij, ɛj) = 0 (i.e., each independent variable is uncorrelated with the error term).”
6. “For each set of values for the k independent variables, (X1j, X2j… Xkj), VAR (ɛj|X1j, X2j… Xkj) = σ², where σ² is a constant (i.e., the conditional variance of the error term is constant); this is known as the assumption of homoscedasticity.”
7. “For any two observations, (X1j, X2j,…, Xkj) and (X1h, X2h,…, Xkh), COV(ɛj, ɛh) = 0 (i.e., error terms for different observations are uncorrelated); this assumption is known as lack of autocorrelation.”
8. “At each set of values for the k independent variables, ɛj is normally distributed.”
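To make assumption 8 concrete: the normality check on residuals amounts to a Jarque-Bera test, the same statistic EViews reports alongside the residual histogram. Below is a minimal Python/NumPy sketch (the simulated data and variable names are my own illustration, not from Berry 1993): fit OLS, take the residuals, and compare their skewness and kurtosis to the normal benchmark.

```python
import numpy as np
from scipy import stats

# Illustrative simulated regression (names and data are hypothetical):
# errors are drawn as normal, so the test should usually not reject.
rng = np.random.default_rng(42)
n = 200
x = rng.normal(size=n)
y = 2.0 + 1.5 * x + rng.normal(size=n)

# OLS by least squares: X holds a constant and the regressor.
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

# Jarque-Bera statistic: JB = n/6 * (S^2 + (K - 3)^2 / 4),
# where S is sample skewness and K is (non-excess) kurtosis.
# Under normal errors JB is asymptotically chi-squared with 2 df.
S = stats.skew(resid)
K = stats.kurtosis(resid, fisher=False)
jb = n / 6.0 * (S**2 + (K - 3.0) ** 2 / 4.0)
p_value = stats.chi2.sf(jb, df=2)

print(f"JB = {jb:.3f}, p-value = {p_value:.3f}")
```

A large JB statistic (small p-value) rejects normality. EViews computes this same statistic in the equation's residual histogram/normality view, so the sketch is only meant to show what that number is measuring.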