Autocorrelation
Autocorrelation (sometimes called serial correlation) occurs when one of the Gauss-Markov assumptions fails and the error terms are correlated across observations, i.e. $E(u_t u_s) \neq 0$ for $t \neq s$.
This can be due to a variety of problems, but the main cause is when an important variable has been omitted from the regression. In the presence of autocorrelation the OLS estimator is no longer BLUE: it remains unbiased, but it is no longer best (minimum variance). The estimated standard errors are then biased, so the t-statistics and other tests are no longer valid.
Testing for Autocorrelation
To test for first-order autocorrelation, we use the Durbin-Watson (DW) d statistic. Given the following first-order process for the errors:

$$u_t = \rho u_{t-1} + \varepsilon_t$$
The d statistic is approximately $d = 2(1 - \hat{\rho})$, where $\hat{\rho}$ lies between $-1$ and $+1$, so the statistic lies between 0 and 4: for example, $\hat{\rho} = 0$ gives $d \approx 2$ (no autocorrelation), while $\hat{\rho} \to +1$ gives $d \to 0$ and $\hat{\rho} \to -1$ gives $d \to 4$. Having calculated the DW statistic, you need to go to the tables to find the critical values, which take the form of a lower statistic $d_L$ and an upper statistic $d_U$. To determine whether there is autocorrelation or not, you need to put the values into the following framework:
If $d < d_L$, positive first-order autocorrelation is present; if $d > 4 - d_L$, negative first-order autocorrelation is present. If $d$ lies between $d_U$ and $4 - d_U$, there is no first-order autocorrelation. If $d$ falls between $d_L$ and $d_U$, or between $4 - d_U$ and $4 - d_L$, the test is inconclusive.
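To make the calculation concrete, here is a minimal numpy sketch that simulates a regression with AR(1) errors and computes the d statistic from the OLS residuals; the data-generating values (such as $\rho = 0.7$) are illustrative only, not taken from the text:

```python
import numpy as np

def durbin_watson(residuals):
    # DW d statistic: sum of squared first differences of the
    # residuals divided by their sum of squares.
    e = np.asarray(residuals)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)

# Illustrative data: AR(1) errors with rho = 0.7 (positive autocorrelation).
rng = np.random.default_rng(0)
T = 100
x = rng.normal(size=T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.7 * u[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + u

# OLS of y on a constant and x, then d from the residuals.
X = np.column_stack([np.ones(T), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(durbin_watson(y - X @ beta))  # well below 2, as expected
```

For applied work, statsmodels provides the same calculation as `durbin_watson` in `statsmodels.stats.stattools`.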
Detection of higher order autocorrelation: The Lagrange Multiplier Test
The Lagrange Multiplier test is used for detecting autocorrelation of a more general form, such as 2nd- or 4th-order autocorrelation, and the test is executed as follows:
i) First decide on the order of autocorrelation that you want to test, say 2;
ii) Run the usual OLS regression of y against the explanatory variable x,

$$y_t = \beta_1 + \beta_2 x_t + u_t \qquad (2)$$

and save the residuals, $\hat{u}_t$;
iii) Run a regression using the residuals from step ii) as the dependent variable, against the explanatory variable $x_t$ (as in ii)) and also lagged values of $\hat{u}$, with as many lags as the order of autocorrelation being tested (in this case 2 lags):

$$\hat{u}_t = \alpha_1 + \alpha_2 x_t + \rho_1 \hat{u}_{t-1} + \rho_2 \hat{u}_{t-2} + \varepsilon_t \qquad (3)$$
iv) Calculate $TR^2$ for this regression (the total number of observations multiplied by the $R^2$ value). Under the null hypothesis of no autocorrelation, this statistic has a $\chi^2$ (chi-squared) distribution with s degrees of freedom, where s is the number of lags on the error term (in this case 2, which has a 5% critical value of 5.99).

There are two important points regarding the Lagrange Multiplier test: firstly, it is a large-sample test, so caution is needed in interpreting results from a small sample; and secondly, it detects not only autoregressive autocorrelation but also moving-average autocorrelation. Again, caution is needed in interpreting the results.
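Steps ii) to iv) can be coded directly, as in the sketch below. It assumes a single regressor and pads the pre-sample residuals with zeros (one common convention); the function name is ours rather than from any library:

```python
import numpy as np
from scipy import stats

def lm_autocorrelation_test(y, x, lags=2):
    T = len(y)
    X = np.column_stack([np.ones(T), x])
    # Step ii): OLS of y on a constant and x; save the residuals.
    u = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    # Step iii): regress u_t on x_t and u_{t-1}, ..., u_{t-lags};
    # pre-sample residuals are set to zero so all T observations are kept.
    lagged = np.column_stack([np.r_[np.zeros(k), u[:-k]] for k in range(1, lags + 1)])
    Z = np.column_stack([np.ones(T), x, lagged])
    fitted = Z @ np.linalg.lstsq(Z, u, rcond=None)[0]
    r2 = 1.0 - np.sum((u - fitted) ** 2) / np.sum((u - u.mean()) ** 2)
    # Step iv): TR^2 is chi-squared with `lags` degrees of freedom under H0.
    lm = T * r2
    return lm, stats.chi2.sf(lm, df=lags)
```

This is the Breusch-Godfrey form of the LM test; statsmodels packages it as `acorr_breusch_godfrey` in `statsmodels.stats.diagnostic`.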
Cochrane-Orcutt and Unrestricted models for remedying autocorrelation
Given the following model, suffering from first-order autocorrelation,

$$y_t = \beta_1 + \beta_2 x_t + u_t \qquad (4a)$$

$$u_t = \rho u_{t-1} + \varepsilon_t \qquad (4b)$$

lag equation (4a) one period and multiply by $\rho$:

$$\rho y_{t-1} = \rho\beta_1 + \rho\beta_2 x_{t-1} + \rho u_{t-1} \qquad (4c)$$

Taking (4c) from (4a) gives

$$y_t - \rho y_{t-1} = \beta_1(1 - \rho) + \beta_2(x_t - \rho x_{t-1}) + \varepsilon_t \qquad (4d)$$

This last equation (4d) is the generalised difference equation, used in the Cochrane-Orcutt approach to eliminating the autocorrelation. Cochrane and Orcutt then recommend the following steps to estimate $\rho$:
1. Estimate the two-variable model with the original untransformed data by the standard OLS routine and obtain the residuals, $\hat{u}_t$.
2. Using the estimated residuals, regress $\hat{u}_t$ against $\hat{u}_{t-1}$, the coefficient of $\hat{u}_{t-1}$ being an estimate of $\rho$.
3. Using the $\hat{\rho}$ obtained in step 2, transform the data and run the generalised difference equation, namely equation (4d) above:

$$y_t^* = \beta_1^* + \beta_2 x_t^* + \varepsilon_t \qquad (4e)$$

where $y_t^* = y_t - \hat{\rho} y_{t-1}$, $x_t^* = x_t - \hat{\rho} x_{t-1}$ and $\beta_1^* = \beta_1(1 - \hat{\rho})$.
4. Collect the residuals ($\hat{u}_t^*$) from the new generalised difference equation and repeat steps 1 to 3 until there is no further change in the parameter estimates.
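A minimal sketch of the iterative procedure, assuming a single regressor; in step 2 the regression of $\hat{u}_t$ on $\hat{u}_{t-1}$ (without an intercept) reduces to the ratio shown, and the first observation is dropped when forming the generalised differences:

```python
import numpy as np

def cochrane_orcutt(y, x, tol=1e-6, max_iter=50):
    T = len(y)
    X = np.column_stack([np.ones(T), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # step 1: OLS on raw data
    for _ in range(max_iter):
        u = y - X @ beta                               # residuals
        rho = (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])     # step 2: slope of u_t on u_{t-1}
        y_star = y[1:] - rho * y[:-1]                  # step 3: generalised differences
        x_star = x[1:] - rho * x[:-1]
        Xs = np.column_stack([np.ones(T - 1), x_star])
        b_star = np.linalg.lstsq(Xs, y_star, rcond=None)[0]
        # Recover the original intercept from beta1* = beta1(1 - rho).
        new_beta = np.array([b_star[0] / (1.0 - rho), b_star[1]])
        if np.max(np.abs(new_beta - beta)) < tol:      # step 4: stop when stable
            break
        beta = new_beta
    return new_beta, rho
```

In practice $1 - \hat{\rho}$ can be close to zero, in which case recovering the original intercept this way is unstable; the slope estimate $\beta_2$ is unaffected.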
Autocorrelation may be due to the omission of an important explanatory variable, such as a particular lag structure. The Cochrane-Orcutt Procedure is a way of introducing a specialised lag structure. The generalised difference equation used in the Cochrane-Orcutt (C-O) Procedure (step 3) can be rewritten as:

$$y_t = \beta_1(1 - \rho) + \rho y_{t-1} + \beta_2 x_t - \rho\beta_2 x_{t-1} + \varepsilon_t \qquad (5a)$$
This equation effectively imposes the restriction that the coefficient of $x_{t-1}$ is equal to minus the product of the coefficients of the other two explanatory variables, $y_{t-1}$ and $x_t$. This restriction may not be true, in which case the following more general equation may be appropriate:
$$y_t = \alpha_1 + \alpha_2 y_{t-1} + \alpha_3 x_t + \alpha_4 x_{t-1} + \varepsilon_t \qquad (5b)$$
This is an unrestricted version of the C-O difference equation above. To test whether the restriction is valid:
1) Run equation (5a), using the C-O procedure, and collect the Residual Sum of Squares ($RSS_{Res}$) (restricted equation);
2) Run equation (5b), using OLS, and collect $RSS_{Unres}$ (unrestricted equation).
We want to test the null hypothesis $H_0: \alpha_4 = -\alpha_2 \alpha_3$, i.e. that the restriction imposed by the C-O procedure holds.
The test statistic for the common factor test (Hendry-Mizon test) is:

$$T \ln\!\left(\frac{RSS_{Res}}{RSS_{Unres}}\right) \sim \chi^2_{(r)}$$

where T is the number of observations and the Residual Sums of Squares come from the restricted and unrestricted regressions. The test statistic follows a chi-squared distribution with degrees of freedom r equal to the number of restrictions (1 in this case). If the restriction is not rejected, we use the C-O procedure; if it is rejected, fit (5b).
N.B. The RSS, or Residual Sum of Squares, is simply the sum of the squared residuals.
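A short sketch of the statistic above; the function name and arguments are illustrative, and the two RSS values would come from fitting (5a) by the C-O procedure and (5b) by OLS:

```python
import numpy as np
from scipy import stats

def common_factor_test(T, rss_restricted, rss_unrestricted, n_restrictions=1):
    # Hendry-Mizon common factor test: T * ln(RSS_res / RSS_unres)
    # is chi-squared with `n_restrictions` degrees of freedom under H0.
    stat = T * np.log(rss_restricted / rss_unrestricted)
    return stat, stats.chi2.sf(stat, df=n_restrictions)

# Example with made-up numbers: T = 100, RSS_res = 52.3, RSS_unres = 50.1.
stat, p = common_factor_test(100, 52.3, 50.1)
print(stat, p)  # reject H0 if stat exceeds the chi-squared critical value
```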