Inference for Linear Regression with Autocorrelated Errors: Why the Cochrane-Orcutt Procedure Should Not Be Recommended
Positive autocorrelation can inflate the type I error rate of tests for significance of the linear regression slope parameter in time series data. The ordinary least squares (OLS) regression parameter estimators are not best linear unbiased estimators in the presence of autocorrelation. In practice both the correlation and regression parameters must be estimated, and this can yield estimators with larger mean squared error (MSE) than the OLS estimator. Popular textbooks recommend the Cochrane-Orcutt (CO) procedure as a better method for estimating the slope coefficient in these settings. For smaller samples, we observe that tests of significance for the slope parameter based on CO estimates have unacceptably high type I error probabilities. In a model with a linear trend and first-order autoregressive errors, the OLS slope estimator has competitive MSE relative to other procedures. Using bias-corrected estimates of the correlation, we improve the estimate of the standard error of the OLS slope. Equivalent degrees of freedom calculated from these bias-corrected correlation estimators are used to stabilize the type I error rate of tests of significance for the slope.
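The setting described above can be illustrated with a minimal simulation sketch (not the authors' code; all function names, the sample size, and the AR(1) parameter are illustrative assumptions): generate a linear trend with first-order autoregressive errors, fit OLS, estimate the lag-1 autocorrelation from the OLS residuals, and apply one Cochrane-Orcutt quasi-differencing step.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1_regression(n=30, rho=0.5, beta0=1.0, beta1=0.0, sigma=1.0):
    """Simulate y_t = beta0 + beta1*t + e_t with AR(1) errors e_t = rho*e_{t-1} + u_t."""
    u = rng.normal(0.0, sigma, n)
    e = np.empty(n)
    e[0] = u[0] / np.sqrt(1.0 - rho**2)   # draw a stationary starting value
    for t in range(1, n):
        e[t] = rho * e[t - 1] + u[t]
    x = np.arange(n, dtype=float)          # linear trend regressor
    return x, beta0 + beta1 * x + e

def ols_slope(x, y):
    """OLS slope estimate for simple linear regression."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / (xc @ xc)

def cochrane_orcutt_slope(x, y):
    """One Cochrane-Orcutt iteration: estimate rho from OLS
    residuals, then refit OLS on quasi-differenced data."""
    b1 = ols_slope(x, y)
    b0 = y.mean() - b1 * x.mean()
    r = y - b0 - b1 * x                                # OLS residuals
    rho_hat = (r[:-1] @ r[1:]) / (r[:-1] @ r[:-1])     # lag-1 autocorrelation
    ys = y[1:] - rho_hat * y[:-1]                      # quasi-differenced response
    xs = x[1:] - rho_hat * x[:-1]                      # quasi-differenced regressor
    return ols_slope(xs, ys), rho_hat

x, y = simulate_ar1_regression()
b_co, rho_hat = cochrane_orcutt_slope(x, y)
```

Repeating this under the null (beta1 = 0) and comparing the rejection rates of t-tests based on the OLS and CO slope estimates is the kind of comparison the abstract reports; in small samples, rho_hat is biased downward, which motivates the bias-corrected correlation estimates mentioned above.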
Joint Statistical Meetings (JSM)
Mathematical Sciences Faculty Presentations.