Article ID: | iaor19972199 |
Country: | South Korea |
Volume: | 21 |
Issue: | 1 |
Start Page Number: | 163 |
End Page Number: | 170 |
Publication Date: | Apr 1996 |
Journal: | Journal of the Korean ORMS Society |
Authors: | Kim Kil-Soo |
An errors-in-variables model (EVM) differs from the classical regression model in that the independent variable is also subject to error. This paper shows that, to assess the applicability of the ordinary least squares (OLS) estimation procedure to the EVM, the dispersion of the independent variable relative to its error variance must also be considered in addition to Mandel’s criterion. The effect of physically reducing the error variance of the independent variable on the performance of the OLS slope estimator is also discussed.
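The phenomenon the abstract alludes to can be illustrated with a small simulation (not taken from the paper; all parameter values below are hypothetical): when the independent variable is observed with error, the OLS slope is pulled toward zero by the reliability ratio λ = σ_x² / (σ_x² + σ_u²), which is why the dispersion of x relative to its error variance matters.

```python
import numpy as np

# Hypothetical illustration of attenuation bias in an EVM.
# True model: y = beta * x_true + e, but we only observe
# x_obs = x_true + u, where u is measurement error.
rng = np.random.default_rng(0)
n = 200_000
beta = 2.0         # true slope (assumed value)
sigma_x = 3.0      # dispersion of the true independent variable
sigma_u = 1.0      # std dev of measurement error in x

x_true = rng.normal(0.0, sigma_x, n)
x_obs = x_true + rng.normal(0.0, sigma_u, n)   # x observed with error
y = beta * x_true + rng.normal(0.0, 1.0, n)    # regression error in y

# OLS slope of y on the error-contaminated regressor
slope_ols = np.cov(x_obs, y, bias=True)[0, 1] / np.var(x_obs)

# Expected large-sample limit: beta * lambda, with
# lambda = sigma_x^2 / (sigma_x^2 + sigma_u^2)  (here 9/10)
lam = sigma_x**2 / (sigma_x**2 + sigma_u**2)
print(f"OLS slope: {slope_ols:.3f}, attenuated target: {beta * lam:.3f}")
```

The sketch shows why shrinking σ_u (physically reducing measurement error) drives λ toward 1 and restores the OLS slope toward β.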