MSE-improvement of the least squares estimator by dropping variables

Article ID: iaor19942039
Country: Germany
Volume: 40
Start Page Number: 263
End Page Number: 269
Publication Date: Oct 1993
Journal: Metrika
Authors:
Abstract:

It is well known that dropping variables in regression analysis decreases the variance of the least squares (LS) estimator of the remaining parameters. However, after elimination the estimates of these parameters are biased if the full model is correct. In a recent paper, Boscher showed that the LS estimator in the special case of a mean-shift model (cf. Cook and Weisberg) under the assumption of no ‘outliers’ can be considered within the framework of a linear regression model in which some variables are deleted. He derived conditions under which this estimator outperforms the LS estimator of the full model in terms of the mean squared error (MSE) matrix criterion. The authors demonstrate that this approach can be extended to the general setting of dropping variables. Necessary and sufficient conditions for the MSE-matrix superiority of the LS estimator in the reduced model over that in the full model are derived. The authors also provide a uniformly most powerful F-statistic for testing the MSE-improvement.
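The variance–bias trade-off described above can be illustrated with a small Monte Carlo sketch (this simulation is a hypothetical illustration, not taken from the paper; the design with a nearly collinear regressor and a small true coefficient is an assumption chosen so that dropping the variable helps). It compares the scalar MSE of the full-model and reduced-model LS estimators of the first coefficient:

```python
# Hypothetical illustration: dropping a nearly collinear regressor with a
# small true coefficient lowers the MSE of the remaining LS estimate,
# despite the resulting omitted-variable bias.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 2000
beta = np.array([1.0, 0.05])                 # true coefficients; beta_2 is small
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=n)     # x2 nearly collinear with x1
X_full = np.column_stack([x1, x2])           # full design matrix
X_red = x1[:, None]                          # reduced design: x2 dropped

est_full, est_red = [], []
for _ in range(reps):
    y = X_full @ beta + rng.normal(size=n)   # fresh noise each replication
    est_full.append(np.linalg.lstsq(X_full, y, rcond=None)[0][0])
    est_red.append(np.linalg.lstsq(X_red, y, rcond=None)[0][0])

def mse(estimates):
    # empirical MSE of the estimates of beta_1 = squared bias + variance
    estimates = np.asarray(estimates)
    return np.mean((estimates - beta[0]) ** 2)

print(f"MSE full model:    {mse(est_full):.4f}")
print(f"MSE reduced model: {mse(est_red):.4f}")
```

Under this design the reduced-model estimator has a bias of roughly beta_2 times the regression slope of x2 on x1, but its variance is far smaller than the collinearity-inflated variance in the full model, so its MSE is lower; the paper's conditions characterize exactly when such an improvement holds in the matrix sense.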
