SCAD-penalized quantile regression for high-dimensional data analysis and variable selection

Article ID: iaor201526420
Volume: 69
Issue: 3
Start Page Number: 212
End Page Number: 235
Publication Date: Aug 2015
Journal: Statistica Neerlandica
Authors: , , ,
Keywords: simulation: applications
Abstract:

Existing penalized quantile variable selection methods are applicable only to a finite number of predictors or lack the oracle property for the resulting estimator. Quantile regression is regarded as an alternative to ordinary least squares when outliers or heavy‐tailed errors are present in linear models. This paper investigates variable selection through quantile regression with a diverging number of parameters. The convergence rate of the estimator with the smoothly clipped absolute deviation (SCAD) penalty is derived, and the oracle property is established under proper selection of the tuning parameter and certain regularity conditions. In addition, a rank correlation screening method is used to accommodate ultra‐high‐dimensional settings. Monte Carlo simulations demonstrate the finite‐sample performance of the proposed estimator, and results on real data show that the approach provides substantially more information than ordinary least squares, conventional quantile regression, and the quantile lasso.
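The two ingredients described in the abstract, rank correlation screening followed by SCAD‐penalized quantile regression, can be illustrated with a minimal sketch. This is not the authors' implementation: the smoothed surrogate objective, the generic L-BFGS-B solver, the SCAD constant a = 3.7, and all function names below are assumptions made for illustration only.

```python
# Illustrative sketch only: rank-correlation screening plus a crude
# SCAD-penalized quantile regression fit. Not the paper's algorithm.
import numpy as np
from scipy.stats import kendalltau
from scipy.optimize import minimize


def check_loss(u, tau):
    """Quantile (check) loss: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))


def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty (Fan and Li, 2001), summed over coordinates."""
    b = np.abs(beta)
    p = np.where(
        b <= lam,
        lam * b,
        np.where(
            b <= a * lam,
            (2 * a * lam * b - b ** 2 - lam ** 2) / (2 * (a - 1)),
            lam ** 2 * (a + 1) / 2,
        ),
    )
    return p.sum()


def screen_by_kendall_tau(X, y, keep):
    """Rank-correlation screening: retain the `keep` predictors with the
    largest absolute Kendall's tau with the response."""
    taus = np.array([abs(kendalltau(X[:, j], y)[0]) for j in range(X.shape[1])])
    return np.argsort(taus)[::-1][:keep]


def fit_scad_qr(X, y, tau=0.5, lam=0.1, eps=1e-4):
    """Crude surrogate: smooth the kink in the check loss with
    sqrt(r^2 + eps) and hand the non-convex objective to a generic
    quasi-Newton solver. Real implementations use specialized algorithms
    (e.g. local linear approximation with a linear-programming QR step)."""
    n, p = X.shape

    def objective(beta):
        r = y - X @ beta
        # smoothed version of check_loss: rho_tau(r) = 0.5*(|r| + (2*tau-1)*r)
        smooth_check = 0.5 * (np.sqrt(r ** 2 + eps) + (2 * tau - 1) * r)
        return smooth_check.mean() + scad_penalty(beta, lam)

    res = minimize(objective, x0=np.zeros(p), method="L-BFGS-B")
    return res.x


# Toy usage on simulated data with a heavy-tailed error term.
rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + rng.standard_t(df=2, size=n)
kept = screen_by_kendall_tau(X, y, keep=30)          # ultra-high-dimensional screening step
beta_hat = fit_scad_qr(X[:, kept], y, tau=0.5, lam=0.2)
```

In the sketch, screening first reduces the ultra‐high‐dimensional predictor set to a manageable subset, after which the penalized quantile objective is minimized on the retained columns; tuning parameters such as `lam` and `keep` are placeholders rather than values recommended by the paper.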
