Regularisation Parameter Selection Via Bootstrapping

Article ID: iaor20163406
Volume: 58
Issue: 3
Start Page Number: 335
End Page Number: 356
Publication Date: Sep 2016
Journal: Australian & New Zealand Journal of Statistics
Authors: , ,
Keywords: statistics: regression, simulation: applications
Abstract:

Penalised likelihood methods, such as the least absolute shrinkage and selection operator (Lasso) and the smoothly clipped absolute deviation penalty, have become widely used for variable selection in recent years. These methods impose penalties on regression coefficients to shrink a subset of them towards zero, achieving parameter estimation and model selection simultaneously. The amount of shrinkage is controlled by the regularisation parameter. Popular approaches for choosing the regularisation parameter include cross‐validation, various information criteria and bootstrapping methods based on mean square error. In this paper, a new data‐driven method for choosing the regularisation parameter is proposed and its consistency is established. This consistency holds not only in the usual fixed‐dimensional case but also in the setting where the number of parameters diverges. Simulation results show that the new method outperforms other popular approaches. An application of the proposed method to motif discovery in gene expression analysis is also presented.
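
The abstract cites bootstrap selection based on mean square error as one popular baseline. As a concrete illustration only, the sketch below selects the Lasso regularisation parameter by minimising out-of-bag bootstrap prediction error over a candidate grid. It is not the paper's proposed criterion (the abstract does not specify it); the synthetic data, grid, and use of scikit-learn's Lasso are illustrative assumptions.

```python
# Illustrative sketch: bootstrap (out-of-bag MSE) selection of the Lasso
# regularisation parameter. NOT the paper's proposed method; all settings
# below are assumptions made for the example.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic data with a sparse true coefficient vector (illustrative only).
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + rng.standard_normal(n)

lambdas = np.logspace(-3, 0, 30)   # candidate regularisation parameters
n_boot = 200                       # number of bootstrap resamples

boot_mse = np.zeros(len(lambdas))
for b in range(n_boot):
    # Resample rows with replacement; out-of-bag rows serve as the test set.
    idx = rng.integers(0, n, size=n)
    oob = np.setdiff1d(np.arange(n), idx)
    if oob.size == 0:
        continue
    for j, lam in enumerate(lambdas):
        model = Lasso(alpha=lam, max_iter=10_000).fit(X[idx], y[idx])
        resid = y[oob] - model.predict(X[oob])
        boot_mse[j] += np.mean(resid ** 2) / n_boot

best_lambda = lambdas[np.argmin(boot_mse)]
print(f"Selected regularisation parameter: {best_lambda:.4f}")
```

In practice the grid, number of resamples, and error measure would be tuned to the problem; the paper's contribution is a different, consistency-backed criterion for this choice.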
