On Variational Bayes Estimation and Variational Information Criteria for Linear Regression Models

Article ID: iaor201524998
Volume: 56
Issue: 1
Start Page Number: 73
End Page Number: 87
Publication Date: Mar 2014
Journal: Australian & New Zealand Journal of Statistics
Authors: , ,
Keywords: statistics: distributions, statistics: inference
Abstract:

Variational Bayes (VB) estimation is a fast alternative to Markov chain Monte Carlo for performing approximate Bayesian inference. This procedure can be an efficient and effective means of analysing large datasets. However, VB estimation is often criticised, typically on empirical grounds, for being unable to produce valid statistical inferences. In this article we refute this criticism for one of the simplest models where Bayesian inference is not analytically tractable, that is, the Bayesian linear model (for a particular choice of priors). We prove that, under mild regularity conditions, VB-based estimators enjoy some desirable frequentist properties such as consistency and can be used to obtain asymptotically valid standard errors. In addition to these results, we introduce two VB information criteria: the variational Akaike information criterion and the variational Bayesian information criterion. We show that the variational Akaike information criterion is asymptotically equivalent to the frequentist Akaike information criterion and that the variational Bayesian information criterion is first-order equivalent to the Bayesian information criterion in linear regression. These results motivate the potential use of the variational information criteria for more complex models. We support our theoretical results with numerical examples.
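
The abstract does not spell out the prior specification or the VB updates used in the paper. The sketch below shows one standard mean-field coordinate-ascent scheme for the Bayesian linear model, assuming an independent Gaussian prior on the coefficients and an inverse-gamma prior on the noise variance; these priors, the hyperparameter values, and the function name vb_linear_regression are illustrative assumptions, not the authors' exact setup. The square roots of the diagonal of the variational covariance matrix play the role of the VB standard errors discussed in the abstract.

```python
import numpy as np

def vb_linear_regression(X, y, sigma_beta2=100.0, A=0.01, B=0.01,
                         max_iter=200, tol=1e-10):
    """Coordinate-ascent mean-field VB for y = X beta + eps, eps ~ N(0, sigma2 I).

    Illustrative priors (an assumption, not necessarily the paper's choice):
        beta   ~ N(0, sigma_beta2 * I)
        sigma2 ~ Inverse-Gamma(A, B)
    Factorisation: q(beta, sigma2) = q(beta) q(sigma2), with
        q(beta)  = N(mu_q, Sigma_q)
        q(sigma2)= Inverse-Gamma(A_q, B_q).
    """
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y

    A_q = A + 0.5 * n          # shape of q(sigma2); fixed across iterations
    E_inv_sigma2 = 1.0         # current value of E_q[1 / sigma2]
    mu_q = np.zeros(p)

    for _ in range(max_iter):
        mu_old = mu_q

        # Update q(beta): precision combines the likelihood term and the Gaussian prior
        Sigma_q = np.linalg.inv(E_inv_sigma2 * XtX + np.eye(p) / sigma_beta2)
        mu_q = E_inv_sigma2 * Sigma_q @ Xty

        # Update q(sigma2): uses the expected residual sum of squares under q(beta)
        resid = y - X @ mu_q
        B_q = B + 0.5 * (resid @ resid + np.trace(XtX @ Sigma_q))
        E_inv_sigma2 = A_q / B_q

        if np.max(np.abs(mu_q - mu_old)) < tol:
            break

    return mu_q, Sigma_q, A_q, B_q


# Simulated-data check: mu_q estimates beta, and sqrt(diag(Sigma_q)) gives the
# approximate posterior standard deviations used as VB standard errors.
rng = np.random.default_rng(0)
n, p = 500, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.standard_normal(n)

mu_q, Sigma_q, A_q, B_q = vb_linear_regression(X, y)
print("VB posterior mean of beta:", np.round(mu_q, 3))
print("VB standard errors:       ", np.round(np.sqrt(np.diag(Sigma_q)), 3))
```

With a conjugate Gaussian/inverse-gamma setup like this, each update has a closed form, which is why VB can be so much faster than Markov chain Monte Carlo on large datasets; the paper's contribution is to show that, for the linear model, this speed need not come at the cost of frequentist validity.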
