Abstract: For the problem of variable selection for the normal linear model, selection criteria such as AIC, Cp, BIC and RIC have fixed dimensionality penalties. Such criteria are shown to correspond to selection of maximum posterior models under implicit hyperparameter choices for a particular hierarchical Bayes formulation. Based on this calibration, we propose empirical Bayes selection criteria that use hyperparameter estimates instead of fixed choices. For obtaining these estimates, both marginal and conditional maximum likelihood methods are considered. As opposed to traditional fixed penalty criteria, these empirical Bayes criteria have dimensionality penalties that depend on the data. Their performance is seen to approximate adaptively the performance of the best fixed-penalty criterion across a variety of orthogonal and nonorthogonal set-ups, including wavelet regression. Empirical Bayes shrinkage estimators of the selected coefficients are also proposed.
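The fixed-penalty criteria mentioned in the abstract all score a candidate subset of predictors by a residual sum of squares term plus a penalty proportional to the subset size; AIC/Cp, BIC and RIC differ only in the per-dimension penalty (2, log n and 2 log p respectively). The sketch below is an illustrative exhaustive-search implementation of this generic penalized form, not the paper's exact formulation; the function names and the use of the full-model residual variance as the scale estimate are assumptions for the example.

```python
import itertools
import numpy as np

def fixed_penalty_scores(X, y, penalty):
    """Score every predictor subset under a generic fixed-penalty criterion.

    Score(S) = RSS(S) / sigma2_hat + penalty * |S|, where
      penalty = 2        roughly corresponds to AIC / Cp,
      penalty = log(n)   to BIC,
      penalty = 2*log(p) to RIC.
    Illustrative sketch only; the scale sigma2_hat is estimated from the
    full model, which is one common convention, not the paper's own.
    """
    n, p = X.shape
    beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.sum((y - X @ beta_full) ** 2) / max(n - p, 1)
    scores = {}
    for k in range(p + 1):
        for subset in itertools.combinations(range(p), k):
            if subset:
                Xs = X[:, list(subset)]
                beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
                rss = np.sum((y - Xs @ beta) ** 2)
            else:
                rss = np.sum(y ** 2)  # null model: no predictors
            scores[subset] = rss / sigma2 + penalty * k
    return scores

def best_subset(X, y, penalty):
    """Return the subset (tuple of column indices) minimizing the score."""
    scores = fixed_penalty_scores(X, y, penalty)
    return min(scores, key=scores.get)
```

Because the penalty multiplies the subset size, a larger penalty can only select a (weakly) smaller model, which is the sense in which these criteria trade off fit against dimensionality; the empirical Bayes criteria of the paper instead let the data determine the effective penalty.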
This is a post-peer-review, pre-copyedit version of an article published in Biometrika.
Keywords: AIC, BIC, conditional likelihood, Cp, hierarchical model, marginal likelihood, model selection, RIC, risk, selection bias, shrinkage estimation, wavelets
George, E. I., & Foster, D. P. (2000). Calibration and Empirical Bayes Variable Selection. Biometrika, 87 (4), 731-747. http://dx.doi.org/10.1093/biomet/87.4.731
Date Posted: 27 November 2017
This document has been peer reviewed.