Statistics Papers

Document Type

Journal Article

Date of this Version

12-2000

Publication Source

Biometrika

Volume

87

Issue

4

Start Page

731

Last Page

747

DOI

10.1093/biomet/87.4.731

Abstract

For the problem of variable selection for the normal linear model, selection criteria such as AIC, Cp, BIC and RIC have fixed dimensionality penalties. Such criteria are shown to correspond to selection of maximum posterior models under implicit hyperparameter choices for a particular hierarchical Bayes formulation. Based on this calibration, we propose empirical Bayes selection criteria that use hyperparameter estimates instead of fixed choices. For obtaining these estimates, both marginal and conditional maximum likelihood methods are considered. As opposed to traditional fixed penalty criteria, these empirical Bayes criteria have dimensionality penalties that depend on the data. Their performance is seen to approximate adaptively the performance of the best fixed-penalty criterion across a variety of orthogonal and nonorthogonal set-ups, including wavelet regression. Empirical Bayes shrinkage estimators of the selected coefficients are also proposed.
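The fixed-penalty criteria contrasted in the abstract can be illustrated with a short sketch. For the normal linear model with known error variance, AIC/Cp, BIC and RIC all score a subset of q predictors as RSS/sigma^2 plus a penalty times q, with penalty 2, log n and 2 log p respectively. The sketch below is a minimal illustration on synthetic data, not the paper's empirical Bayes procedure; the function name and simulated setup are invented for this example.

```python
import itertools
import numpy as np

def fixed_penalty_scores(y, X, subset, sigma2):
    """Score one predictor subset under the fixed-penalty criteria.

    Each criterion has the form RSS/sigma2 + penalty * q, where
    q = |subset| and the penalty is 2 (AIC/Cp), log n (BIC),
    or 2 log p (RIC).
    """
    n, p = X.shape
    Xs = X[:, list(subset)]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    q = len(subset)
    base = rss / sigma2
    return {
        "AIC/Cp": base + 2 * q,
        "BIC": base + np.log(n) * q,
        "RIC": base + 2 * np.log(p) * q,
    }

# Synthetic normal linear model: two strong predictors, four null ones.
rng = np.random.default_rng(0)
n, p = 100, 6
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, 0.0, 0.0, 2.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)
sigma2 = 1.0  # error variance treated as known in this sketch

# Exhaustive search over all nonempty subsets, keeping the best per criterion.
best = {name: None for name in ("AIC/Cp", "BIC", "RIC")}
for q in range(1, p + 1):
    for subset in itertools.combinations(range(p), q):
        for name, score in fixed_penalty_scores(y, X, subset, sigma2).items():
            if best[name] is None or score < best[name][0]:
                best[name] = (score, subset)

for name, (score, subset) in best.items():
    print(name, subset)
```

Because the penalties differ (2 versus log n versus 2 log p), the criteria can disagree on how many predictors to keep; the paper's empirical Bayes criteria replace these fixed penalties with data-dependent ones.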

Copyright/Permission Statement

This is a post-peer-review, pre-copyedit version of an article published in Biometrika.

Keywords

AIC, BIC, conditional likelihood, Cp, hierarchical model, marginal likelihood, model selection, RIC, risk, selection bias, shrinkage estimation, wavelets

Date Posted: 27 November 2017

This document has been peer reviewed.