Improved Minimax Predictive Densities Under Kullback-Leibler Loss

Penn collection
Statistics Papers
Discipline
Physical Sciences and Mathematics
Subject
Bayes rules
heat equation
inadmissibility
multiple shrinkage
multivariate normal
prior distributions
shrinkage estimation
superharmonic marginals
superharmonic priors
unbiased estimate of risk
Author
George, Edward I
Liang, Feng
Xu, Xinyi
Abstract

Let X|μ∼Np(μ,vxI) and Y|μ∼Np(μ,vyI) be independent p-dimensional multivariate normal vectors with common unknown mean μ. Based on only observing X=x, we consider the problem of obtaining a predictive density p̂(y|x) for Y that is close to p(y|μ) as measured by expected Kullback–Leibler loss. A natural procedure for this problem is the (formal) Bayes predictive density p̂U(y|x) under the uniform prior πU(μ)≡1, which is best invariant and minimax. We show that any Bayes predictive density will be minimax if it is obtained by a prior yielding a marginal that is superharmonic or whose square root is superharmonic. This yields wide classes of minimax procedures that dominate p̂U(y|x), including Bayes predictive densities under superharmonic priors. Fundamental similarities and differences with the parallel theory of estimating a multivariate normal mean under quadratic loss are described.
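As a concrete illustration of the baseline procedure described in the abstract (not code from the paper itself): for this normal model, the formal Bayes predictive density under the uniform prior has the closed form p̂U(y|x) = Np(y; x, (vx+vy)I). A minimal sketch in Python with NumPy, where the function name is illustrative:

```python
import numpy as np

def uniform_prior_predictive(y, x, vx, vy):
    """Evaluate the formal Bayes predictive density p_hat_U(y|x)
    under the uniform prior pi_U(mu) = 1.

    Integrating N(y; mu, vy*I) against the posterior N(mu; x, vx*I)
    gives a N(x, (vx+vy)*I) density in y, evaluated here directly.
    """
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    p = y.size                      # dimension of the observation
    v = vx + vy                     # predictive variance per coordinate
    norm_const = (2.0 * np.pi * v) ** (p / 2.0)
    return np.exp(-np.sum((y - x) ** 2) / (2.0 * v)) / norm_const
```

This is the best-invariant, minimax benchmark that the superharmonic-prior predictive densities in the paper are shown to dominate under Kullback–Leibler loss.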

Publication date
2006-05-01
Journal title
The Annals of Statistics