Minimax and Adaptive Inference in Nonparametric Function Estimation
Discipline
Statistics and Probability

Subject
adaptive estimation
Bayes minimax
Besov ball
block thresholding
confidence interval
ellipsoid
information pooling
linear functional
linear minimaxity
minimax
nonparametric regression
oracle
separable rules
sequence model
shrinkage
thresholding
wavelet
white noise model
Abstract
Since Stein’s seminal 1956 paper, shrinkage has played a fundamental role in both parametric and nonparametric inference. This article discusses minimaxity and adaptive minimaxity in nonparametric function estimation. Three interrelated problems are considered: function estimation under global integrated squared error, estimation under pointwise squared error, and nonparametric confidence intervals. Shrinkage is pivotal in the development of both the minimax theory and the adaptation theory. While the three problems are closely connected and the minimax theories bear some similarities, the adaptation theories are strikingly different. For example, in sharp contrast to adaptive point estimation, in many common settings there do not exist nonparametric confidence intervals that adapt to the unknown smoothness of the underlying function. A concise account of these theories is given, and the connections as well as the differences among the three problems are discussed and illustrated through examples.
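As a rough illustration of the shrinkage and thresholding ideas named in the abstract (not a procedure taken from the article itself), the following Python sketch applies soft-threshold shrinkage with the universal threshold in a Gaussian sequence model. The sparse mean vector, the noise level, and the helper name soft_threshold are illustrative assumptions.

```python
import numpy as np

def soft_threshold(y, lam):
    """Soft-threshold shrinkage: move each coordinate toward 0 by lam, clipping at 0."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

# Hypothetical Gaussian sequence model: y_i = theta_i + sigma * z_i, z_i ~ N(0, 1)
rng = np.random.default_rng(0)
n, sigma = 1000, 1.0
theta = np.zeros(n)
theta[:20] = 5.0                       # assumed sparse signal for illustration
y = theta + sigma * rng.standard_normal(n)

lam = sigma * np.sqrt(2 * np.log(n))   # universal threshold sigma * sqrt(2 log n)
theta_hat = soft_threshold(y, lam)

print("average squared error, raw observations :", np.mean((y - theta) ** 2))
print("average squared error, soft thresholding:", np.mean((theta_hat - theta) ** 2))
```

On sparse signals such as the one simulated above, the thresholded estimate typically has a much smaller average squared error than the raw observations, which is the basic shrinkage phenomenon the abstract refers to.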