Statistics Papers

Document Type

Journal Article

Date of this Version

1989

Publication Source

The Annals of Statistics

Volume

17

Issue

1

Start Page

252

Last Page

267

DOI

10.1214/aos/1176347014

Abstract

Assume the standard linear model

X_{n×1} = A_{n×p} θ_{p×1} + ε_{n×1}, where ε has an n-variate normal distribution with zero mean vector and identity covariance matrix. The least squares estimator of the coefficient vector θ is θ̂ ≡ (A'A)⁻¹A'X. It is well known that θ̂ is dominated by James-Stein type estimators under the sum of squared error loss |θ − θ̂|² when p ≥ 3.
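The domination claim above can be checked numerically. The sketch below (illustrative only, not from the paper) takes the simplest case A = I, so the least squares estimator is θ̂ = X, and compares its average squared error against the positive-part James-Stein estimator (1 − (p − 2)/|X|²)₊ X over repeated draws; the true θ and the simulation sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthonormal design A'A = I, taken here as simply A = I (so theta_hat = X).
p = 10                           # dimension; domination requires p >= 3
theta = np.full(p, 0.5)          # an arbitrary true coefficient vector
n_rep = 20000                    # Monte Carlo replications

se_ls, se_js = 0.0, 0.0
for _ in range(n_rep):
    X = theta + rng.standard_normal(p)            # X = A theta + eps, A = I
    theta_ls = X                                  # (A'A)^{-1} A'X = X
    shrink = max(0.0, 1.0 - (p - 2) / (X @ X))    # positive-part James-Stein factor
    theta_js = shrink * X
    se_ls += np.sum((theta_ls - theta) ** 2)
    se_js += np.sum((theta_js - theta) ** 2)

print(f"mean squared error, least squares: {se_ls / n_rep:.3f}")
print(f"mean squared error, James-Stein:   {se_js / n_rep:.3f}")
```

The least squares risk is exactly p under this loss, while the James-Stein average comes out strictly smaller, matching the classical domination result quoted above.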

In this article we discuss the possibility of improving upon θ̂ simultaneously under the "universal" class of losses

{L(|θ − θ̂|) : L(·) any nondecreasing function}.

An estimator that can be so improved is called universally inadmissible (U-inadmissible). Otherwise it is called U-admissible.

We prove that θ̂ is U-admissible for any p when A'A = I. Furthermore, if A'A ≠ I, then θ̂ is U-inadmissible if p is "large enough." In a special case, p ≥ 4 is large enough. The results are surprising. Implications are discussed.

Keywords

decision theory under a broad class of loss functions, James-Stein positive part estimator, admissibility


Date Posted: 27 November 2017

This document has been peer reviewed.