The 1985 Wald Memorial Lectures: An Ancillarity Paradox Which Appears in Multiple Linear Regression
Discipline
Statistics and Probability
Subject
ancillary statistics
multiple linear regression
Abstract
Consider a multiple linear regression in which Y_i, i = 1, ⋯, n, are independent normal variables with variance σ² and E(Y_i) = α + V_i′β, where V_i ∈ R^r and β ∈ R^r. Let α̂ denote the usual least squares estimator of α. Suppose that the V_i are themselves observations of independent multivariate normal random variables with mean 0 and known, nonsingular covariance matrix θ. Then α̂ is admissible under squared error loss if r ≤ 2. Several estimators dominating α̂ when r ≥ 3 are presented. Analogous results are presented for the case where σ² or θ is unknown, and some other generalizations are also considered. It is noted that some of these results for r ≥ 3 appear in earlier papers of Baranchik and of Takada. The {V_i} are ancillary statistics in the above setting. Hence the admissibility of α̂ depends on the distribution of the ancillary statistics, since if {V_i} is fixed instead of random, then α̂ is admissible. This fact contradicts a widely held notion about ancillary statistics; some interpretations and consequences of this paradox are briefly discussed.
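
Below is a minimal simulation sketch of the setting described in the abstract, written in Python with NumPy purely for illustration; it is not code from the paper. It assumes θ is the identity matrix, draws random regressors V_i ~ N(0, θ), generates Y_i = α + V_i′β + ε_i with ε_i ~ N(0, σ²), and computes the usual least squares estimator α̂ of the intercept. The estimators dominating α̂ for r ≥ 3 that are constructed in the paper are not reproduced here.

```python
import numpy as np

# Illustrative simulation of the regression model in the abstract.
# Assumptions for this sketch only: theta = identity, alpha = 1, sigma = 1.
rng = np.random.default_rng(0)

n, r = 50, 5                      # sample size and regressor dimension (r >= 3)
alpha, sigma = 1.0, 1.0           # true intercept and error standard deviation
beta = rng.normal(size=r)         # true slope vector (arbitrary for the demo)
theta = np.eye(r)                 # known, nonsingular covariance matrix of V_i

# Random regressors V_i ~ N(0, theta); {V_i} is ancillary for (alpha, beta).
V = rng.multivariate_normal(np.zeros(r), theta, size=n)

# Responses Y_i = alpha + V_i' beta + eps_i with eps_i ~ N(0, sigma^2).
Y = alpha + V @ beta + rng.normal(scale=sigma, size=n)

# Usual least squares estimator alpha_hat: intercept of the fitted regression.
X = np.column_stack([np.ones(n), V])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
alpha_hat = coef[0]
print("least squares alpha_hat:", alpha_hat)
```

Repeating this simulation over many draws of (V, Y) and comparing mean squared errors against a competing estimator of α would be the natural way to probe the domination results numerically.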