Statistics Papers

Document Type

Journal Article

Date of this Version

5-2017

Publication Source

The Annals of Statistics

Volume

45

Issue

2

Start Page

615

Last Page

646

DOI

10.1214/16-AOS1461

Abstract

Confidence sets play a fundamental role in statistical inference. In this paper, we consider confidence intervals for high-dimensional linear regression with random design. We first establish the convergence rates of the minimax expected length for confidence intervals in the oracle setting where the sparsity parameter is given. The focus is then on the problem of adaptation to sparsity for the construction of confidence intervals. Ideally, an adaptive confidence interval should have its length automatically adjusted to the sparsity of the unknown regression vector, while maintaining a pre-specified coverage probability. It is shown that such a goal is in general not attainable, except when the sparsity parameter is restricted to a small region over which the confidence intervals have the optimal length of the usual parametric rate. It is further demonstrated that the lack of adaptivity is not due to the conservativeness of the minimax framework, but is fundamentally caused by the difficulty of learning the bias accurately.
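As a rough schematic of the dichotomy described in the abstract (our paraphrase, not a verbatim statement from the paper): for inference on a single coordinate of a $k$-sparse regression vector $\beta \in \mathbb{R}^p$ from $n$ samples, the oracle minimax expected length of a confidence interval combines a parametric term and a sparsity-driven bias term,

```latex
% Schematic rate for the minimax expected length (oracle setting,
% k-sparse regression vector, p covariates, n observations):
%   parametric term  +  bias/learning term
\[
  L^{*}(k, n, p) \;\asymp\; \frac{1}{\sqrt{n}} \;+\; \frac{k \log p}{n},
\]
% Adaptation between sparsity levels is only possible where the
% second term is dominated by the parametric 1/sqrt(n) term,
% i.e., over the small region of k in which both rates coincide.
```

so a confidence interval can adapt its length to an unknown sparsity level only over the small region where the $k \log p / n$ term is negligible relative to $n^{-1/2}$, matching the abstract's statement that adaptivity fails outside that region.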

Copyright/Permission Statement

The original published work is available at: https://projecteuclid.org/euclid.aos/1494921952#abstract

Keywords

Adaptivity, confidence interval, coverage probability, expected length, high-dimensional linear regression, minimaxity, sparsity


Date Posted: 27 November 2017

This document has been peer reviewed.