Statistics Papers

Document Type

Journal Article

Date of this Version

2010

Publication Source

The Annals of Statistics

Volume

38

Issue

6

Start Page

3412

Last Page

3444

DOI

10.1214/09-AOS772

Abstract

We study in this paper a smoothness regularization method for functional linear regression and provide a unified treatment of both the prediction and estimation problems. By developing a tool for the simultaneous diagonalization of two positive definite kernels, we obtain sharper results on the minimax rates of convergence and show that smoothness regularized estimators achieve the optimal rates of convergence for both prediction and estimation under conditions weaker than those required by the functional principal components based methods developed in the literature. Despite the generality of the method of regularization, we show that the procedure is easily implementable. Numerical results are obtained to illustrate the merits of the method and to demonstrate the theoretical developments.
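The regularization approach the abstract describes can be illustrated with a minimal numerical sketch, not taken from the paper itself: the functional linear model y_i = ∫ X_i(t) β(t) dt + ε_i is discretized on a grid, and the slope function β is estimated by penalized least squares with a second-difference roughness penalty standing in for the smoothness norm. All names, grid sizes, and the choice of penalty below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 50, 200                          # grid size, sample size (illustrative)
t = np.linspace(0, 1, m)
dt = t[1] - t[0]

# Smooth random predictor curves X_i(t) built from a small Fourier basis.
K = 5
coefs = rng.normal(size=(n, K))
basis = np.array([np.sin((k + 1) * np.pi * t) for k in range(K)])
X = coefs @ basis                       # shape (n, m)

beta = np.sin(2 * np.pi * t)            # true slope function (assumed)
signal = dt * X @ beta                  # discretized integral ∫ X_i(t) β(t) dt
y = signal + 0.01 * rng.normal(size=n)

# Penalized least squares: minimize ||y - A b||^2 + lam * ||D b||^2,
# where A is the discretized integral operator and D the second-difference
# matrix, a crude proxy for the smoothness (Sobolev-type) penalty.
A = dt * X
D = np.diff(np.eye(m), n=2, axis=0)     # shape (m - 2, m)
lam = 1e-6
b_hat = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)

# Prediction accuracy of the regularized estimator on the training curves.
rel_err = np.linalg.norm(A @ b_hat - signal) / np.linalg.norm(signal)
```

This toy version only mimics the structure of the estimator; the paper's theory concerns its minimax-optimal convergence rates, which no finite-grid simulation can establish.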

Keywords

covariance, eigenfunction, eigenvalue, functional linear regression, minimax, optimal convergence rate, principal component analysis, reproducing kernel Hilbert space, Sacks–Ylvisaker conditions, simultaneous diagonalization, slope function, Sobolev space

Date Posted: 27 November 2017

This document has been peer reviewed.