A Reproducing Kernel Hilbert Space Approach to Functional Linear Regression

Penn collection
Statistics Papers
Discipline
Statistics and Probability
Subject
covariance
eigenfunction
eigenvalue
functional linear regression
minimax
optimal convergence rate
principal component analysis
reproducing kernel Hilbert space
Sacks–Ylvisaker conditions
simultaneous diagonalization
slope function
Sobolev space
Author
Yuan, Ming
Cai, T. Tony
Abstract

We study in this paper a smoothness regularization method for functional linear regression and provide a unified treatment of both the prediction and estimation problems. By developing a tool for the simultaneous diagonalization of two positive definite kernels, we obtain sharper results on the minimax rates of convergence and show that smoothness regularized estimators achieve the optimal rates of convergence for both prediction and estimation under conditions weaker than those required by the functional principal components-based methods developed in the literature. Despite the generality of the method of regularization, we show that the procedure is easily implementable. Numerical results are presented to illustrate the merits of the method and to support the theoretical developments.
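To make the "easily implementable" point concrete, here is a minimal sketch of a smoothness-regularized estimator for the functional linear model Y = ∫ X(t)β(t) dt + ε, discretized on a uniform grid. It penalizes a second-difference approximation to ∫ β″(t)² dt, an illustrative Sobolev-type stand-in for the paper's RKHS norm; the grid, the sine basis for the toy curves, and the tuning value `lam` are assumptions for illustration, not the authors' procedure.

```python
import numpy as np

def fit_flr_smooth(X, y, t, lam):
    """Penalized least-squares estimate of the slope function beta(t).

    X   : (n, m) array, each row a curve X_i sampled on the grid t
    y   : (n,) responses
    t   : uniform grid on which the curves are observed
    lam : smoothing parameter (assumed fixed here; tuned in practice)
    """
    n, m = X.shape
    h = t[1] - t[0]                               # uniform grid spacing
    A = X * h                                     # quadrature: integral ~= h * row sum
    D = np.diff(np.eye(m), n=2, axis=0) / h**2    # discrete second derivative
    P = h * D.T @ D                               # roughness penalty matrix ~ int beta''^2
    # penalized normal equations: (A'A/n + lam * P) beta = A'y/n
    return np.linalg.solve(A.T @ A / n + lam * P, A.T @ y / n)

# toy data: random curves in a small sine basis, slope function sin(2*pi*t)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)
basis = np.array([np.sin((k + 1) * np.pi * t) for k in range(5)])
X = rng.standard_normal((200, 5)) @ basis
beta_true = np.sin(2 * np.pi * t)
y = X @ beta_true * (t[1] - t[0]) + 0.1 * rng.standard_normal(200)
beta_hat = fit_flr_smooth(X, y, t, lam=1e-8)
```

In practice `lam` would be chosen by cross-validation or generalized cross-validation rather than fixed in advance.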

Publication date
2010-01-01
Journal title
The Annals of Statistics