Statistics Papers

Document Type

Journal Article

Date of this Version

2004

Publication Source

Annals of Statistics

Volume

32

Issue

4

Start Page

1723

Last Page

1743

DOI

10.1214/009053604000000454

Abstract

The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and successful in many practical applications. In this paper we consider the periodic version of Gaussian kernel regularization. We show, in the white noise model setting, that in function spaces of very smooth functions, such as the infinite-order Sobolev space and the space of analytic functions, the method under consideration is asymptotically minimax; in finite-order Sobolev spaces, the method is rate optimal, and its efficiency in terms of the constant, when compared with the minimax estimator, is reasonably high. The smoothing parameters in the periodic Gaussian regularization can be chosen adaptively without loss of asymptotic efficiency. The results derived in this paper give a partial explanation of the success of the Gaussian reproducing kernel in practice. Simulations are carried out to study the finite-sample properties of the periodic Gaussian regularization.
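For orientation, a minimal sketch of the setting in standard notation. The white noise model and the regularization objective below follow the usual formulations; the periodized kernel and the bandwidth symbol ω are illustrative assumptions, and the exact parameterization used in the paper may differ.

```latex
% White noise model: observe the process Y on [0,1], where f is the
% unknown function, n plays the role of sample size, and W is a
% standard Brownian motion.
\[
  dY(t) = f(t)\,dt + n^{-1/2}\,dW(t), \qquad t \in [0,1].
\]

% Method of regularization: minimize a data-fit term plus a squared
% RKHS-norm penalty with smoothing parameter \lambda, over the
% reproducing kernel Hilbert space H_K of a kernel K.
\[
  \hat f = \operatorname*{arg\,min}_{f \in \mathcal{H}_K}
  \left\{ \int_0^1 f(t)^2\,dt \;-\; 2\int_0^1 f(t)\,dY(t)
          \;+\; \lambda \,\|f\|_{\mathcal{H}_K}^2 \right\}.
\]

% One standard way to obtain a periodic Gaussian kernel is to wrap
% the Gaussian kernel with bandwidth \omega around the unit circle
% (an illustrative choice, not necessarily the paper's):
\[
  K_\omega(s,t) \;=\; \sum_{\nu \in \mathbb{Z}}
  \exp\!\left( -\frac{(s - t - \nu)^2}{2\omega^2} \right).
\]
```

In this formulation the smoothing parameters are λ and ω; the abstract states that they can be chosen adaptively without loss of asymptotic efficiency.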

Keywords

asymptotic minimax risk, Gaussian reproducing kernel, nonparametric estimation, rate of convergence, Sobolev spaces, white noise model


Date Posted: 27 November 2017

This document has been peer reviewed.