Online Nonparametric Regression

Subject

Physical Sciences and Mathematics

Abstract

We establish optimal rates for online regression for arbitrary classes of regression functions in terms of the sequential entropy introduced in [14]. The optimal rates are shown to exhibit a phase transition analogous to the i.i.d./statistical learning case, studied in [16]. In the frequently encountered situation when sequential entropy and i.i.d. empirical entropy match, our results point to the interesting phenomenon that the rates for statistical learning with squared loss and online nonparametric regression are the same. In addition to a non-algorithmic study of minimax regret, we exhibit a generic forecaster that enjoys the established optimal rates. We also provide a recipe for designing online regression algorithms that can be computationally efficient. We illustrate the techniques by deriving existing and new forecasters for the case of finite experts and for online linear regression.
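The abstract mentions deriving existing and new forecasters for the case of finite experts. As a point of reference (not the paper's algorithm), a minimal sketch of the classical exponentially weighted average forecaster for squared loss, assuming expert predictions and outcomes lie in [0, 1] so that the loss is 1/2-exp-concave:

```python
import numpy as np

def ewa_forecaster(expert_preds, outcomes, eta=0.5):
    """Exponentially weighted average forecaster under squared loss.

    expert_preds: (T, N) array of expert predictions in [0, 1].
    outcomes:     (T,) array of observed outcomes in [0, 1].
    eta:          learning rate; eta = 1/2 suits exp-concave squared loss.
    Returns the learner's forecasts, one per round.
    """
    T, N = expert_preds.shape
    log_w = np.zeros(N)                  # log-weights, uniform prior
    forecasts = np.empty(T)
    for t in range(T):
        w = np.exp(log_w - log_w.max())  # normalize in log-space for stability
        w /= w.sum()
        forecasts[t] = w @ expert_preds[t]            # weighted-average prediction
        losses = (expert_preds[t] - outcomes[t]) ** 2
        log_w -= eta * losses                         # multiplicative update
    return forecasts
```

For exp-concave losses this forecaster's cumulative regret against the best single expert is at most ln(N)/eta, independent of the horizon; the nonparametric rates studied in the paper concern much richer comparator classes than a finite expert set.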

Date of presentation

2014-02-12

Collection

Statistics Papers
