Statistics Papers

Document Type

Conference Paper

Date of this Version

2005

Publication Source

Advances in Neural Information Processing Systems

Volume

18

Abstract

It is well known that everything learnable in the difficult online setting, where an arbitrary sequence of examples must be labeled one at a time, is also learnable in the batch setting, where examples are drawn independently from a distribution. We show a result in the opposite direction. We give an efficient conversion algorithm from batch to online that is transductive: it uses future unlabeled data. This demonstrates the equivalence between what is properly and efficiently learnable in the batch model and in a transductive online model.


Date Posted: 27 November 2017