Departmental Papers (CIS)

Document Type

Journal Article

Date of this Version

October 2007


Postprint version. Published in Machine Learning, Volume 68, Issue 3, October 2007, pages 235-265.


Which active learning methods can we expect to yield good performance in learning binary and multi-category logistic regression classifiers? Addressing this question is a natural first step in providing robust solutions for active learning across a wide variety of exponential models, including maximum entropy, generalized linear, log-linear, and conditional random field models. For the logistic regression model we re-derive the variance reduction method known in experimental design circles as 'A-optimality.' We then run comparisons against variants of the two most widely used heuristic schemes, query by committee and uncertainty sampling, to discover which methods work best for different classes of problems and why. We find that among the strategies tested, the experimental design methods are most likely to match or beat a random sample baseline. The heuristic alternatives produced mixed results, with an uncertainty sampling variant called margin sampling providing the most promising performance at very low computational cost. Computational running times of the experimental design methods were a bottleneck to the evaluations. Meanwhile, evaluation of the heuristic methods led to an accumulation of negative results. Such results demonstrate a need for improved active learning methods that will provide reliable performance at a reasonable computational cost.
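The margin sampling heuristic mentioned above can be sketched briefly: given the classifier's predicted class distribution for each unlabeled example, query the example whose top two class probabilities are closest together, i.e. the one the model finds most ambiguous. The following is a minimal illustrative sketch, not the authors' implementation; the function name `margin_sample` and the toy probability pool are hypothetical.

```python
import numpy as np

def margin_sample(probs):
    """Return the index of the unlabeled example with the smallest margin
    between its top two predicted class probabilities (most ambiguous).

    probs: array of shape (n_examples, n_classes), each row a class
    distribution produced by the current classifier (hypothetical input).
    """
    sorted_probs = np.sort(probs, axis=1)            # ascending per row
    margins = sorted_probs[:, -1] - sorted_probs[:, -2]  # top-1 minus top-2
    return int(np.argmin(margins))                   # smallest margin wins

# Toy pool of predicted distributions for three unlabeled examples.
pool = np.array([
    [0.90, 0.05, 0.05],  # confident prediction, large margin (0.85)
    [0.40, 0.35, 0.25],  # ambiguous prediction, small margin (0.05)
    [0.60, 0.30, 0.10],  # moderate margin (0.30)
])
print(margin_sample(pool))  # selects the most ambiguous example: 1
```

The cost per query is a sort over class probabilities for each pooled example, which is why the abstract can describe margin sampling as operating at very low computational cost compared with the experimental design methods.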



Date Posted: 14 January 2008

This document has been peer reviewed.