Streamwise feature selection

Jing Zhou, University of Pennsylvania

Abstract

In streamwise feature selection, new features are sequentially considered for addition to a predictive model. When the space of potential features is large, streamwise feature selection offers many advantages over traditional feature selection methods, which assume that all features are known in advance. Features can be generated dynamically, focusing the search for new features on promising subspaces, and overfitting can be controlled by dynamically adjusting the threshold for adding features to the model. In contrast to traditional forward feature selection algorithms such as stepwise regression, in which at each step all possible features are evaluated and the best one is selected, streamwise feature selection evaluates each feature only once, when it is generated. We describe information-investing and α-investing, two adaptive complexity penalty methods for streamwise feature selection which dynamically adjust the threshold on the error reduction required for adding a new feature. These two methods give false discovery rate style guarantees against overfitting. They differ from standard penalty methods such as AIC, BIC and RIC, which always drastically over- or under-fit in the limit of infinite numbers of non-predictive features. Empirical results show that streamwise regression is competitive with (on small data sets) and superior to (on large data sets) much more compute-intensive feature selection methods such as stepwise regression, and allows feature selection on problems with millions of potential features.

When doing feature selection in multiple simultaneous regressions, one can "borrow strength" across the different regressions to get a more sensitive criterion for deciding which features to include in which regressions. We use information theory to derive the Multiple Inclusion Criterion (MIC), an efficient coding scheme for use in stepwise or streamwise feature selection. Each feature can be added to none, some, or all of the regression models. Experiments show that the MIC approach is useful for selecting a small set of features when predicting multiple responses from the same set of potential features.
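To make the α-investing idea above concrete, the following is a minimal Python sketch. It is an illustration based on the abstract's description, not the dissertation's exact procedure: the threshold rule wealth/(2i) follows one common variant of α-investing, while the initial wealth, the payout earned on acceptance, and the names alpha_investing and payout are illustrative assumptions.

# Minimal sketch of alpha-investing for streamwise feature selection.
# Assumes each arriving feature has already been tested against the
# current model, yielding a p-value; the constants are illustrative.
def alpha_investing(p_values, w0=0.5, payout=0.5):
    wealth = w0          # alpha-wealth available for future tests
    selected = []
    for i, p in enumerate(p_values, start=1):
        alpha_i = wealth / (2 * i)   # threshold adapts to remaining wealth
        if p <= alpha_i:
            selected.append(i - 1)   # feature enters the model
            wealth += payout         # a discovery earns wealth back
        else:
            wealth -= alpha_i        # a rejected test spends wealth
    return selected

# Example: features 0, 2, and 4 are selected from this p-value stream
print(alpha_investing([0.001, 0.30, 0.02, 0.80, 0.0004]))

Because a rejected test spends only a fraction of the remaining wealth while each discovery replenishes it, the threshold adapts to how productive the feature stream has been so far, which is what gives the false discovery rate style control described above.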

Subject Area

Computer science

Recommended Citation

Zhou, Jing, "Streamwise feature selection" (2006). Dissertations available from ProQuest. AAI3246259.
https://repository.upenn.edu/dissertations/AAI3246259
