We present an information theoretic approach to feature selection when the data possesses feature classes. Feature classes are pervasive in real data. For example, in gene expression data, the genes that serve as features may be divided into classes based on their membership in gene families or pathways. In word sense disambiguation or named entity extraction, features fall into classes such as adjacent words, their parts of speech, and the topic and venue of the document containing the word. When predictive features occur predominantly in a small number of feature classes, our information theoretic approach significantly improves feature selection. Experiments on real and synthetic data demonstrate substantial improvement in predictive accuracy over the standard L0 penalty-based stepwise and streamwise feature selection methods, as well as over Lasso and Elastic Nets, all of which are oblivious to the existence of feature classes.
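The idea of charging fewer bits for features from already-used classes can be illustrated with a minimal streamwise-selection sketch. This is an assumption-laden illustration, not the paper's algorithm: it uses a crude two-part code (class index plus within-class index) and a squared-error surrogate for the likelihood term, and all names (`streamwise_select`, `class_penalty`) are hypothetical.

```python
import numpy as np

def class_penalty(feat_class, used_classes, n_classes, class_size):
    # Bits to name a feature: a feature from a class that has already
    # contributed a selected feature needs no class-index bits.
    if feat_class in used_classes:
        return np.log2(class_size)
    return np.log2(n_classes) + np.log2(class_size)

def streamwise_select(X, y, classes, n_classes, class_size):
    # Accept each feature, in stream order, iff the bits saved on the
    # residuals exceed the bits spent naming the feature.
    n, p = X.shape
    selected, used_classes = [], set()
    rss = float(np.sum((y - y.mean()) ** 2))
    for j in range(p):
        cols = selected + [j]
        beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        new_rss = float(np.sum((y - X[:, cols] @ beta) ** 2))
        # Crude description-length gain: 0.5 * n * log2(rss ratio).
        gain_bits = 0.5 * n * (np.log2(rss) - np.log2(max(new_rss, 1e-12)))
        if gain_bits > class_penalty(classes[j], used_classes, n_classes, class_size):
            selected.append(j)
            used_classes.add(classes[j])
            rss = new_rss
    return selected

# Synthetic data: 4 classes of 10 features; only class 0 is predictive.
rng = np.random.default_rng(0)
n, n_classes, class_size = 200, 4, 10
classes = np.repeat(np.arange(n_classes), class_size)
X = rng.normal(size=(n, n_classes * class_size))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)
picked = streamwise_select(X, y, classes, n_classes, class_size)
```

Because the first selected feature from class 0 pays the full class-plus-feature cost while its siblings pay only the within-class cost, predictive features concentrated in one class face a lower bar, which is the effect the abstract describes.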
feature extraction, pattern classification, feature selection, gene expression data, information theoretic approach, multiple feature classes, word sense disambiguation, Minimum Description Length Coding
Paramveer Singh Dhillon, Dean P. Foster, and Lyle H. Ungar, "Efficient Feature Selection in the Presence of Multiple Feature Classes," December 2008.
Date Posted: 29 June 2009