Efficient Feature Selection in the Presence of Multiple Feature Classes
Subject
pattern classification
feature selection
feature extraction
gene expression data
information theoretic approach
multiple feature classes
word sense disambiguation
Minimum Description Length Coding
Abstract
We present an information-theoretic approach to feature selection when the data possesses feature classes. Feature classes are pervasive in real data. For example, in gene expression data, the genes which serve as features may be divided into classes based on their membership in gene families or pathways. When doing word sense disambiguation or named entity extraction, features fall into classes including adjacent words, their parts of speech, and the topic and venue of the document the word is in. When predictive features occur predominantly in a small number of feature classes, our information-theoretic approach significantly improves feature selection. Experiments on real and synthetic data demonstrate substantial improvement in predictive accuracy over the standard L0 penalty-based stepwise and streamwise feature selection methods, as well as over the Lasso and Elastic Net, all of which are oblivious to the existence of feature classes.
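To make the idea concrete, here is a minimal sketch of the kind of class-aware, MDL-style streamwise selection the abstract describes. This is an illustration, not the paper's exact coding scheme: the synthetic data, the class sizes, and the specific bit costs (log2 of the number of classes to name a class the first time it is used, plus log2 of the class size to name a feature within it) are all assumptions made for the example. The key property it demonstrates is that once a class has produced a useful feature, further features from that class become cheaper to code and are therefore easier to accept.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (sizes are illustrative, not from the paper):
# 6 feature classes of 10 features each; the 3 truly predictive
# features all fall in class 0, mimicking the premise that
# predictive features concentrate in a few classes.
n, n_classes, class_size = 200, 6, 10
p = n_classes * class_size
X = rng.standard_normal((n, p))
true_feats = [0, 1, 2]
y = X[:, true_feats] @ np.array([3.0, -2.0, 2.5]) + 0.5 * rng.standard_normal(n)
feature_class = np.repeat(np.arange(n_classes), class_size)

def fit_rss(idx):
    """Residual sum of squares of an OLS fit (with intercept) on features idx."""
    A = np.column_stack([np.ones(n), X[:, idx]]) if idx else np.ones((n, 1))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def data_bits(idx):
    # Bits to code the residuals under a Gaussian model, up to constants.
    return 0.5 * n * np.log2(fit_rss(idx) / n)

selected, used_classes = [], set()
for j in range(p):  # streamwise: each feature is considered once, in order
    c = feature_class[j]
    # Illustrative coding cost: pay log2(#classes) bits the first time a
    # class is used, plus log2(class size) bits to name the feature within
    # its class. Features from already-used classes are cheaper to add.
    model_bits = np.log2(class_size) + (np.log2(n_classes) if c not in used_classes else 0.0)
    # Accept only if the improved fit saves more bits than the feature costs.
    if data_bits(selected) - data_bits(selected + [j]) > model_bits:
        selected.append(j)
        used_classes.add(c)
```

A class-oblivious streamwise method would charge every candidate the same cost (roughly log2(p) bits); the class-aware coding above instead shifts probability mass toward classes that have already proven useful, which is what yields the gains the abstract reports when predictive features cluster in a few classes.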