Feature selection by combining multiple methods

Lior Rokach*, Barak Chizi, Oded Maimon

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

19 Scopus citations

Abstract

Feature selection is the process of identifying the relevant features in a dataset and discarding everything else as irrelevant or redundant. Because feature selection reduces the dimensionality of the data, it enables learning algorithms to operate more effectively and rapidly. In some cases classification performance is improved; in others, the resulting classifier is more compact and easier to interpret. Much work has been done on feature selection methods for creating ensembles of classifiers; these works examine how feature selection can help an ensemble of classifiers gain diversity. This paper examines a different direction, namely whether ensemble methodology can be used to improve feature selection performance. We present a general framework for creating several feature subsets and then combining them into a single subset. Theoretical and empirical results presented in this paper validate the hypothesis that this approach can help find a better feature subset.
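The abstract's framework, combining the outputs of several feature selection methods into one subset, can be illustrated with a minimal sketch. The example below is not the authors' algorithm; the choice of base selectors (chi-squared, mutual information, random-forest importances), the subset size k, the dataset, and the majority-vote combination rule are all assumptions made for demonstration only.

```python
"""Hedged sketch: combine several feature selection methods by majority vote."""
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset (assumption)
n_features = X.shape[1]
k = 10                                       # size of each base subset (assumption)

# Each base method proposes its own subset of k feature indices.
subsets = []

# Method 1: chi-squared scoring (valid here because features are non-negative).
subsets.append(set(SelectKBest(chi2, k=k).fit(X, y).get_support(indices=True)))

# Method 2: mutual information between each feature and the class label.
subsets.append(
    set(SelectKBest(mutual_info_classif, k=k).fit(X, y).get_support(indices=True))
)

# Method 3: top-k features ranked by random-forest importance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
subsets.append(set(np.argsort(rf.feature_importances_)[-k:]))

# Combine the subsets: keep features chosen by a majority of the base methods.
votes = np.zeros(n_features, dtype=int)
for subset in subsets:
    for feature in subset:
        votes[feature] += 1
combined = np.flatnonzero(votes >= 2)

print("Combined feature subset (indices):", combined.tolist())
```

Other combination rules (union, intersection, weighted voting by validation accuracy) fit the same pattern: run the base selectors independently, then merge their subsets by a fixed rule.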

Original language: English
Title of host publication: Advances in Web Intelligence and Data Mining
Editors: Mark Last, Piotr S. Szczepaniak, Zeev Volkovich, Abraham Kandel
Publisher: Springer Berlin Heidelberg
Pages: 295-304
Number of pages: 10
ISBN (Print): 3540338799, 9783540338796
DOIs
State: Published - 2006

Publication series

Name: Studies in Computational Intelligence
Volume: 23
ISSN (Print): 1860-949X
