Anytime algorithm for feature selection

Mark Last, Abraham Kandel, Oded Maimon, Eugene Eberbach

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Feature selection is used to improve the performance of learning algorithms by finding a minimal subset of relevant features. Since the process of feature selection is computationally intensive, a trade-off between the quality of the selected subset and the computation time is required. In this paper, we present a novel anytime algorithm for feature selection, which gradually improves the quality of its results as computation time increases. The algorithm is interruptible, i.e., it can be stopped at any time and will provide a partial subset of selected features. The quality of results is monitored by a new measure: fuzzy information gain. The algorithm's performance is evaluated on several benchmark datasets.
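To illustrate the interruptible behaviour described in the abstract, the sketch below shows a generic anytime forward-selection loop in Python. This is a minimal sketch under stated assumptions: the greedy search strategy, the time-budget mechanism, and the quality function are all illustrative placeholders, not the paper's method, and the fuzzy information gain measure itself is not reproduced here.

    import time
    from typing import Callable, Sequence

    def anytime_feature_selection(
        features: Sequence[str],
        quality: Callable[[set], float],  # placeholder subset-quality measure (assumption)
        time_budget_s: float,
    ) -> set:
        """Greedy forward selection that can be interrupted at any time.

        The partially built subset is always a valid (if suboptimal) result,
        which is the defining property of an interruptible anytime algorithm.
        """
        selected: set = set()
        remaining = set(features)
        deadline = time.monotonic() + time_budget_s

        while remaining and time.monotonic() < deadline:
            # Pick the feature whose addition most improves the quality measure.
            best = max(remaining, key=lambda f: quality(selected | {f}))
            if quality(selected | {best}) <= quality(selected):
                break  # no further improvement; stop early
            selected.add(best)
            remaining.remove(best)

        return selected  # partial result if time ran out, full result otherwise

The key design point illustrated here is that more computation time allows more candidate features to be evaluated and added, so the quality of the returned subset improves monotonically with the time budget.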

Original language: English
Title of host publication: Rough Sets and Current Trends in Computing - 2nd International Conference, RSCTC 2000, Revised Papers
Editors: Wojciech Ziarko, Yiyu Yao
Publisher: Springer Verlag
Pages: 532-539
Number of pages: 8
ISBN (Print): 3540430741, 9783540430742
State: Published - 2001
Event: 2nd International Conference on Rough Sets and Current Trends in Computing, RSCTC 2000 - Banff, Canada
Duration: 16 Oct 2000 - 19 Oct 2000

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2005
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 2nd International Conference on Rough Sets and Current Trends in Computing, RSCTC 2000
Country/Territory: Canada
City: Banff
Period: 16/10/00 - 19/10/00

Keywords

  • Anytime algorithms
  • Feature selection
  • Fuzzy information gain
  • Information-theoretic network
