f-Entropies, probability of error, and feature selection

Moshe Ben-Bassat*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

55 Scopus citations

Abstract

The f-entropy family of information measures, u(p_1, ..., p_m) = Σ_k f(p_k) with f concave (e.g., Shannon (1948), Bell Syst. Tech. J. 27, 379-423, 623-656; the quadratic entropy; Daróczy (1970), Inform. Contr. 16, 36-51; etc.), is considered. A characterization of the tightest upper and lower bounds on f-entropies in terms of the probability of error is presented. These bounds are used to derive the dual bounds, i.e., the tightest lower and upper bounds on the probability of error in terms of f-entropies. Concerning the use of f-entropies as a tool for feature selection, it is proved that no member of this family induces, over an arbitrary set of features, the same preference order as the probability-of-error rule.
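For concreteness, here is a minimal Python sketch (not from the paper) of two members of this family, the Shannon and quadratic entropies, alongside the probability of error that the paper's bounds relate them to. The function names and the example posterior vector are illustrative assumptions, not notation from the article.

```python
import math

def f_entropy(p, f):
    """An f-entropy u(p_1, ..., p_m) = sum_k f(p_k) for a concave generator f."""
    return sum(f(pk) for pk in p)

def shannon(x):
    """Shannon generator: f(x) = -x log2(x), with f(0) = 0 (Shannon, 1948)."""
    return -x * math.log2(x) if x > 0 else 0.0

def quadratic(x):
    """Quadratic generator: f(x) = x(1 - x), so the entropy is 1 - sum_k p_k^2."""
    return x * (1.0 - x)

def prob_error(p):
    """Bayes probability of error for a class-posterior vector p: 1 - max_k p_k."""
    return 1.0 - max(p)

posterior = [0.6, 0.3, 0.1]  # hypothetical class posteriors
print(f_entropy(posterior, shannon))    # ~1.295 bits
print(f_entropy(posterior, quadratic))  # 0.54
print(prob_error(posterior))            # 0.4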

Original language: English
Pages (from-to): 227-242
Number of pages: 16
Journal: Information and Control
Volume: 39
Issue number: 3
DOIs
State: Published - Dec 1978
Externally published: Yes

Funding

Funders and funder numbers:
- Health Resources and Services Administration
- U.S. Public Health Service, RO1HS01474
