Relations between entropy and error probability

Meir Feder*, Neri Merhav

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The relation between the entropy of a discrete random variable and the minimum attainable probability of error made in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, we derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy. As a consequence of this relation, a channel coding theorem for the equivocation is presented. At a rate R < C, where C is the channel capacity, it follows straightforwardly from the classical channel coding theorem and the bounds above that the equivocation can be made arbitrarily small (exponentially fast in the block length). This result is proved directly for DMCs (discrete memoryless channels), and from this proof it is further concluded that for R ≥ C the equivocation achieves its minimal value of R - C at a rate of n^{-1/2}, where n is the block length.
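The two bounds named in the abstract can be made concrete. For an M-ary random variable X with minimal guessing error eps = 1 - max_x P(x), Fano's inequality reads H(X) <= h(eps) + eps log2(M - 1), which implicitly lower-bounds eps in terms of H(X); the paper's converse is a tight upper bound on eps in terms of H(X). Below is a minimal numerical sketch in Python, assuming a piecewise-linear form for the converse bound (the straight line through the points (log2 k, (k-1)/k) for k = 1, 2, ...); this form and the bisection routine are illustrative assumptions, and the exact statement of the tight bound should be checked against the paper itself.

import numpy as np

def entropy_bits(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def min_error(p):
    """Minimal probability of error in guessing X ~ p (guess the mode)."""
    return 1.0 - float(np.max(p))

def binary_entropy(eps):
    """Binary entropy function h(eps) in bits."""
    if eps <= 0.0 or eps >= 1.0:
        return 0.0
    return float(-eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps))

def fano_lower_bound(H, M):
    """Fano lower bound on the error probability of an M-ary source:
    the smallest eps with h(eps) + eps*log2(M-1) >= H, found by
    bisection (the left-hand side is increasing on [0, (M-1)/M])."""
    lo, hi = 0.0, (M - 1) / M
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) + mid * np.log2(M - 1) < H:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def converse_upper_bound(H):
    """Assumed piecewise-linear upper bound on the minimal error
    probability given H bits: on each segment log2(k) <= H <= log2(k+1)
    it is the line through (log2 k, (k-1)/k) and (log2(k+1), k/(k+1))."""
    k = int(np.floor(2.0 ** H))            # segment index: log2 k <= H < log2(k+1)
    lo, hi = np.log2(k), np.log2(k + 1)
    t = (H - lo) / (hi - lo)
    return (k - 1) / k + t * (k / (k + 1) - (k - 1) / k)

# Example: compare both bounds with the true minimal error probability.
p = [0.6, 0.2, 0.1, 0.1]
H = entropy_bits(p)                        # about 1.571 bits
print(fano_lower_bound(H, len(p)))         # about 0.38
print(min_error(p))                        # 0.40
print(converse_upper_bound(H))             # about 0.66

Note that on the first segment (0 <= H <= 1 bit) the assumed upper bound reduces to eps <= H/2, consistent with the elementary binary inequality h(eps) >= 2*eps for eps <= 1/2.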

Original language: English
Title of host publication: Proceedings of the 1993 IEEE International Symposium on Information Theory
Publisher: IEEE
Pages: 72
Number of pages: 1
ISBN (Print): 0780308786
State: Published - 1993
Event: Proceedings of the 1993 IEEE International Symposium on Information Theory - San Antonio, TX, USA
Duration: 17 Jan 1993 - 22 Jan 1993

Publication series

Name: Proceedings of the 1993 IEEE International Symposium on Information Theory

Conference

Conference: Proceedings of the 1993 IEEE International Symposium on Information Theory
City: San Antonio, TX, USA
Period: 17/01/93 - 22/01/93
