Relations Between Entropy and Error Probability

Meir Feder, Neri Merhav

Research output: Contribution to journal › Article › peer-review

Abstract

The relation between the entropy of a discrete random variable and the minimum attainable probability of error made in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, we derive a converse result, namely, a tight upper bound on the minimal error probability in terms of the entropy. Both bounds are sharp, and they also relate the error probability of the maximum a posteriori (MAP) rule to the conditional entropy (equivocation), which is a useful uncertainty measure in several applications. Combining this relation and the classical channel coding theorem, we present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for DMC's, and from this proof it is further concluded that for R > C the equivocation achieves its minimal value of R - C at a rate of n^{-1/2}, where n is the block length.
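The abstract does not reproduce the bounds themselves, so the following is a minimal numerical sketch rather than the paper's derivation: it computes the entropy and the MAP error probability of a discrete distribution, inverts Fano's inequality (with no side information) by bisection to get a lower bound on the error probability, and compares against the piecewise-linear upper bound commonly associated with this paper, assumed here to interpolate the points (log2 k, 1 - 1/k). The distribution p and all helper names are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of a discrete distribution, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def map_error(p):
    # Minimal error probability when guessing X with no observation:
    # always guess the most likely value.
    return 1.0 - float(np.max(p))

def fano_lower_bound(H, M):
    # Fano's inequality (no side information): H(X) <= h(Pe) + Pe*log2(M-1).
    # The right-hand side is increasing in Pe on [0, (M-1)/M], so bisection
    # recovers the smallest Pe consistent with entropy H.
    def h(x):
        if x <= 0.0 or x >= 1.0:
            return 0.0
        return -x * np.log2(x) - (1 - x) * np.log2(1 - x)
    lo, hi = 0.0, (M - 1) / M
    for _ in range(100):
        mid = (lo + hi) / 2
        if h(mid) + mid * np.log2(M - 1) < H:
            lo = mid
        else:
            hi = mid
    return lo

def piecewise_upper_bound(H):
    # Assumed form of the converse bound: piecewise-linear interpolation
    # between the points (log2(k), 1 - 1/k), k = 1, 2, ...
    k = 1
    while np.log2(k + 1) < H:
        k += 1
    H0, H1 = np.log2(k), np.log2(k + 1)
    e0, e1 = 1 - 1 / k, 1 - 1 / (k + 1)
    return e0 + (e1 - e0) * (H - H0) / (H1 - H0)

p = np.array([0.5, 0.25, 0.15, 0.10])   # illustrative distribution
H = entropy_bits(p)
pe = map_error(p)
print(f"H(X) = {H:.4f} bits, MAP error = {pe:.4f}")
print(f"Fano lower bound   : {fano_lower_bound(H, len(p)):.4f} <= Pe")
print(f"Converse upper bound: Pe <= {piecewise_upper_bound(H):.4f}")
```

For the sample distribution this prints an entropy of about 1.74 bits and a MAP error of 0.5, which falls between the two bounds, consistent with the relation the abstract describes.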

Original language: English
Pages (from-to): 259-266
Number of pages: 8
Journal: IEEE Transactions on Information Theory
Volume: 40
Issue number: 1
DOIs
State: Published - Jan 1994

Keywords

  • Entropy
  • Fano's inequality
  • channel coding theorem
  • equivocation
  • error probability
  • predictability
