A new upper bound on the reliability function of the Gaussian channel

Alexei E. Ashikhmin, Alexander Barg, Simon N. Litsyn

Research output: Contribution to journal › Article › peer-review


We derive a new upper bound on the exponent of error probability of decoding for the best possible codes in the Gaussian channel. This bound is tighter than the known upper bounds (the sphere-packing and minimum-distance bounds proved in Shannon's classical 1959 paper and their low-rate improvement by Kabatiansky and Levenshtein). The proof is accomplished by studying asymptotic properties of codes on the sphere S^{n-1}(R). First we prove a general lower bound on the distance distribution of codes of large size. To derive specific estimates of the distance distribution, we study the asymptotic behavior of the Jacobi polynomials P_k^{(a,b)} as k → ∞. Since on the average there are many code vectors in the vicinity of the transmitted vector x, one can show that the probability of confusing x and one of these vectors cannot be too small. This proves a lower bound on the error probability of decoding and the upper bound announced in the title.
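The Jacobi polynomial asymptotics the abstract refers to can be explored numerically. The following sketch (not from the paper; it assumes SciPy's `eval_jacobi` and illustrative parameters a = 2, b = 3) checks the endpoint identity P_k^{(a,b)}(1) = C(k+a, k) and shows that inside (-1, 1) the polynomials oscillate with amplitude shrinking roughly like k^{-1/2}, which is the behavior exploited in the distance-distribution estimates.

```python
# Numerical sketch of Jacobi polynomial behavior as k grows.
# Parameters a, b are illustrative choices, not values from the paper.
import numpy as np
from scipy.special import eval_jacobi, binom

a, b = 2.0, 3.0

# Endpoint identity: P_k^{(a,b)}(1) = C(k + a, k).
for k in (1, 5, 20):
    assert np.isclose(eval_jacobi(k, a, b, 1.0), binom(k + a, k))

# Oscillation in the interior of (-1, 1): the sqrt(k)-scaled maximum
# amplitude stays bounded as k -> infinity (Darboux-type asymptotics).
x = np.linspace(-0.9, 0.9, 1001)
for k in (10, 100, 1000):
    amp = np.sqrt(k) * np.max(np.abs(eval_jacobi(k, a, b, x)))
    print(k, round(amp, 3))
```

The printed scaled amplitudes remain of the same order as k grows, consistent with the k^{-1/2} decay of the oscillation envelope on compact subsets of (-1, 1).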

Original language: English
Pages (from-to): 1945-1961
Number of pages: 17
Journal: IEEE Transactions on Information Theory
Issue number: 6
State: Published - Sep 2000


Keywords:
  • Distance distribution
  • Error probability of decoding
  • Jacobi polynomials
  • Spherical codes


