Iterative approximate linear programming decoding of LDPC codes with linear complexity

David Burshtein*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

50 Scopus citations

Abstract

The problem of low-complexity linear programming (LP) decoding of low-density parity-check (LDPC) codes is considered. An iterative algorithm, similar to min-sum and belief propagation, for efficient approximate solution of this problem was proposed by Vontobel and Koetter. In this paper, the convergence rate and computational complexity of this algorithm are studied using a scheduling scheme that we propose. In particular, we are interested in obtaining a feasible vector in the LP decoding problem that is close to optimal in the following sense: the distance, normalized by the block length, between the minimum and the objective function value of this approximate solution can be made arbitrarily small. It is shown that such a feasible vector can be obtained with a computational complexity that scales linearly with the block length. Combined with previous results showing that the LP decoder can correct some fixed fraction of errors, we conclude that this error correction can be achieved with linear computational complexity. This is achieved by first applying the iterative LP decoder, which decodes the correct transmitted codeword up to an arbitrarily small fraction of erroneous bits, and then correcting the remaining errors using some standard method. These conclusions are also extended to generalized LDPC codes.
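To make the underlying optimization concrete, the following is a minimal sketch of the exact LP decoding relaxation (Feldman's fundamental-polytope formulation) that the paper's iterative algorithm approximates. It uses a generic off-the-shelf LP solver rather than the iterative min-sum-like scheme studied in the paper, and the tiny parity-check matrix and BSC cost vector are illustrative assumptions, not taken from the paper.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def lp_decode(H, gamma):
    """LP decoding over the fundamental polytope (Feldman-style relaxation).

    Minimize gamma . x subject to, for each check j and each odd-sized
    subset S of its neighborhood N(j):
        sum_{i in S} x_i - sum_{i in N(j)\\S} x_i <= |S| - 1,
    with 0 <= x_i <= 1. (Enumerating subsets is exponential in the check
    degree; the paper's iterative algorithm avoids solving this LP directly.)
    """
    m, n = H.shape
    A, b = [], []
    for j in range(m):
        nbrs = np.flatnonzero(H[j])
        for r in range(1, len(nbrs) + 1, 2):          # odd-sized subsets only
            for S in itertools.combinations(nbrs, r):
                row = np.zeros(n)
                row[list(S)] = 1.0
                row[[i for i in nbrs if i not in S]] = -1.0
                A.append(row)
                b.append(len(S) - 1)
    res = linprog(gamma, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.x

# Toy check matrix: checks x0+x1 = 0 and x1+x2 = 0 (mod 2),
# so the only codewords are 000 and 111.
H = np.array([[1, 1, 0],
              [0, 1, 1]])

# BSC log-likelihood-ratio costs for received word 010 (crossover p = 0.1):
# gamma_i > 0 favors bit 0, gamma_i < 0 favors bit 1.
p = 0.1
llr = np.log((1 - p) / p)
gamma = np.array([llr, -llr, llr])

x = lp_decode(H, gamma)
print(np.round(x).astype(int))   # -> [0 0 0]: the single flipped bit is corrected
```

Here the LP output is integral and equals the transmitted codeword; in general the relaxation can also return fractional pseudocodewords, which is the regime where approximate iterative solvers and their convergence guarantees matter.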

Original language: English
Pages (from-to): 4835-4859
Number of pages: 25
Journal: IEEE Transactions on Information Theory
Volume: 55
Issue number: 11
State: Published - 2009

Funding

Funder: Israel Science Foundation
Funder number: 927/05

Keywords

- Iterative decoding
- Linear programming decoding
- Low-density parity-check (LDPC) codes
