Abstract
We consider decoding of binary linear Tanner codes using message-passing iterative decoding and linear-programming (LP) decoding over memoryless binary-input output-symmetric (MBIOS) channels. We present new certificates that are based on a combinatorial characterization of the local optimality of a codeword in irregular Tanner codes with respect to any MBIOS channel. This characterization generalizes those of Arora et al. (Proc. ACM Symp. on Theory of Computing, 2009) and Vontobel (Proc. Inf. Theory and Appl. Workshop, 2010) and is based on a conical combination of normalized weighted subtrees in the computation trees of the Tanner graph. These subtrees may have any finite height $h$ (even equal to or greater than half of the girth of the Tanner graph). In addition, the degrees of local-code nodes in these subtrees are not restricted to two (i.e., the subtrees are not restricted to skinny trees). We prove that local optimality in this new characterization implies maximum-likelihood (ML) optimality and LP optimality, and show that a certificate can be computed efficiently. We also present a new message-passing iterative decoding algorithm, called normalized weighted min-sum (NWMS). NWMS decoding is a belief-propagation (BP) type algorithm that applies to any irregular binary Tanner code with single parity-check local codes (e.g., low-density and high-density parity-check codes). We prove that if a codeword is locally optimal with respect to height parameter $h$ (where, notably, $h$ is not limited by the girth of the Tanner graph), then NWMS decoding finds it in $h$ iterations. Hence the decoding guarantee of NWMS applies whenever a locally optimal codeword exists, and because local optimality of a codeword implies that it is the unique ML codeword, the guarantee also provides an ML certificate for that codeword. Finally, we apply the new local-optimality characterization to regular Tanner codes and prove lower bounds on the noise thresholds of LP decoding over MBIOS channels. When the noise is below these lower bounds, the probability that LP decoding fails to decode the transmitted codeword decays doubly exponentially in the girth of the Tanner graph.
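To make the message-passing setting concrete, the sketch below implements a plain normalized (weighted) min-sum decoder for a binary code with single parity-check constraints, run for a fixed number of iterations $h$. This is only an illustration in the spirit of NWMS as summarized in the abstract: the actual NWMS weights are derived from the local-optimality characterization in the paper, whereas the single normalization factor `alpha`, the function name `normalized_min_sum_decode`, and the uniform weighting used here are simplifying assumptions, not the authors' construction.

```python
# Illustrative sketch of a normalized (weighted) min-sum decoder for a binary
# code defined by a parity-check matrix H (single parity-check local codes).
# Assumptions: every check node has degree >= 2, LLR convention
# llr = log(P(y|0)/P(y|1)), and a single normalization weight `alpha`
# (a stand-in for the level-dependent weights used by NWMS in the paper).

import numpy as np

def normalized_min_sum_decode(H, llr, num_iters, alpha=0.8):
    """Return a hard-decision codeword estimate after num_iters iterations."""
    m, n = H.shape
    msg_vc = H.astype(float) * llr      # variable-to-check messages, init with channel LLRs
    msg_cv = np.zeros((m, n))           # check-to-variable messages

    for _ in range(num_iters):
        # Check-node update (min-sum): for each edge (c, v), combine the signs
        # and the minimum magnitude of all *other* incoming messages at check c,
        # then scale by the normalization weight alpha.
        for c in range(m):
            vs = np.flatnonzero(H[c])
            incoming = msg_vc[c, vs]
            signs = np.sign(incoming) + (incoming == 0)   # treat zero as +1
            mags = np.abs(incoming)
            total_sign = np.prod(signs)
            for i, v in enumerate(vs):
                others = np.delete(mags, i)
                msg_cv[c, v] = alpha * (total_sign * signs[i]) * others.min()

        # Variable-node update: channel LLR plus all other incoming check messages.
        for v in range(n):
            cs = np.flatnonzero(H[:, v])
            total = llr[v] + msg_cv[cs, v].sum()
            for c in cs:
                msg_vc[c, v] = total - msg_cv[c, v]

    # Final beliefs and hard decision (positive belief -> bit 0).
    beliefs = llr + (H * msg_cv).sum(axis=0)
    return (beliefs < 0).astype(int)
```

As a usage note under the same assumptions, one would pass the channel LLRs of an MBIOS channel as `llr` and set `num_iters` to the height parameter $h$; the choice `alpha < 1` reflects the usual damping of min-sum overestimates, whereas NWMS itself prescribes specific weights tied to its local-optimality certificate.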
| Original language | English |
| --- | --- |
| Article number | 6626647 |
| Pages (from-to) | 191-211 |
| Number of pages | 21 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 60 |
| Issue number | 1 |
| DOIs | |
| State | Published - Jan 2014 |
Keywords
- Error bounds
- Tanner codes
- factor graphs
- graph cover
- LDPC codes
- linear programming (LP) decoding
- local optimality
- max-product algorithm
- maximum-likelihood (ML) certificate
- memoryless binary-input output-symmetric (MBIOS) channel
- message-passing algorithms
- min-sum algorithm
- thresholds