TY - JOUR
T1 - Improved bounds on the word error probability of RA(2) codes with linear-programming-based decoding
AU - Halabi, Nissim
AU - Even, Guy
PY - 2005/1
Y1 - 2005/1
N2 - This paper deals with the linear-programming-based decoding algorithm of Feldman and Karger for repeat-accumulate "turbo-like" codes. We present a new structural characterization that captures the event that decoding fails. Based on this characterization, we develop polynomial-time algorithms that, given an RA(2) code, compute upper and lower bounds on the word error probability Pw for the binary-symmetric and additive white Gaussian noise (AWGN) channels. Our experiments with an implementation of these algorithms demonstrate, in many interesting cases, an improvement in the upper bound on the word error probability by a factor of over 1000 compared to the bounds of Feldman et al. The experiments also indicate that the improvement in the upper bound increases as the codeword length increases and the channel noise decreases. The computed lower bounds on the word error probability in our experiments are roughly ten times smaller than the upper bounds.
KW - Linear-programming-based decoding
KW - Repeat-accumulate codes
KW - Turbo codes
KW - Word error rate
UR - http://www.scopus.com/inward/record.url?scp=12444264912&partnerID=8YFLogxK
U2 - 10.1109/TIT.2004.839509
DO - 10.1109/TIT.2004.839509
M3 - Article
AN - SCOPUS:12444264912
SN - 0018-9448
VL - 51
SP - 265
EP - 280
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 1
ER -