TY - GEN
T1 - Asymptotic MMSE analysis under sparse representation modeling
AU - Huleihel, Wasim
AU - Merhav, Neri
PY - 2014
Y1 - 2014
AB - Compressed sensing is a signal processing technique in which data is acquired directly in a compressed form. Two modeling approaches can be considered: the worst-case (Hamming) approach and a statistical approach, in which the signals are modeled as random processes rather than as individual sequences. In this paper, the second approach is studied. Accordingly, we consider a model of the form Y = HX + W, where each component of X is given by Xi = SiUi, where {Ui} are independent and identically distributed (i.i.d.) Gaussian random variables, {Si} are binary random variables independent of {Ui} and not necessarily i.i.d., H ∈ ℝ^(k×n) is a random matrix with i.i.d. entries, and W is white Gaussian noise. Using a direct relationship between optimum estimation and certain partition functions, and by invoking methods from statistical mechanics and from random matrix theory, we derive an asymptotic formula for the minimum mean-square error (MMSE) of estimating the input vector X given Y and H, as k, n → ∞, with the measurement rate R = k/n held fixed. In contrast to previous derivations, which are based on the replica method, the analysis carried out in this paper is rigorous. Moreover, in contrast to previous works that considered only memoryless sources, we consider a more general model that allows a certain structured dependency among the various components of the source.
UR - http://www.scopus.com/inward/record.url?scp=84906568548&partnerID=8YFLogxK
U2 - 10.1109/ISIT.2014.6875311
DO - 10.1109/ISIT.2014.6875311
M3 - Conference contribution
AN - SCOPUS:84906568548
SN - 9781479951864
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 2634
EP - 2638
BT - 2014 IEEE International Symposium on Information Theory, ISIT 2014
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 IEEE International Symposium on Information Theory, ISIT 2014
Y2 - 29 June 2014 through 4 July 2014
ER -