TY - JOUR
T1 - On the Effective Measure of Dimension in the Analysis Cosparse Model
AU - Giryes, Raja
AU - Plan, Yaniv
AU - Vershynin, Roman
N1 - Publisher Copyright:
© 1963-2012 IEEE.
PY - 2015/10/1
Y1 - 2015/10/1
N2 - Many applications have benefited remarkably from low-dimensional models in the last decade. The fact that many signals, though high dimensional, are intrinsically low dimensional has made it possible to recover them stably from a relatively small number of measurements. For example, in compressed sensing with the standard (synthesis) sparsity prior and in matrix completion, the number of measurements needed is proportional (up to a logarithmic factor) to the signal's manifold dimension. Recently, a new natural low-dimensional signal model has been proposed: the cosparse analysis prior. In the noiseless case, signals from this model can be recovered, using a combinatorial search, from a number of measurements proportional to the signal's manifold dimension. However, if we ask for stability to noise or an efficient (polynomial-complexity) solver, all existing results demand a number of measurements far removed from the manifold dimension, sometimes far greater. Thus, it is natural to ask whether this gap reflects a deficiency of the theory and the solvers, or a real barrier to recovering cosparse signals by relying only on their manifold dimension. Is there an algorithm which, in the presence of noise, can accurately recover a cosparse signal from a number of measurements proportional to the manifold dimension? In this paper, we prove that there is no such algorithm. Furthermore, we show through numerical simulations that even in the noiseless case convex relaxations fail when the number of measurements is comparable with the manifold dimension. This gives a practical counterexample to the growing literature on the compressed acquisition of signals based on manifold dimension.
AB - Many applications have benefited remarkably from low-dimensional models in the last decade. The fact that many signals, though high dimensional, are intrinsically low dimensional has made it possible to recover them stably from a relatively small number of measurements. For example, in compressed sensing with the standard (synthesis) sparsity prior and in matrix completion, the number of measurements needed is proportional (up to a logarithmic factor) to the signal's manifold dimension. Recently, a new natural low-dimensional signal model has been proposed: the cosparse analysis prior. In the noiseless case, signals from this model can be recovered, using a combinatorial search, from a number of measurements proportional to the signal's manifold dimension. However, if we ask for stability to noise or an efficient (polynomial-complexity) solver, all existing results demand a number of measurements far removed from the manifold dimension, sometimes far greater. Thus, it is natural to ask whether this gap reflects a deficiency of the theory and the solvers, or a real barrier to recovering cosparse signals by relying only on their manifold dimension. Is there an algorithm which, in the presence of noise, can accurately recover a cosparse signal from a number of measurements proportional to the manifold dimension? In this paper, we prove that there is no such algorithm. Furthermore, we show through numerical simulations that even in the noiseless case convex relaxations fail when the number of measurements is comparable with the manifold dimension. This gives a practical counterexample to the growing literature on the compressed acquisition of signals based on manifold dimension.
KW - Compressed sensing
KW - manifold dimension
KW - sparse representations
KW - the analysis model
KW - total variation
UR - http://www.scopus.com/inward/record.url?scp=84948718983&partnerID=8YFLogxK
U2 - 10.1109/TIT.2015.2466597
DO - 10.1109/TIT.2015.2466597
M3 - Article
AN - SCOPUS:84948718983
SN - 0018-9448
VL - 61
SP - 5745
EP - 5753
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 10
M1 - 7185448
ER -