TY - JOUR
T1 - Greedy-like algorithms for the cosparse analysis model
AU - Giryes, R.
AU - Nam, S.
AU - Elad, M.
AU - Gribonval, R.
AU - Davies, M. E.
N1 - Funding Information:
The authors would like to thank Jalal Fadili for fruitful discussions, and the anonymous reviewers for the important remarks that helped to improve the paper. Without both of them, the examples of the optimal projections would not have appeared in the paper. This research was supported by (i) the New York Metropolitan Research Fund; (ii) the European Community’s FP7-ERC program, Grant Agreement No. 320649; (iii) the EU FP7, SMALL project, FET-Open Grant No. 225913; and (iv) the European Research Council, PLEASE project (ERC-StG-2011-277906). R. Giryes is grateful to the Azrieli Foundation for the award of an Azrieli Fellowship.
PY - 2014/1/15
Y1 - 2014/1/15
AB - The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem: the need to find a signal belonging to this model, given a set of corrupted measurements of it. Several pursuit methods have already been proposed, based on ℓ1 relaxation and on a greedy approach. In this work we pursue this question further and propose a new family of pursuit algorithms for the cosparse analysis model, mimicking the greedy-like methods: compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT) and hard thresholding pursuit (HTP). Assuming the availability of a near-optimal projection scheme that finds the nearest cosparse subspace to any vector, we provide performance guarantees for these algorithms. Our theoretical study relies on a restricted isometry property adapted to the context of the cosparse analysis model. We explore empirically the performance of these algorithms by adopting a plain thresholding projection, and demonstrate their good performance.
KW - Analysis
KW - CoSaMP
KW - Compressed sensing
KW - Hard thresholding pursuit
KW - Iterative hard thresholding
KW - Sparse representations
KW - Subspace-pursuit
KW - Synthesis
UR - http://www.scopus.com/inward/record.url?scp=84889886185&partnerID=8YFLogxK
U2 - 10.1016/j.laa.2013.03.004
DO - 10.1016/j.laa.2013.03.004
M3 - Article
AN - SCOPUS:84889886185
SN - 0024-3795
VL - 441
SP - 22
EP - 60
JO - Linear Algebra and Its Applications
JF - Linear Algebra and Its Applications
ER -
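
The abstract above outlines greedy-like analysis pursuit: each iteration takes a gradient step on the data-fidelity term and then projects onto the nearest cosparse subspace, with a plain thresholding projection used in the experiments. Below is a minimal sketch of analysis iterative hard thresholding along those lines; it is not the authors' reference implementation, and the function name analysis_iht, the fixed step size, the iteration count, and the pseudoinverse-based subspace projection are illustrative assumptions.

import numpy as np

def analysis_iht(y, M, Omega, ell, step=1.0, n_iters=50):
    # Sketch of analysis IHT: recover x from y = M x + e, assuming the analysis
    # coefficients Omega x contain at least `ell` zeros (cosparsity ell).
    # A step size on the order of 1 / ||M||_2^2 is a common safe choice.
    x = np.zeros(M.shape[1])
    for _ in range(n_iters):
        # Gradient step on the data-fidelity term ||y - M x||^2.
        x_g = x + step * M.T @ (y - M @ x)
        # Plain thresholding projection: estimate the cosupport as the ell rows
        # of Omega with the smallest analysis coefficients |Omega x_g|.
        cosupport = np.argsort(np.abs(Omega @ x_g))[:ell]
        Omega_L = Omega[cosupport]
        # Orthogonal projection onto the cosparse subspace {x : Omega_L x = 0}.
        x = x_g - np.linalg.pinv(Omega_L) @ (Omega_L @ x_g)
    return x

Plain thresholding is, in general, only an approximation of the optimal cosparse projection; the paper's guarantees are stated for a near-optimal projection scheme, while the experiments adopt thresholding for its simplicity.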