TY - CHAP
T1 - Asymptotically optimal blind separation of parametric Gaussian sources
AU - Doron, Eran
AU - Yeredor, Arie
PY - 2004
Y1 - 2004
N2 - The second-order blind identification (SOBI) algorithm (Belouchrani et al., 1997) is a classical blind source separation (BSS) algorithm for stationary sources. The weights-adjusted SOBI (WASOBI) algorithm (Yeredor, 2000) proposed a reformulation of the SOBI algorithm as a weighted nonlinear least squares problem, and showed how to obtain asymptotically optimal weights under the assumption of Gaussian moving average (MA) sources. In this paper, we extend the framework by showing how to obtain the (asymptotically) optimal weight matrix also for the cases of auto-regressive (AR) or ARMA Gaussian sources (of unknown parameters), bypassing the apparent need to estimate infinitely many correlation matrices. Comparisons with other algorithms, with the Cramér-Rao bound and with the analytically predicted performance are presented using simulations. In particular, we show that the optimal performance can be attained with fewer estimated correlation matrices than in the Gaussian mutual information approach (which is also optimal in this context).
UR - http://www.scopus.com/inward/record.url?scp=35048862993&partnerID=8YFLogxK
U2 - 10.1007/978-3-540-30110-3_50
DO - 10.1007/978-3-540-30110-3_50
M3 - Chapter
AN - SCOPUS:35048862993
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 390
EP - 397
BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
A2 - Puntonet, Carlos G.
A2 - Prieto, Alberto
PB - Springer Verlag
ER -