Joint block diagonalization algorithms for optimal separation of multidimensional components

Dana Lahat*, Jean-François Cardoso, Hagit Messer

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper deals with non-orthogonal joint block diagonalization. Two algorithms which minimize the Kullback-Leibler divergence between a set of real positive-definite matrices and a block-diagonal transformation thereof are suggested. One algorithm is based on the relative gradient, and the other is based on a quasi-Newton method. These algorithms allow for the optimal, in the mean square error sense, blind separation of multidimensional Gaussian components. Simulations demonstrate the convergence properties of the suggested algorithms, as well as the dependence of the criterion on some of the model parameters.
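To make the abstract's description concrete, here is a minimal NumPy sketch of a relative-gradient iteration for non-orthogonal joint block diagonalization of positive-definite matrices under a KL-divergence fit. This is an illustration only, not the authors' implementation: the function names, the fixed step size `mu`, and the simple iteration count are assumptions, and the quasi-Newton variant mentioned in the abstract is not shown.

```python
import numpy as np

def bdiag(M, sizes):
    """Keep only the diagonal blocks of M whose sizes are given by `sizes`."""
    out = np.zeros_like(M)
    i = 0
    for s in sizes:
        out[i:i + s, i:i + s] = M[i:i + s, i:i + s]
        i += s
    return out

def kl_criterion(W, C, sizes):
    """KL-based misfit: sum_k [log det bdiag(W C_k W') - log det (W C_k W')].
    It is nonnegative and vanishes exactly when every W C_k W' is block diagonal."""
    val = 0.0
    for Ck in C:
        R = W @ Ck @ W.T
        val += np.linalg.slogdet(bdiag(R, sizes))[1] - np.linalg.slogdet(R)[1]
    return val

def jbd_relative_gradient(C, sizes, mu=0.1, n_iter=2000):
    """Descend the criterion with multiplicative updates W <- (I - mu*G) W,
    where G is the relative gradient averaged over the matrix set."""
    n = C[0].shape[0]
    W = np.eye(n)
    for _ in range(n_iter):
        G = np.zeros((n, n))
        for Ck in C:
            R = W @ Ck @ W.T
            # Relative gradient of the criterion at W (up to a constant factor):
            # bdiag(R)^{-1} R - I, averaged over the K matrices.
            G += np.linalg.inv(bdiag(R, sizes)) @ R - np.eye(n)
        G /= len(C)
        W = (np.eye(n) - mu * G) @ W
    return W
```

A typical use is to synthesize matrices `C_k = A D_k A'` with block-diagonal `D_k` and a mixing matrix `A`, then check that the returned `W` drives the criterion toward zero, i.e., that each `W C_k W'` is (nearly) block diagonal.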

Original language: English
Title of host publication: Latent Variable Analysis and Signal Separation - 10th International Conference, LVA/ICA 2012, Proceedings
Pages: 155-162
Number of pages: 8
DOIs
State: Published - 2012
Event: 10th International Conference on Latent Variable Analysis and Signal Separation, LVA/ICA 2012 - Tel Aviv, Israel
Duration: 12 Mar 2012 - 15 Mar 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7191 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 10th International Conference on Latent Variable Analysis and Signal Separation, LVA/ICA 2012
Country/Territory: Israel
City: Tel Aviv
Period: 12/03/12 - 15/03/12

Keywords

  • Joint block diagonalization
  • quasi-Newton
  • relative gradient
