UNSUPERVISED DISENTANGLEMENT WITH TENSOR PRODUCT REPRESENTATIONS ON THE TORUS

Michael Rotman, Amit Dekel, Shir Gur, Yaron Oz, Lior Wolf

Research output: Contribution to conference › Paper › peer-review


Abstract

Current methods for learning representations with auto-encoders almost exclusively employ vectors as the latent representations. In this work, we propose to employ a tensor product structure for this purpose. This way, the obtained representations are naturally disentangled. In contrast to the conventional variational methods, which are targeted toward normally distributed features, the latent space in our representation is distributed uniformly over a set of unit circles. We argue that the torus structure of the latent space captures the generative factors effectively. We employ recent tools for measuring unsupervised disentanglement, and in an extensive set of experiments demonstrate the advantage of our method in terms of disentanglement, completeness, and informativeness. The code for our proposed method is available at https://github.com/rotmanmi/Unsupervised-Disentanglement-Torus.
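The abstract's core idea — a latent space that is a product of unit circles (a torus), combined via a tensor product — can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the authors' architecture: the function names (`torus_embedding`, `tensor_product_representation`), the choice of four circles, and the way raw encoder outputs are wrapped to angles are all hypothetical; the actual model is in the linked repository.

```python
import numpy as np

def torus_embedding(raw, ):
    """Map unconstrained encoder outputs to points on unit circles.

    raw: array of shape (batch, n_circles). Each coordinate is wrapped to
    an angle, so the latent lives on an n_circles-dimensional torus (S^1)^n
    with each factor uniform when the wrapped angles are uniform.
    """
    theta = 2.0 * np.pi * (raw % 1.0)                       # angles in [0, 2*pi)
    return np.stack([np.cos(theta), np.sin(theta)], axis=-1)  # (batch, n_circles, 2)

def tensor_product_representation(circles):
    """Combine per-circle 2-vectors into one tensor product representation.

    z = z_1 (x) z_2 (x) ... (x) z_n, flattened to a vector of length 2**n.
    Because each factor is a unit vector, the product also has unit norm.
    """
    batch, n_circles, _ = circles.shape
    z = circles[:, 0, :]
    for i in range(1, n_circles):
        z = np.einsum('bi,bj->bij', z, circles[:, i, :]).reshape(batch, -1)
    return z

rng = np.random.default_rng(0)
raw = rng.normal(size=(8, 4))          # 8 samples, 4 circle factors
z = tensor_product_representation(torus_embedding(raw))
print(z.shape)                          # (8, 16): 2**4 components per sample
```

Each circle factor can then track one generative factor independently, which is the sense in which the tensor product structure encourages disentanglement.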

Original language: English
State: Published - 2022
Event: 10th International Conference on Learning Representations, ICLR 2022 - Virtual, Online
Duration: 25 Apr 2022 → 29 Apr 2022

Conference

Conference: 10th International Conference on Learning Representations, ICLR 2022
City: Virtual, Online
Period: 25/04/22 → 29/04/22

Funding

Funders | Funder number
Israeli Science Foundation center of excellence |
European Commission |
Horizon 2020 | ERC CoG 725974
