CrossNet: Latent cross-consistency for unpaired image translation

Omry Sendik, Dani Lischinski, Daniel Cohen-Or

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recent GAN-based architectures have delivered impressive performance on the general task of image-to-image translation. In particular, it was shown that a wide variety of image translation operators may be learned from two image sets, containing images from two different domains, without establishing an explicit pairing between the images. This was made possible by introducing clever regularizers to overcome the under-constrained nature of the unpaired translation problem. In this work, we introduce a novel architecture for unpaired image translation and explore several new regularizers enabled by it. Specifically, our architecture comprises a pair of GANs, as well as a pair of translators between their respective latent spaces. These cross-translators enable us to impose several regularizing constraints on the learnt image translation operator, collectively referred to as latent cross-consistency. Our results show that our proposed architecture and latent cross-consistency constraints outperform the existing state-of-the-art on a variety of image translation tasks.
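To make the latent cross-consistency idea concrete, the following is a minimal PyTorch sketch, not the authors' implementation. The module shapes, the 1x1-convolution form of the cross-translators (t_ab, t_ba), and the L1 round-trip loss are all assumptions for illustration; the abstract does not specify the paper's exact architecture or loss terms.

```python
# Hypothetical sketch of latent cross-consistency for unpaired translation.
# All layer shapes, names, and the specific loss form are assumptions.
import torch
import torch.nn as nn

class EncoderDecoder(nn.Module):
    """Toy per-domain generator: encoder to a latent code, decoder back to an image."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(3, 64, 4, 2, 1), nn.ReLU(),
            nn.Conv2d(64, latent_dim, 4, 2, 1))
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(latent_dim, 64, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Tanh())

# One generator per domain, plus cross-translators between the two latent
# spaces (modeled here, as an assumption, by 1x1 convolutions).
g_a, g_b = EncoderDecoder(), EncoderDecoder()
t_ab = nn.Conv2d(256, 256, 1)   # latent of domain A -> latent of domain B
t_ba = nn.Conv2d(256, 256, 1)   # latent of domain B -> latent of domain A

def latent_cross_consistency(x_a, x_b):
    """One plausible cross-consistency regularizer: a latent code translated
    to the other domain's latent space and back should return to itself."""
    z_a, z_b = g_a.enc(x_a), g_b.enc(x_b)
    return ((t_ba(t_ab(z_a)) - z_a).abs().mean()
            + (t_ab(t_ba(z_b)) - z_b).abs().mean())

# Tiny smoke test on random "images" from each domain.
x_a, x_b = torch.randn(1, 3, 64, 64), torch.randn(1, 3, 64, 64)
print(latent_cross_consistency(x_a, x_b))
```

In this sketch, translating an image from domain A to domain B would amount to g_b.dec(t_ab(g_a.enc(x_a))), with the cross-consistency term regularizing the learned cross-translators alongside the usual adversarial losses.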

Original language: English
Title of host publication: Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3032-3040
Number of pages: 9
ISBN (Electronic): 9781728165530
DOIs
State: Published - Mar 2020
Event: 2020 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2020 - Snowmass Village, United States
Duration: 1 Mar 2020 – 5 Mar 2020

Publication series

Name: Proceedings - 2020 IEEE Winter Conference on Applications of Computer Vision, WACV 2020

Conference

Conference: 2020 IEEE/CVF Winter Conference on Applications of Computer Vision, WACV 2020
Country/Territory: United States
City: Snowmass Village
Period: 1/03/20 – 5/03/20
