Publication | Open Access
Unpaired Image-to-Image Translation Using Cycle-Consistent Adversarial Networks
Citations: 21.3K
References: 47
Year: 2017
Venue: ICCV 2017
Topics: Engineering, Machine Learning, Image-to-image Translation, Style Transfer, Image Analysis, Data Science, Pattern Recognition, Image-based Modeling, Generative Model, Computational Imaging, Robot Learning, Machine Translation, Synthetic Image Generation, Machine Vision, Computer Science, Deep Learning, Computer Vision, Output Image, Generative Adversarial Network, Cycle Consistency Loss
TL;DR: Image-to-image translation typically relies on aligned image pairs, but many real-world tasks lack such paired training data. This work proposes a method to translate images from a source domain X to a target domain Y without requiring paired examples. The approach learns a mapping G using an adversarial loss and enforces cycle consistency by coupling it with an inverse mapping F, so that F(G(X)) ≈ X and G(F(Y)) ≈ Y. Qualitative experiments on style transfer, object transfiguration, season transfer, and photo enhancement, along with quantitative comparisons, demonstrate that the method outperforms previous approaches.
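A compact formalization of the objective sketched above, following the paper's notation (D_X and D_Y are the domain discriminators, and λ weights the cycle term; the paper sets λ = 10):

```latex
\mathcal{L}_{\text{GAN}}(G, D_Y, X, Y) =
    \mathbb{E}_{y \sim p_{\text{data}}(y)}[\log D_Y(y)]
  + \mathbb{E}_{x \sim p_{\text{data}}(x)}[\log (1 - D_Y(G(x)))]

\mathcal{L}_{\text{cyc}}(G, F) =
    \mathbb{E}_{x \sim p_{\text{data}}(x)}\,\lVert F(G(x)) - x \rVert_1
  + \mathbb{E}_{y \sim p_{\text{data}}(y)}\,\lVert G(F(y)) - y \rVert_1

\mathcal{L}(G, F, D_X, D_Y) =
    \mathcal{L}_{\text{GAN}}(G, D_Y, X, Y)
  + \mathcal{L}_{\text{GAN}}(F, D_X, Y, X)
  + \lambda \, \mathcal{L}_{\text{cyc}}(G, F)
```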
Abstract: Image-to-image translation is a class of vision and graphics problems where the goal is to learn the mapping between an input image and an output image using a training set of aligned image pairs. However, for many tasks, paired training data will not be available. We present an approach for learning to translate an image from a source domain X to a target domain Y in the absence of paired examples. Our goal is to learn a mapping G : X → Y such that the distribution of images from G(X) is indistinguishable from the distribution Y using an adversarial loss. Because this mapping is highly under-constrained, we couple it with an inverse mapping F : Y → X and introduce a cycle consistency loss to push F(G(X)) ≈ X (and vice versa). Qualitative results are presented on several tasks where paired training data does not exist, including collection style transfer, object transfiguration, season transfer, photo enhancement, etc. Quantitative comparisons against several prior methods demonstrate the superiority of our approach.
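As a concrete illustration of how the adversarial and cycle-consistency losses couple the two mappings, here is a minimal PyTorch-style sketch of a single generator update. The networks `G`, `F`, `D_X`, `D_Y` and the optimizer are assumed to be defined elsewhere; the least-squares adversarial loss and λ = 10 follow the paper's implementation choices, but everything else is a placeholder, not the authors' exact code.

```python
import torch
import torch.nn as nn

adv_loss = nn.MSELoss()  # least-squares GAN objective, as in the paper's implementation
cyc_loss = nn.L1Loss()   # L1 cycle-consistency loss
lam = 10.0               # cycle weight lambda (the paper uses 10)

def generator_step(G, F, D_X, D_Y, real_x, real_y, optimizer):
    """One update of the coupled generators G: X -> Y and F: Y -> X."""
    optimizer.zero_grad()

    fake_y = G(real_x)  # translate X -> Y
    fake_x = F(real_y)  # translate Y -> X

    # Adversarial terms: each generator tries to make its discriminator
    # output "real" (target 1) on translated images.
    pred_y = D_Y(fake_y)
    pred_x = D_X(fake_x)
    loss_adv = adv_loss(pred_y, torch.ones_like(pred_y)) \
             + adv_loss(pred_x, torch.ones_like(pred_x))

    # Cycle-consistency terms: F(G(x)) should reconstruct x, and vice versa.
    loss_cyc = cyc_loss(F(fake_y), real_x) + cyc_loss(G(fake_x), real_y)

    total = loss_adv + lam * loss_cyc
    total.backward()
    optimizer.step()
    return total.item()
```

The discriminators are trained in a separate step (in the paper, against a buffer of previously generated images); this sketch covers only the generator side.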