Triple consistency loss for pairing distributions in GAN-based face synthesis
Generative Adversarial Networks have shown impressive results for the task of object translation, including face-to-face translation. A key component behind the success of recent approaches is the self-consistency loss, which encourages a network to recover the original input image when the output generated for a desired attribute is itself passed back through the same network, but with the target attribute inverted. While the self-consistency loss yields photo-realistic results, it can be shown that the input and target domains, which are supposed to be close, actually differ substantially. This is found empirically by observing that a network recovers the input image even when attributes other than the inversion of the original goal are set as the target. This prevents networks trained for different tasks from being combined, and prevents a network from being applied through progressive forward passes. In this paper, we show empirical evidence of this effect, and propose a new loss to bridge the gap between the distributions of the input and target domains. This "triple consistency loss" aims to minimise the distance between the outputs generated by the network for different routes to the target, independent of any intermediate steps. To show its effectiveness, we incorporate the triple consistency loss into the training of a new landmark-guided face-to-face synthesis network in which, contrary to previous works, the generated images can simultaneously undergo a large transformation in both expression and pose. To the best of our knowledge, we are the first to tackle the problem of mismatching distributions in self-domain synthesis, and to propose "in-the-wild" landmark-guided synthesis. Code will be available at https://github.com/ESanchezLozano/GANnotation
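The triple consistency idea in the abstract can be illustrated with a short sketch. The snippet below is a minimal PyTorch illustration under stated assumptions, not the authors' released implementation: it assumes a landmark-conditioned generator called as generator(image, target_heatmaps), and the tensor names x, h_mid and h_tgt are hypothetical. It compares the direct route to the target with an indirect route that passes through an intermediate pose/expression, as described in the abstract.

    import torch.nn.functional as F

    def triple_consistency_loss(generator, x, h_mid, h_tgt):
        # Direct route: synthesise the target pose/expression in a single pass.
        direct = generator(x, h_tgt)
        # Indirect route: first generate an intermediate pose/expression,
        # then re-synthesise the same target from that intermediate output.
        intermediate = generator(x, h_mid)
        indirect = generator(intermediate, h_tgt)
        # Penalise the gap between the two routes so that generated images
        # remain usable as inputs for further forward passes.
        return F.l1_loss(direct, indirect)

In training, this term would be added to the usual adversarial and self-consistency objectives. The choice of an L1 distance here is an assumption; the abstract only specifies minimising the distance between the outputs produced by the different routes to the target.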
Authors


Enrique Sanchez
Michel Valstar

Github
Description: face to face synthesis using GANs
Stargazers: 4
Forks: 1
Open Issues: 0
Network: 1
Subscribers: 3
Language: None
Other
Sample Sizes (N=): None
Inserted: 11/08/18 06:02PM
Words Total: 6,700
Words Unique: 1,850
Source:
Abstract:
Tweets
DimitrisMallis: RT @arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1…
ESanchezLozano: RT @arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1…
Rosenchild: RT @arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1…
Zahid_Akhtar: RT @arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1…
shubh_300595: RT @arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1…
ESanchezLozano: https://t.co/fwGIJYKEMM #GAN #face #Pytorch A landmark-guided face-to-face synthesis network (with a triple consistency loss) https://t.co/CClahlCcez (training code also coming soon) https://t.co/T2l1UW5o7Q
syoyo: RT @arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1…
arxiv_org: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/IEcP7Oh6x1 https://t.co/KTT45JP1js
arxiv_cscv: Triple consistency loss for pairing distributions in GAN-based face synthesis https://t.co/oqqnpOkgFe
ComputerPapers: Triple consistency loss for pairing distributions in GAN-based face synthesis. https://t.co/ia4lpLtsNH
udoooom: RT @arxiv_cscv: Triple consistency loss for pairing distributions in GAN-based face synthesis https://t.co/oqqnpOBRwM
arxiv_cscv: Triple consistency loss for pairing distributions in GAN-based face synthesis https://t.co/oqqnpOBRwM
arxivml: "Triple consistency loss for pairing distributions in GAN-based face synthesis", Enrique Sanchez, Michel Valstar https://t.co/yiGElAP72U
_asjackson: https://t.co/rkapBQuyjh
BrundageBot: Triple consistency loss for pairing distributions in GAN-based face synthesis. Enrique Sanchez and Michel Valstar https://t.co/kf1WTFFKrn