Transflow Learning: Repurposing Flow Models Without Retraining
Deep generative models are known to have a rich latent space, and their outputs can be manipulated smoothly by traversing it. Recently, architectures have emerged that allow more complex manipulations, such as making an image appear as though it belongs to a different class or is painted in a particular style. These methods typically require substantial training to learn even a single class of manipulations. We present Transflow Learning, a method for transforming a pre-trained generative flow model so that its outputs more closely resemble data provided after training. In contrast to previous methods, Transflow Learning requires no training at all: it instead warps the probability distribution from which latent vectors are sampled, using Bayesian inference. Transflow Learning can be used to solve a wide variety of tasks, such as neural style transfer and few-shot classification.
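The warping the abstract describes can be illustrated with a conjugate Gaussian update: a flow model's latent prior is N(0, I), and latents of the newly provided data can be treated as noisy observations of an unknown latent mean. The sketch below is a minimal numpy illustration of that idea, not the paper's implementation; the likelihood variance `sigma2` is a hand-picked assumption, and the pre-trained flow (which would map data to latents and decode samples back) is replaced by toy stand-in latents.

```python
import numpy as np

def posterior_latent_distribution(z_obs, sigma2=0.3):
    """Conjugate Gaussian update: N(0, I) prior over the latent mean,
    observed latents modelled as N(mu, sigma2 * I).
    Returns the posterior mean and (scalar) posterior variance."""
    n, d = z_obs.shape
    prior_prec = 1.0            # precision of the N(0, I) prior
    lik_prec = n / sigma2       # total precision contributed by the n observations
    post_var = 1.0 / (prior_prec + lik_prec)
    post_mean = post_var * lik_prec * z_obs.mean(axis=0)
    return post_mean, post_var

# Toy stand-in for latents of new data under a pre-trained flow:
rng = np.random.default_rng(0)
z_obs = rng.normal(loc=2.0, scale=0.5, size=(10, 4))

mu, var = posterior_latent_distribution(z_obs)
# Sample "warped" latents; a real pipeline would decode these
# through the flow's inverse to get data-space samples.
z_new = mu + np.sqrt(var) * rng.normal(size=(5, 4))
```

With more observations the posterior concentrates near the data's mean latent; with few observations it shrinks toward the original prior, so samples still resemble the flow's training distribution.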
Authors
Andrew Gambardella
Atılım Güneş Baydin
Philip H. S. Torr

Other
Inserted: 12/01/19 06:02PM
Words Total: 5,951
Words Unique: 1,830
Tweets
arxiv_cscv: Transflow Learning: Repurposing Flow Models Without Retraining https://t.co/wdNdIKZhQ3
gambsgambs: @serrjoa @shakir_za @DeepSpiker @gpapamak @eric_nalisnick @balajiln WRT the second paper: really cool! I also noticed the complexity bias effect in https://t.co/fw0KCviq5S where I was able to condition a flow model on images with monochromatic occlusions, but not noisy occlusions (figure 6)
tmasada: [1911.13270] Transflow Learning: Repurposing Flow Models Without Retraining https://t.co/h6vk1C0qKs