Generating the support with extreme value losses
When a regression model is optimized against the mean of a loss over a distribution of predictions, the optimal prediction distribution is a delta function at a single value, even when the targets themselves are distributed. Generative modeling methods must overcome this tendency. We consider a simple alternative summary of the prediction error whose optimum is a prediction distribution whose support matches the support of the target distribution: optimizing against the minimum of the loss over a set of samples from the prediction distribution, rather than the mean. We show that models trained against this loss learn to capture the support of the target distribution and, when combined with an auxiliary classifier-like prediction task, can be projected via rejection sampling to reproduce the full distribution of targets. The resulting method compares well with other generative modeling approaches, particularly in low-dimensional spaces with highly non-trivial distributions, because mode-collapse solutions are globally suboptimal with respect to the extreme value loss. However, the method is less suited to high-dimensional spaces such as images, because the number of samples needed to accurately estimate the extreme value loss grows rapidly as the dimension of the data manifold becomes large.
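The core idea in the abstract, scoring a set of K prediction samples by the minimum rather than the mean of their losses, can be sketched in a few lines. This is an illustrative reconstruction from the abstract only, not the authors' implementation; the function name, shapes, and squared-error choice are assumptions.

```python
import numpy as np

def extreme_value_loss(pred_samples, target):
    """Minimum squared error over K prediction samples for one target draw.

    pred_samples: (K, D) array of samples from the model's prediction distribution.
    target:       (D,) array, a single draw from the target distribution.

    Taking the min (rather than the mean) over the K samples rewards a model
    that spreads its predictions to cover the target support, whereas a delta
    function at the mean of a multimodal target pays the distance to the
    nearest mode on every draw.
    """
    per_sample = np.sum((pred_samples - target) ** 2, axis=1)  # (K,) losses
    return per_sample.min()

# Bimodal targets at -1 and +1: a mean-predicting model collapses to 0,
# while a support-covering model outputs samples near both modes.
target = np.array([1.0])
collapsed = np.zeros((2, 1))            # delta function at the target mean
covering = np.array([[-1.0], [1.0]])    # one sample per mode
# The covering predictor achieves zero extreme value loss on either mode;
# the collapsed predictor pays the squared distance to the nearest mode.
```

Under this loss a mode-collapsed predictor is strictly worse whenever the target distribution is multimodal, which is the property the abstract highlights as making mode collapse globally suboptimal.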
Author

Nicholas Guttenberg
Github

Language: Jupyter Notebook
Code implementing generative models using 'extreme value loss'.
Other

Sample Sizes (N=): None
Inserted: 02/10/19 06:05PM
Words Total: 8,589
Words Unique: 2,375
Tweets

ngutten: The extreme value loss thing I've been discussing incessantly for the last four months: https://t.co/jOPs6qKayu Easy-to-train generative models for low dimensional, multimodal problems.