MintNet: Building Invertible Neural Networks with Masked Convolutions
We propose a new way of constructing invertible neural networks by combining simple building blocks with a novel set of composition rules. This leads to a rich set of invertible architectures, including those similar to ResNets. Inversion is achieved with a locally convergent iterative procedure that is parallelizable and very fast in practice. Additionally, the determinant of the Jacobian can be computed analytically and efficiently, enabling their generative use as flow models. To demonstrate their flexibility, we show that our invertible neural networks are competitive with ResNets on MNIST and CIFAR-10 classification. When trained as generative models, our invertible networks achieve new state-of-the-art likelihoods on MNIST, CIFAR-10 and ImageNet 32x32, with bits per dimension of 0.98, 3.32 and 4.06 respectively.
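The abstract's three ingredients (masked convolutions whose Jacobian is triangular, an analytic log-determinant, and a parallel, locally convergent fixed-point inversion) can be illustrated with a small sketch. The NumPy toy below uses a lower-triangular ("masked") linear layer followed by a monotone nonlinearity; it is only a minimal illustration under assumed names, step size, and nonlinearity, not the paper's actual MintNet architecture or its exact inversion algorithm.

import numpy as np

# Toy sketch (not the paper's exact layer): a causally "masked" linear map,
# i.e. a lower-triangular weight matrix with positive diagonal, followed by a
# smooth strictly increasing nonlinearity. The layer's Jacobian is triangular,
# so log|det J| is a sum of log-diagonal terms, and the layer can be inverted
# with a simple parallel fixed-point iteration of the kind the abstract describes.

rng = np.random.default_rng(0)
D = 8

# Lower-triangular "masked" weights with strictly positive diagonal.
W = np.tril(rng.normal(scale=0.3, size=(D, D)))
np.fill_diagonal(W, np.abs(np.diag(W)) + 1.0)
b = rng.normal(size=D)

def act(z):            # smooth, strictly increasing nonlinearity
    return z + 0.5 * np.tanh(z)

def act_prime(z):      # its derivative, always in [1, 1.5] > 0
    return 1.0 + 0.5 * (1.0 - np.tanh(z) ** 2)

def f(x):              # the invertible layer: y = act(W x + b)
    return act(W @ x + b)

def log_abs_det_jacobian(x):
    # J = diag(act'(W x + b)) @ W is lower triangular, so its determinant is
    # the product of the diagonal entries act'(z_i) * W_ii.
    z = W @ x + b
    return np.sum(np.log(act_prime(z) * np.diag(W)))

def invert(y, n_iter=100, alpha=1.0):
    # Locally convergent fixed-point iteration: precondition the residual with
    # the inverse of the Jacobian's diagonal; all coordinates are updated in
    # parallel, with no sequential back-substitution.
    x = y.copy()
    for _ in range(n_iter):
        z = W @ x + b
        diag_J = act_prime(z) * np.diag(W)
        x = x - alpha * (f(x) - y) / diag_J
    return x

x = rng.normal(size=D)
y = f(x)
x_rec = invert(y)
print("reconstruction error:", np.max(np.abs(x - x_rec)))
print("log|det J|:", log_abs_det_jacobian(x))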

Authors

Yang Song
Chenlin Meng
Stefano Ermon

Other
Inserted: 07/18/19 06:03PM
Words Total: 8,701
Words Unique: 2,108
Tweets
tru1mprssn: Approx ~: Building an Invertible Neural Network with Masked Convolutions. Connecting the Dots. https://t.co/jJoCu1PENH Making something get predicted for Future Output, in a much more Complex non-linear way One Slip in the Wires, you will Not Know which Direction to Go https://t.co/dBc9JM7RLW
arxiv_pop: #1 submission of 2019/07/18 in LG (Machine Learning): MintNet: Building Invertible Neural Networks with Masked Convolutions https://t.co/92dlIe3j7y 18 Tweets 29 Retweets 166 Favorites
hillbig: MintNet is a new invertible neural network using (causal) masked convolution, and can estimate the exact likelihood efficiently. With a new fixed-point iteration method, it can compute the inverse. New SOTA on density estimation tasks https://t.co/eHC1yFXids
hillbig: Proposes MintNet, an invertible NN built from (causal) masked convolutions whose Jacobian is a triangular matrix, so its determinant can be computed quickly. The inverse transformation is realized with a new fixed-point iteration method. Exact density estimation is possible while using powerful transformations. SOTA on density estimation. https://t.co/eHC1yFXids
re_mahmoudi: MintNet: Building Invertible Neural Networks with Masked Convolutions. https://t.co/XluRe4LWYa #DeepLearning #ai #MachineLearning #ArtificialIntelligence https://t.co/NBLnc4KuKt
rezamahmooudi: I strongly recommend reading this paper: MintNet: Building Invertible Neural Networks with Masked Convolutions. https://t.co/HV9fpMrVCr #DeepLearning #ai #MachineLearning #ArtificailIntelligence https://t.co/xmVtyv3G6n
arxiv_org: MintNet: Building Invertible Neural Networks with Masked Convolutions. https://t.co/AKhWLVtTV2 https://t.co/3ah2rxCJ3f
arxiv_cscv: MintNet: Building Invertible Neural Networks with Masked Convolutions https://t.co/Hq1xfO49Xb
markomanka: MintNet: Building invertible neural networks with masked convolutions https://t.co/lGEgWfXvFJ https://t.co/k7hja5zE6a
StatsPapers: MintNet: Building Invertible Neural Networks with Masked Convolutions. https://t.co/hn3pxQzLfm
YSongStanford: Joint work with Chenlin Meng and @ermonste. Paper link: https://t.co/TFZIUMwHfg
arxivml: "MintNet: Building Invertible Neural Networks with Masked Convolutions", Yang Song, Chenlin Meng, Stefano Ermon https://t.co/9amPEfb0g9
arxiv_cs_LG: MintNet: Building Invertible Neural Networks with Masked Convolutions. Yang Song, Chenlin Meng, and Stefano Ermon https://t.co/2cNrhyijsc
BrundageBot: MintNet: Building Invertible Neural Networks with Masked Convolutions. Yang Song, Chenlin Meng, and Stefano Ermon https://t.co/alTMuqf8i6