Regularized Evolution for Image Classifier Architecture Search
The effort devoted to hand-crafting image classifiers has motivated the use of architecture search to discover them automatically. Although evolutionary algorithms have been repeatedly applied to architecture search, the architectures thus discovered have remained inferior to human-crafted ones. Here we show for the first time that artificially-evolved architectures can match or surpass human-crafted and RL-designed image classifiers. In particular, our models---named AmoebaNets---achieved a state-of-the-art accuracy of 97.87% on CIFAR-10 and top-1 accuracy of 83.1% on ImageNet. Among mobile-size models, an AmoebaNet with only 5.1M parameters also achieved a state-of-the-art top-1 accuracy of 75.1% on ImageNet. We also compared this method against strong baselines. Finally, we performed platform-aware architecture search with evolution to find a model that trains quickly on Google Cloud TPUs. This method produced an AmoebaNet that won the Stanford DAWNBench competition for lowest ImageNet training cost.
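The method behind these results is regularized (aging) evolution; as one of the tweets below summarizes, the key change is a tournament-selection loop that always discards the oldest genotype rather than the worst one. The sketch below is an illustrative reconstruction under that description, not the authors' released code; random_architecture, mutate, and train_and_eval are hypothetical placeholders for the NASNet-style search space and the trainer.

import collections
import random

def regularized_evolution(cycles, population_size, sample_size,
                          random_architecture, mutate, train_and_eval):
    """Aging-evolution sketch: tournament selection in which the oldest
    individual (not the worst) is removed each cycle.

    random_architecture(), mutate(arch), and train_and_eval(arch) are
    placeholders for the search space and the training/evaluation step.
    """
    population = collections.deque()          # ordered oldest -> youngest
    history = []                              # every model ever evaluated

    # Seed the population with randomly generated architectures.
    while len(population) < population_size:
        arch = random_architecture()
        model = (train_and_eval(arch), arch)  # (validation accuracy, architecture)
        population.append(model)
        history.append(model)

    # Evolve: sample a tournament, mutate its best member, drop the oldest.
    while len(history) < cycles:
        sample = random.sample(list(population), sample_size)
        parent = max(sample, key=lambda m: m[0])       # highest accuracy wins
        child_arch = mutate(parent[1])
        child = (train_and_eval(child_arch), child_arch)
        population.append(child)
        history.append(child)
        population.popleft()                  # aging: remove the oldest, not the worst

    return max(history, key=lambda m: m[0])

Removing the oldest member instead of the lowest-accuracy one is the "regularization" of the title: an architecture stays in the population only as long as its descendants keep re-earning a place, which the paper argues reduces the influence of models that scored well through training luck.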
Authors

Esteban Real
Alok Aggarwal
Yanping Huang
Quoc V. Le

Other
Inserted: 06/25/18 06:32PM
Words Total: 9,483
Words Unique: 3,081
Tweets
kuritateppei: Evolutionary algorithms (EAs) have been applied to neural networks many times, but until now they have fallen short of human-designed image classifiers. This work is the first to exceed the performance of human-designed models. The key is modifying the tournament-selection algorithm to favor younger genotypes. The image shows the discovered architecture. https://t.co/YzrWsj4WWk https://t.co/QS39Z06VPA
janhjensen: Regularized Evolution for Image Classifier Architecture Search https://t.co/btADzLvpGb
DillonLaird: A fun series of “AutoML” papers to go through with corresponding code: AmoebaNet https://t.co/dpeqjgJT4J https://t.co/rLK2vLJav3 MnasNet https://t.co/LyPmbP6d1z https://t.co/Y3MNDY2INY NAS-FPN https://t.co/XALdtsjmiI Auto-DeepLab https://t.co/QmiTjG5boU https://t.co/DaaJPjFtek
ajlopez: Regularized Evolution for Image Classifier Architecture Search https://t.co/kFJqdsdfJ3
arxiv_cscv: Regularized Evolution for Image Classifier Architecture Search https://t.co/5Fawg6h1QT
johndburger: @ly_yng @Noahpinion It’s not an either-or situation. Plenty of work on evolutionary approaches to designing deep learning architectures, e.g. https://t.co/Yd9ZpMTzCM
imenurok: I think the models EncapNet should be compared against are AmoebaNet (https://t.co/nXQacnQbso) and Shake-Shake (https://t.co/UxZ7LD0wRM).
nishitian: MetaLearning AmoebaNet https://t.co/RYVLQTQadK
Swall0wTech: [1802.01548] Regularized Evolution for Image Classifier Architecture Search https://t.co/CrkvV1j3fL
imenurok: @_tkato_ As far as I know, there are three methods on arXiv whose accuracy beats that one. https://t.co/nXQacnQbso Evolutionary computation of neural architectures, AmoebaNet https://t.co/ldcyXOvkR4 Transfer learning from NASNet-A Large trained on ImageNet https://t.co/ErZR2IR5Py Reinforcement-learned image preprocessing + ShakeDrop (the current SoTA as far as I know, Err=1.48%)