A Tutorial on Bayesian Optimization
Bayesian optimization is an approach to optimizing objective functions that take a long time (minutes or hours) to evaluate. It is best-suited for optimization over continuous domains of less than 20 dimensions, and tolerates stochastic noise in function evaluations. It builds a surrogate for the objective and quantifies the uncertainty in that surrogate using a Bayesian machine learning technique, Gaussian process regression, and then uses an acquisition function defined from this surrogate to decide where to sample. In this tutorial, we describe how Bayesian optimization works, including Gaussian process regression and three common acquisition functions: expected improvement, entropy search, and knowledge gradient. We then discuss more advanced techniques, including running multiple function evaluations in parallel, multi-fidelity and multi-information source optimization, expensive-to-evaluate constraints, random environmental conditions, multi-task Bayesian optimization, and the inclusion of derivative information. We conclude with a discussion of Bayesian optimization software and future research directions in the field. Within our tutorial material we provide a generalization of expected improvement to noisy evaluations, beyond the noise-free setting where it is more commonly applied. This generalization is justified by a formal decision-theoretic argument, standing in contrast to previous ad hoc modifications.
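To make the loop described in the abstract concrete, below is a minimal, self-contained Python sketch (not from the paper): it fits a Gaussian process surrogate to past evaluations, computes the expected-improvement acquisition function on a grid, and evaluates the objective where expected improvement is largest. It assumes a one-dimensional maximization problem with effectively noise-free evaluations and a squared-exponential kernel; the names `objective`, `gp_posterior`, and `expected_improvement` are illustrative, and the paper's decision-theoretic generalization of expected improvement to noisy evaluations is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def kernel(a, b, length=0.2, signal=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    sq = (a[:, None] - b[None, :]) ** 2
    return signal**2 * np.exp(-0.5 * sq / length**2)

def gp_posterior(x_train, y_train, x_test, jitter=1e-6):
    """Gaussian process regression: posterior mean and std. dev. at x_test."""
    K = kernel(x_train, x_train) + jitter * np.eye(len(x_train))
    Ks = kernel(x_train, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = kernel(x_test, x_test).diagonal() - np.sum(v**2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """Noise-free EI for maximization: E[max(f(x) - best, 0)] under the posterior."""
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

def objective(x):
    # Stand-in for the expensive black-box function being optimized.
    return -np.sin(3 * x) - x**2 + 0.7 * x

rng = np.random.default_rng(0)
x_train = rng.uniform(-1, 2, size=3)   # small initial design
y_train = objective(x_train)
grid = np.linspace(-1, 2, 500)

for _ in range(10):                    # sequential evaluation budget
    mu, sigma = gp_posterior(x_train, y_train, grid)
    ei = expected_improvement(mu, sigma, y_train.max())
    x_next = grid[np.argmax(ei)]       # sample where the acquisition is largest
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, objective(x_next))

print("best x:", x_train[y_train.argmax()], "best f:", y_train.max())
```

In practice one would optimize the acquisition function with a continuous optimizer and learn the kernel hyperparameters from data, typically via a library such as BoTorch or GPyOpt; the grid search here only serves to keep the surrogate-plus-acquisition loop visible.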
Author

Peter I. Frazier
Tweets
jarotter: Check out this gem of a package for Bayesian optimization https://t.co/aDXV0q6cg6 which comes with a great tutorial https://t.co/RlHmq4h9WC
GoAbiAryan: An excellent comprehensive introduction to Bayesian Optimization https://t.co/CWyiI0YytT
ak_eapen: A Tutorial on Bayesian Optimization https://t.co/6kXeN2bGGK
YadKonrad: Neat tutorial on Bayesian Optimization "A Tutorial on Bayesian Optimization" https://t.co/DgCEWh7WjZ https://t.co/nkZEOEB9zQ
remykarem: A Tutorial on Bayesian Optimization by Frazier. https://t.co/j8foAbxFIq https://t.co/7k4Y9f2rYs
conormacd: [1807.02811] A Tutorial on Bayesian Optimization ☝️ https://t.co/gIAxQp0VCv
AcerbiLuigi: @cian_neuro @neurograce Bayesian optimization is *exactly* what you want! Spearmint implements the famous NeurIPS paper from 2012 (+ some other tweaks) which brought BO to the forefront in ML, and might be a good starting point. For a recent review, see here: https://t.co/1WH2DphJjK
Memoirs: A Tutorial on Bayesian Optimization. https://t.co/xEQzF8Ov23