Authors
José Jiménez, Josep Ginebra
Publication date
2017/10
Journal
Journal of Open Source Software
Volume
2
Issue
19
Pages
431
Description
Bayesian optimization has risen over the last few years as a very attractive method for optimizing expensive-to-evaluate, black-box, derivative-free, and possibly noisy functions (Shahriari et al. 2016). The framework uses surrogate models, such as Gaussian processes (Rasmussen and Williams 2004), which encode a prior belief over the space of possible objective functions and are used to approximate them. The procedure is inherently sequential: the function is first evaluated a few times, a surrogate model is fit to these observations, and the fitted surrogate then suggests the next point to evaluate according to a predefined acquisition function. Acquisition strategies typically aim to balance exploitation and exploration, that is, sampling where the surrogate's posterior mean is high (exploitation) versus where its posterior variance is high (exploration).
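To make the sequential loop concrete, the following is a minimal sketch, not taken from the paper: it uses scikit-learn's GaussianProcessRegressor as the surrogate and an upper-confidence-bound (UCB) acquisition rule. The objective f, the bounds, and the exploration weight of 2.0 are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Illustrative stand-in for an expensive black-box objective (assumed, not from the paper)
def f(x):
    return -np.sin(3 * x) - x**2 + 0.7 * x

rng = np.random.default_rng(0)
bounds = (-1.0, 2.0)

# Step 1: evaluate the function a few times
X = rng.uniform(*bounds, size=(3, 1))
y = f(X).ravel()

for _ in range(10):
    # Step 2: fit the GP surrogate to the observations collected so far
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  normalize_y=True).fit(X, y)

    # Step 3: maximize the acquisition function over a dense grid.
    # UCB = posterior mean + kappa * posterior std; kappa trades off
    # exploitation (high mean) against exploration (high variance).
    grid = np.linspace(*bounds, 1000).reshape(-1, 1)
    mu, sigma = gp.predict(grid, return_std=True)
    x_next = grid[np.argmax(mu + 2.0 * sigma)]

    # Step 4: evaluate the objective at the suggested point and repeat
    X = np.vstack([X, [x_next]])
    y = np.append(y, f(x_next))

print("best x:", X[np.argmax(y)], "best f(x):", y.max())
```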
These strategies have recently drawn the attention of machine-learning researchers away from simpler black-box optimization strategies, such as grid search or random search (Bergstra and Bengio 2012). Bayesian optimization is especially relevant in areas such as automatic hyperparameter optimization of machine-learning models (Snoek, Larochelle, and Adams 2012), A/B testing (Chapelle and Li 2011), and recommender systems (Vanchinathan et al. 2014), among others. Furthermore, the framework is entirely modular: there are many choices a user can make when designing the optimization procedure, such as the choice of surrogate model, covariance function, acquisition function behaviour, or hyperparameter treatment. A sketch of this modularity follows below.
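As a small illustration of that modularity, the acquisition rule in the sketch above can be swapped without touching the rest of the loop. The function below implements the standard expected-improvement formula (a textbook expression, not specific to this paper) as a drop-in replacement for the UCB line:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best, xi=0.01):
    # EI(x) = E[max(f(x) - y_best - xi, 0)] under the GP posterior,
    # where y_best is the best observed value and xi encourages exploration.
    sigma = np.maximum(sigma, 1e-9)  # guard against division by zero
    z = (mu - y_best - xi) / sigma
    return (mu - y_best - xi) * norm.cdf(z) + sigma * norm.pdf(z)

# Drop-in replacement for the UCB selection line in the loop above:
# x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
```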
Total citations
[Citations-per-year chart, 2018–2024]