Sparse Gaussian processes for Bayesian optimization
Abstract
Bayesian optimization schemes often rely on Gaussian processes (GPs). GP models are very flexible, but are known to scale poorly with the number of training points. While several efficient sparse GP models are known, they have limitations when applied in optimization settings.
We propose a novel Bayesian optimization framework that uses sparse online Gaussian processes. We introduce a new updating scheme for the online GP that accounts for our preference during optimization for regions with better performance. We apply this method to optimize the performance of a free-electron laser, and demonstrate empirically that the weighted updating scheme leads to substantial improvements in optimization performance.
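The paper's weighted online sparse-GP update is not reproduced here, but the Bayesian optimization loop it plugs into can be sketched. Below is a minimal sketch, assuming a dense (non-sparse) GP surrogate with an RBF kernel, a lower-confidence-bound acquisition function, and a hypothetical 1-D toy objective; all names and constants are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=0.5):
    # Squared-exponential kernel between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Exact GP posterior mean and variance at test points Xs,
    # via a Cholesky factorization of the training kernel matrix.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf_kernel(Xs, Xs)) - np.sum(v**2, axis=0)
    return mu, np.maximum(var, 0.0)

def objective(x):
    # Hypothetical 1-D objective to minimize (stand-in for the
    # free-electron laser performance measure in the paper).
    return np.sin(3 * x) + x**2

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 2.0, size=(3, 1))   # initial design points
y = objective(X[:, 0])
grid = np.linspace(-1.0, 2.0, 200)[:, None]  # candidate set

for _ in range(10):
    mu, var = gp_posterior(X, y, grid)
    # Lower confidence bound: favor low predicted mean and high uncertainty.
    lcb = mu - 2.0 * np.sqrt(var)
    x_next = grid[np.argmin(lcb)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

best = float(y.min())
```

In the paper's setting, the exact `gp_posterior` above would be replaced by a sparse online GP whose inducing-point updates are weighted toward high-performing regions; this sketch only illustrates the outer optimization loop.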
auai.org