Misspecified Gaussian process bandit optimization

I Bogunovic, A Krause - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
We consider the problem of optimizing a black-box function based on noisy bandit feedback.
Kernelized bandit algorithms have shown strong empirical and theoretical performance for …
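For context on the kernelized bandit algorithms mentioned here, the sketch below shows a GP-UCB-style acquisition step computed from a Gaussian process posterior: pick the candidate maximizing mu(x) + sqrt(beta) * sigma(x). The squared-exponential kernel, the value of beta, the noise level, and the toy objective are illustrative assumptions, not details taken from this paper.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.2):
    """Squared-exponential kernel matrix between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_ucb_choice(X_obs, y_obs, X_cand, beta=4.0, noise_var=1e-2):
    """Return the candidate maximizing the GP upper confidence bound mu + sqrt(beta)*sigma."""
    K = rbf_kernel(X_obs, X_obs) + noise_var * np.eye(len(X_obs))
    K_star = rbf_kernel(X_cand, X_obs)
    mu = K_star @ np.linalg.solve(K, y_obs)
    v = np.linalg.solve(K, K_star.T)
    var = 1.0 - np.sum(K_star * v.T, axis=1)        # prior variance k(x, x) = 1 for this kernel
    ucb = mu + np.sqrt(beta) * np.sqrt(np.clip(var, 0.0, None))
    return X_cand[np.argmax(ucb)]

# Toy usage: three noisy observations of a black-box function on [0, 1].
rng = np.random.default_rng(0)
X_obs = np.array([0.1, 0.5, 0.9])
y_obs = np.sin(6 * X_obs) + 0.1 * rng.standard_normal(3)
X_cand = np.linspace(0, 1, 200)
print(gp_ucb_choice(X_obs, y_obs, X_cand))
```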

Maximum likelihood estimation in Gaussian process regression is ill-posed

T Karvonen, CJ Oates - Journal of Machine Learning Research, 2023 - jmlr.org
Gaussian process regression underpins countless academic and industrial applications of
machine learning and statistics, with maximum likelihood estimation routinely used to select …
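As a reminder of the object whose maximization the paper studies, the sketch below evaluates the GP log marginal likelihood, log p(y | X, theta) = -1/2 y^T K^{-1} y - 1/2 log det K - (n/2) log 2*pi, as a function of kernel hyperparameters. The squared-exponential kernel family, the noise level, and the toy data are illustrative assumptions.

```python
import numpy as np

def log_marginal_likelihood(X, y, lengthscale, signal_var, noise_var):
    """GP log marginal likelihood log p(y | X, theta) under a squared-exponential kernel."""
    d = X[:, None] - X[None, :]
    K = signal_var * np.exp(-0.5 * (d / lengthscale) ** 2) + noise_var * np.eye(len(X))
    L = np.linalg.cholesky(K)                       # K = L L^T for stable solves and log-det
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))            # equals 0.5 * log det K
            - 0.5 * len(X) * np.log(2 * np.pi))

# Maximum likelihood estimation maximizes this surface, e.g. over a grid of lengthscales.
X = np.linspace(0, 1, 10)
y = np.sin(4 * X)
for ell in [0.05, 0.2, 1.0]:
    print(ell, log_marginal_likelihood(X, y, ell, signal_var=1.0, noise_var=1e-4))
```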

Convergence of Gaussian process regression with estimated hyper-parameters and applications in Bayesian inverse problems

AL Teckentrup - SIAM/ASA Journal on Uncertainty Quantification, 2020 - SIAM
This work is concerned with the convergence of Gaussian process regression. A particular
focus is on hierarchical Gaussian process regression, where hyper-parameters appearing in …

A kernel two-sample test for functional data

G Wynne, AB Duncan - Journal of Machine Learning Research, 2022 - jmlr.org
We propose a nonparametric two-sample test procedure based on Maximum Mean
Discrepancy (MMD) for testing the hypothesis that two samples of functions have the same …
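Since the test statistic here is the empirical Maximum Mean Discrepancy, the sketch below computes the standard unbiased estimate of squared MMD with a Gaussian kernel on vector-valued samples; it does not reproduce the functional-data kernels of the paper, and the bandwidth and toy data are illustrative assumptions. In a two-sample test the statistic is then compared against a threshold, typically obtained by permuting the pooled samples.

```python
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * bandwidth**2))

def mmd2_unbiased(X, Y, bandwidth=1.0):
    """Unbiased estimate of squared MMD between samples X and Y."""
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    m, n = len(X), len(Y)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))   # within-sample terms drop i = j
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()

# Toy usage: samples from the same distribution vs. a shifted one.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
Y_same = rng.standard_normal((200, 2))
Y_shift = rng.standard_normal((200, 2)) + 0.5
print(mmd2_unbiased(X, Y_same), mmd2_unbiased(X, Y_shift))
```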

A hierarchical expected improvement method for Bayesian optimization

Z Chen, S Mak, CFJ Wu - Journal of the American Statistical Association, 2024 - Taylor & Francis
The Expected Improvement (EI) method, proposed by Jones, Schonlau, and Welch,
is a widely used Bayesian optimization method, which makes use of a fitted Gaussian …
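For reference, the sketch below evaluates the classical Expected Improvement criterion of Jones, Schonlau, and Welch for minimization, EI(x) = (f_best - mu(x)) * Phi(z) + sigma(x) * phi(z) with z = (f_best - mu(x)) / sigma(x), given a GP posterior mean and standard deviation. This is only the baseline acquisition, not the hierarchical variant proposed in the paper, and the toy posterior values are illustrative.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Classical EI for minimization: E[max(f_best - f(x), 0)] under a Gaussian posterior.

    mu, sigma : GP posterior mean and standard deviation at the candidate points
    f_best    : best (lowest) objective value observed so far
    """
    sigma = np.maximum(sigma, 1e-12)            # guard against division by zero
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Toy usage: EI at three candidates with given posterior means and standard deviations.
mu = np.array([0.2, -0.1, 0.4])
sigma = np.array([0.3, 0.05, 0.5])
print(expected_improvement(mu, sigma, f_best=0.0))
```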

Gaussian process regression: Optimality, robustness, and relationship with kernel ridge regression

W Wang, BY Jing - Journal of Machine Learning Research, 2022 - jmlr.org
Gaussian process regression is widely used in many fields, for example, machine learning,
reinforcement learning and uncertainty quantification. One key component of Gaussian …
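One relationship the title refers to can be checked numerically: with observation noise variance sigma_n^2, the GP posterior mean coincides with the kernel ridge regression predictor whose ridge parameter equals sigma_n^2. The sketch below verifies this with scikit-learn's KernelRidge; the RBF kernel, lengthscale, and toy data are illustrative assumptions.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

ell, noise_var = 0.3, 0.05**2
gamma = 1.0 / (2 * ell**2)            # sklearn's RBF kernel is exp(-gamma * ||x - x'||^2)

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (20, 1))
y = np.sin(5 * X[:, 0]) + 0.05 * rng.standard_normal(20)
X_test = np.linspace(0, 1, 7)[:, None]

def rbf(A, B):
    """RBF kernel matrix matching sklearn's gamma parameterization."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# GP posterior mean: k_*^T (K + sigma_n^2 I)^{-1} y
gp_mean = rbf(X_test, X) @ np.linalg.solve(rbf(X, X) + noise_var * np.eye(20), y)

# Kernel ridge regression with ridge parameter alpha = sigma_n^2 gives the same predictor.
krr = KernelRidge(kernel="rbf", gamma=gamma, alpha=noise_var).fit(X, y)
print(np.allclose(gp_mean, krr.predict(X_test)))   # True up to numerical tolerance
```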

Optimally-weighted estimators of the maximum mean discrepancy for likelihood-free inference

A Bharti, M Naslidnyk, O Key, et al. - International Conference on Machine Learning, 2023 - proceedings.mlr.press
Likelihood-free inference methods typically make use of a distance between simulated and
real data. A common example is the maximum mean discrepancy (MMD), which has …

Maximum likelihood estimation and uncertainty quantification for Gaussian process approximation of deterministic functions

T Karvonen, G Wynne, F Tronarp, C Oates, et al. - SIAM/ASA Journal on Uncertainty Quantification, 2020 - SIAM
Despite the ubiquity of the Gaussian process regression model, few theoretical results are
available that account for the fact that parameters of the covariance kernel typically need to …

Self-correcting Bayesian optimization through Bayesian active learning

C Hvarfner, E Hellsten, F Hutter, et al. - Advances in Neural Information Processing Systems, 2024 - proceedings.neurips.cc
Gaussian processes are the model of choice in Bayesian optimization and active learning.
Yet, they are highly dependent on cleverly chosen hyperparameters to reach their full …

Kriging prediction with isotropic Matérn correlations: Robustness and experimental designs

R Tuo, W Wang - Journal of Machine Learning Research, 2020 - jmlr.org
This work investigates the prediction performance of the kriging predictors. We derive some
error bounds for the prediction error in terms of non-asymptotic probability under the uniform …
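The isotropic Matérn correlation family studied in this last entry has a simple closed form at smoothness nu = 5/2 (no Bessel functions needed), sketched below; the lengthscale and evaluation points are illustrative assumptions.

```python
import numpy as np

def matern52(r, lengthscale=1.0):
    """Isotropic Matern correlation with smoothness nu = 5/2, as a function of distance r."""
    s = np.sqrt(5.0) * np.abs(r) / lengthscale
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

# Correlation is exactly 1 at r = 0 and decays smoothly with distance.
print(matern52(np.array([0.0, 0.5, 1.0, 2.0]), lengthscale=1.0))
```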