Tuning hyperparameters without grad students: Scalable and robust Bayesian optimisation with Dragonfly
K Kandasamy, KR Vysyaraju, W Neiswanger… - Journal of Machine …, 2020 - jmlr.org
Bayesian Optimisation (BO) refers to a suite of techniques for global optimisation of
expensive black box functions, which use introspective Bayesian models of the function to …
Data-driven polynomial chaos expansion for machine learning regression
We present a regression technique for data-driven problems based on polynomial chaos
expansion (PCE). PCE is a popular technique in the field of uncertainty quantification (UQ) …
Gaussian process bandit optimisation with multi-fidelity evaluations
In many scientific and engineering applications, we are tasked with the optimisation of an
expensive-to-evaluate black box function $f$. Traditional methods for this problem …
Sparse interaction additive networks via feature interaction detection and sparse selection
There is currently a large gap in performance between the statistically rigorous methods like
linear regression or additive splines and the powerful deep methods using neural networks …
Sensitivity analysis via the proportion of unmeasured confounding
M Bonvini, EH Kennedy - Journal of the American Statistical …, 2022 - Taylor & Francis
In observational studies, identification of average treatment effects (ATEs) is generally achieved by assuming that the
correct set of confounders has been measured and properly included in the relevant models …
Multi-fidelity Gaussian process bandit optimisation
In many scientific and engineering applications, we are tasked with the maximisation of an
expensive-to-evaluate black box function f. Traditional settings for this problem assume just …
Sparse modal additive model
Sparse additive models have been successfully applied to high-dimensional data analysis
due to the flexibility and interpretability of their representation. However, the existing …
Exploring the linear subspace hypothesis in gender bias mitigation
F Vargas, R Cotterell - arXiv preprint arXiv:2009.09435, 2020 - arxiv.org
Bolukbasi et al. (2016) present one of the first gender bias mitigation techniques for word
embeddings. Their method takes pre-trained word embeddings as input and attempts to …
Generalization bounds for sparse random feature expansions
Random feature methods have been successful in various machine learning tasks, are easy
to compute, and come with theoretical accuracy bounds. They serve as an alternative …
SHRIMP: Sparser random feature models via iterative magnitude pruning
Y Xie, R Shi, H Schaeffer… - … and Scientific Machine …, 2022 - proceedings.mlr.press
Sparse shrunk additive models and sparse random feature models have been developed
separately as methods to learn low-order functions, where there are few interactions …