Random smoothing regularization in kernel gradient descent learning

L Ding, T Hu, J Jiang, D Li, W Wang, Y Yao - arXiv preprint arXiv …, 2023 - arxiv.org
Random smoothing data augmentation is a unique form of regularization that can prevent
overfitting by introducing noise to the input data, encouraging the model to learn more …

Metamodel optimization of a complex, rural–urban emergency medical services system

M Snyder, BJ Smucker - Simulation Modelling Practice and Theory, 2022 - Elsevier
Complex simulation systems, such as those involving emergency medical services (EMS),
are often too computationally demanding to be used in optimization problems …

Sensitivity Analysis on Policy‐Augmented Graphical Hybrid Models With Shapley Value Estimation

J Zhao, W Xie, J Luo - Naval Research Logistics (NRL), 2024 - Wiley Online Library
Driven by the critical challenges in biomanufacturing, including high complexity and high
uncertainty, we propose a comprehensive and computationally efficient sensitivity analysis …

Random Smoothing Regularization in Kernel Gradient Descent Learning

L Ding, T Hu, J Jiang, D Li, W Wang, Y Yao - Journal of Machine Learning …, 2024 - jmlr.org
Random smoothing data augmentation is a unique form of regularization that can prevent
overfitting by introducing noise to the input data, encouraging the model to learn more …

A Sparse Expansion For Deep Gaussian Processes

L Ding, R Tuo, S Shahrampour - arXiv preprint arXiv:2112.05888, 2021 - arxiv.org
In this work, we use Deep Gaussian Processes (DGPs) as statistical surrogates for
stochastic processes with complex distributions. Conventional inferential methods for DGP …

Language Model Prompt Selection via Simulation Optimization

H Zhang, J He, R Righter, Z Zheng - arXiv preprint arXiv:2404.08164, 2024 - arxiv.org
With the advancement in generative language models, the selection of prompts has gained
significant attention in recent years. A prompt is an instruction or description provided by the …

Kernel Multigrid: Accelerate Back-fitting via Sparse Gaussian Process Regression

L Zou, L Ding - arXiv preprint arXiv:2403.13300, 2024 - arxiv.org
Additive Gaussian Processes (GPs) are popular approaches for nonparametric feature
selection. The common training method for these models is Bayesian Back-fitting. However …

A sparse expansion for deep Gaussian processes

L Ding, R Tuo, S Shahrampour - IISE Transactions, 2024 - Taylor & Francis
In this work, we use Deep Gaussian Processes (DGPs) as statistical surrogates for
stochastic processes with complex distributions. Conventional inferential methods for DGP …

High-dimensional simulation optimization via Brownian fields and sparse grids

L Ding, R Tuo, X Zhang - arXiv preprint arXiv:2107.08595, 2021 - arxiv.org
High-dimensional simulation optimization is notoriously challenging. We propose a new
sampling algorithm that converges to a global optimal solution and suffers minimally from …

Representing Additive Gaussian Processes by Sparse Matrices

L Zou, H Chen, L Ding - arXiv preprint arXiv:2305.00324, 2023 - arxiv.org
Among generalized additive models, additive Matérn Gaussian Processes (GPs) are one of
the most popular for scalable high-dimensional problems. Thanks to their additive structure …