Random smoothing regularization in kernel gradient descent learning
Random smoothing data augmentation is a unique form of regularization that can prevent
overfitting by introducing noise to the input data, encouraging the model to learn more …
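As an illustration of the technique named in this snippet, below is a minimal sketch of random smoothing data augmentation in gradient-descent kernel learning with a random Fourier feature model: each training step perturbs the inputs with fresh Gaussian noise before computing the gradient. The noise scale, feature map, and toy data are assumptions for illustration, not the paper's actual setup.

```python
import numpy as np

# Minimal sketch of random smoothing data augmentation for kernel-style
# gradient descent learning. Noise scale, feature map, and data are
# illustrative assumptions, not the cited paper's formulation.

rng = np.random.default_rng(0)

# Toy 1-D regression data.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Random Fourier features approximating an RBF kernel.
D = 300
W = rng.standard_normal((X.shape[1], D))
b = rng.uniform(0, 2 * np.pi, D)

def features(Z):
    return np.sqrt(2.0 / D) * np.cos(Z @ W + b)

theta = np.zeros(D)
sigma, lr = 0.1, 0.5   # smoothing noise scale and step size (assumed)

for _ in range(500):
    # Random smoothing: perturb the inputs with fresh Gaussian noise each step.
    X_noisy = X + sigma * rng.standard_normal(X.shape)
    Phi = features(X_noisy)
    grad = Phi.T @ (Phi @ theta - y) / len(y)   # squared-loss gradient
    theta -= lr * grad

print("train MSE on clean inputs:", np.mean((features(X) @ theta - y) ** 2))
```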
Metamodel optimization of a complex, rural–urban emergency medical services system
M Snyder, BJ Smucker - Simulation Modelling Practice and Theory, 2022 - Elsevier
Complex simulation systems, such as those involving emergency medical services (EMS),
are often too computationally demanding to be used in optimization problems …
Sensitivity Analysis on Policy‐Augmented Graphical Hybrid Models With Shapley Value Estimation
Driven by the critical challenges in biomanufacturing, including high complexity and high
uncertainty, we propose a comprehensive and computationally efficient sensitivity analysis …
Random Smoothing Regularization in Kernel Gradient Descent Learning
Random smoothing data augmentation is a unique form of regularization that can prevent
overfitting by introducing noise to the input data, encouraging the model to learn more …
A Sparse Expansion For Deep Gaussian Processes
L Ding, R Tuo, S Shahrampour - arXiv preprint arXiv:2112.05888, 2021 - arxiv.org
In this work, we use Deep Gaussian Processes (DGPs) as statistical surrogates for
stochastic processes with complex distributions. Conventional inferential methods for DGP …
Language Model Prompt Selection via Simulation Optimization
With the advancement in generative language models, the selection of prompts has gained
significant attention in recent years. A prompt is an instruction or description provided by the …
Kernel Multigrid: Accelerate Back-fitting via Sparse Gaussian Process Regression
L Zou, L Ding - arXiv preprint arXiv:2403.13300, 2024 - arxiv.org
Additive Gaussian Processes (GPs) are popular approaches for nonparametric feature
selection. The common training method for these models is Bayesian Back-fitting. However …
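As a sketch of the Bayesian back-fitting loop this entry refers to, the example below cyclically fits each one-dimensional GP component of an additive model to the partial residual left by the other components. The kernel, noise level, and data are illustrative assumptions; the paper's contribution is accelerating this loop with sparse GP regression, which the sketch does not reproduce.

```python
import numpy as np

# Minimal sketch of back-fitting for an additive GP f(x) = sum_j f_j(x_j):
# cyclically regress the partial residual onto each coordinate with a
# one-dimensional GP. Kernel, noise level, and data are assumptions.

rng = np.random.default_rng(1)
n, d = 150, 3
X = rng.uniform(-1, 1, size=(n, d))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(n)

def rbf(a, b, ls=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

noise = 1e-2
f = np.zeros((n, d))                     # current estimate of each component

for sweep in range(20):                  # back-fitting sweeps
    for j in range(d):
        r = y - f.sum(axis=1) + f[:, j]  # partial residual for coordinate j
        K = rbf(X[:, j], X[:, j])
        alpha = np.linalg.solve(K + noise * np.eye(n), r)
        f[:, j] = K @ alpha              # posterior mean on the training set

print("residual MSE:", np.mean((y - f.sum(axis=1)) ** 2))
```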
A sparse expansion for deep Gaussian processes
L Ding, R Tuo, S Shahrampour - IISE Transactions, 2024 - Taylor & Francis
In this work, we use Deep Gaussian Processes (DGPs) as statistical surrogates for
stochastic processes with complex distributions. Conventional inferential methods for DGP …
High-dimensional simulation optimization via Brownian fields and sparse grids
High-dimensional simulation optimization is notoriously challenging. We propose a new
sampling algorithm that converges to a global optimal solution and suffers minimally from …
Representing Additive Gaussian Processes by Sparse Matrices
L Zou, H Chen, L Ding - arXiv preprint arXiv:2305.00324, 2023 - arxiv.org
Among generalized additive models, additive Matérn Gaussian Processes (GPs) are one of
the most popular for scalable high-dimensional problems. Thanks to their additive structure …
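To make the additive structure concrete, the sketch below builds an additive Matérn-3/2 kernel as a sum of one-dimensional kernels and uses it in a plain dense GP regression. Lengthscales and data are assumptions for illustration; the paper's sparse-matrix representation of such additive GPs is not reproduced here.

```python
import numpy as np

# Minimal sketch of an additive Matern-3/2 kernel, k(x, z) = sum_j k_j(x_j, z_j).
# Lengthscales and data are illustrative assumptions; the cited paper's
# sparse-matrix representation is not reproduced by this dense example.

def matern32_1d(a, b, ls=0.5):
    r = np.abs(a[:, None] - b[None, :]) / ls
    return (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def additive_matern(X, Z, ls=0.5):
    # Sum of one-dimensional Matern kernels over coordinates.
    return sum(matern32_1d(X[:, j], Z[:, j], ls) for j in range(X.shape[1]))

rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(100, 10))
y = np.sin(4 * X[:, 0]) - X[:, 3] + 0.05 * rng.standard_normal(100)

K = additive_matern(X, X)
alpha = np.linalg.solve(K + 1e-2 * np.eye(len(X)), y)   # GP regression solve

X_test = rng.uniform(0, 1, size=(5, 10))
pred = additive_matern(X_test, X) @ alpha                # posterior mean
print(pred)
```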