Fast convergence to non-isolated minima: four equivalent conditions for functions

Q Rebjock, N Boumal - Mathematical Programming, 2024 - Springer
Optimization algorithms can see their local convergence rates deteriorate when the Hessian
at the optimum is singular. These singularities are inescapable when the optima are non …
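The phenomenon this snippet describes can be illustrated with a toy example (mine, not the paper's): f(x, y) = x² has a non-isolated set of minimizers (the whole y-axis), and the Hessian at any of them is necessarily singular.

```python
import numpy as np

# Illustrative example (not from the paper): f(x, y) = x**2 attains its
# minimum on the entire y-axis, a non-isolated set of minimizers.
# The Hessian along a direction tangent to that set must vanish,
# so the Hessian at any minimizer is singular.

def f(v):
    x, y = v
    return x ** 2

def hessian(v):
    # Hessian of f is constant: [[2, 0], [0, 0]]
    return np.array([[2.0, 0.0], [0.0, 0.0]])

minimizer = np.array([0.0, 3.7])  # any point on the y-axis is a minimizer
eigs = np.linalg.eigvalsh(hessian(minimizer))
print(eigs)  # smallest eigenvalue is 0 => singular Hessian
```

The zero eigenvalue corresponds to the direction along the minimizer set, which is exactly the degeneracy that slows standard local convergence analyses.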

Random coordinate descent: a simple alternative for optimizing parameterized quantum circuits

Z Ding, T Ko, J Yao, L Lin, X Li - Physical Review Research, 2024 - APS
Variational quantum algorithms rely on the optimization of parameterized quantum circuits in
noisy settings. The commonly used back-propagation procedure in classical machine …
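A generic random-coordinate-descent loop (on a hypothetical quadratic stand-in for a circuit cost, not the paper's quantum setting) can be sketched as:

```python
import numpy as np

# Minimal sketch of random coordinate descent: at each iteration, pick
# one parameter uniformly at random and take a gradient step along that
# coordinate only. The objective below is a toy stand-in, not a
# parameterized-quantum-circuit cost.

rng = np.random.default_rng(0)

def cost(theta):
    return np.sum((theta - 1.0) ** 2)  # toy objective, minimized at theta = 1

def partial(theta, j, eps=1e-6):
    # central finite-difference estimate of d cost / d theta_j
    e = np.zeros_like(theta)
    e[j] = eps
    return (cost(theta + e) - cost(theta - e)) / (2 * eps)

theta = np.zeros(4)
lr = 0.4
for _ in range(200):
    j = rng.integers(theta.size)        # random coordinate
    theta[j] -= lr * partial(theta, j)  # update only that coordinate

print(np.round(theta, 3))  # converges to the minimizer [1, 1, 1, 1]
```

Updating one coordinate per step keeps the per-iteration measurement cost low, which is the appeal in noisy settings where each gradient component is expensive to estimate.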

Convergence of a normal map-based prox-SGD method under the KL inequality

A Milzarek, J Qiu - arXiv preprint arXiv:2305.05828, 2023 - arxiv.org
In this paper, we present a novel stochastic normal map-based algorithm ($\mathsf{norM}\text{-}\mathsf{SGD}$) for nonconvex composite-type optimization problems and …
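For context, a plain prox-SGD iteration (not the paper's normal-map variant) on a composite objective f(x) + λ‖x‖₁ can be sketched as follows; the least-squares data and all parameter values here are illustrative assumptions.

```python
import numpy as np

# Generic prox-SGD sketch for min_x f(x) + lam * ||x||_1, where f is a
# least-squares loss: take a stochastic gradient step on f, then apply
# the proximal operator (soft-thresholding) of the l1 term.
# This is the classical scheme, not the paper's norM-SGD normal-map variant.

rng = np.random.default_rng(1)
true_x = np.array([2.0, 0.0, -1.0, 0.0, 0.5])  # hypothetical sparse target
A = rng.normal(size=(100, 5))
b = A @ true_x

def stoch_grad(x, batch=10):
    idx = rng.integers(A.shape[0], size=batch)
    Ab, bb = A[idx], b[idx]
    return Ab.T @ (Ab @ x - bb) / batch  # minibatch least-squares gradient

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # soft-thresholding

x, lam = np.zeros(5), 0.1
for k in range(2000):
    step = 0.05 / np.sqrt(k + 1)              # decaying step size
    x = prox_l1(x - step * stoch_grad(x), step * lam)

print(np.round(x, 2))  # roughly the sparse target [2, 0, -1, 0, 0.5]
```

Kurdyka-Lojasiewicz (KL) assumptions of the kind the paper invokes are what let such schemes be analyzed beyond convexity, without requiring the minimizers to be isolated.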

Reinforcement Learning and Variational Quantum Algorithms

J Yao - 2023 - escholarship.org
In recent years, the realms of deep learning and variational quantum algorithms have
undergone significant advancements. These innovative algorithms have proven to be …

Counterexamples for Noise Models of Stochastic Gradients

V Patel - Examples and Counterexamples, 2023 - Elsevier
Abstract Stochastic Gradient Descent (SGD) is a widely used, foundational algorithm in data science and machine learning. As a result, analyses of SGD abound, making use of a variety …