Zeroth-order algorithms for stochastic distributed nonconvex optimization

X Yi, S Zhang, T Yang, KH Johansson - Automatica, 2022 - Elsevier
In this paper, we consider a stochastic distributed nonconvex optimization problem in which
the cost function is distributed over n agents that have access only to zeroth-order (ZO) …
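
Each agent in such a setting can only query values of its local cost. As a generic, hedged illustration (not the algorithm analyzed in the paper above), a two-point zeroth-order gradient estimate combined with a plain consensus-averaging step could look as follows; the quadratic local costs, the uniform mixing matrix W, and the step size are hypothetical placeholders.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate along a random Gaussian direction."""
    if rng is None:
        rng = np.random.default_rng()
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

# Hypothetical setup: n agents, each holding a private smooth local cost f_i.
n, d = 5, 3
rng = np.random.default_rng(1)
targets = rng.standard_normal((n, d))
local_costs = [lambda x, t=t: 0.5 * np.sum((x - t) ** 2) for t in targets]

# Uniform (doubly stochastic) mixing matrix, i.e. a complete communication graph.
W = np.full((n, n), 1.0 / n)

X = np.zeros((n, d))            # one local iterate per agent
step = 0.1
for _ in range(300):
    G = np.array([zo_gradient(f, x, rng=rng) for f, x in zip(local_costs, X)])
    X = W @ X - step * G        # consensus averaging followed by a ZO gradient step

print("consensus iterate:", X.mean(axis=0))
print("minimizer of the average cost:", targets.mean(axis=0))
```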

An accelerated directional derivative method for smooth stochastic convex optimization

P Dvurechensky, E Gorbunov, A Gasnikov - European Journal of …, 2021 - Elsevier
We consider smooth stochastic convex optimization problems in the context of algorithms
which are based on directional derivatives of the objective function. This context can be …
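
Methods of this kind rely on a directional-derivative oracle. Below is a minimal sketch, assuming only a forward finite-difference estimate along a random unit direction (the accelerated scheme in the paper itself is more elaborate); the test objective and step size are illustrative.

```python
import numpy as np

def directional_derivative(f, x, e, t=1e-6):
    """Forward-difference estimate of the derivative of f at x along direction e."""
    return (f(x + t * e) - f(x)) / t

f = lambda x: 0.5 * np.dot(x, x) + np.sin(x[0])   # illustrative smooth objective
x = np.array([2.0, -1.0])
rng = np.random.default_rng(0)

for _ in range(500):
    e = rng.standard_normal(x.size)
    e /= np.linalg.norm(e)                         # random unit direction
    x = x - 0.05 * directional_derivative(f, x, e) * e

print(x, f(x))
```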

On the second-order convergence properties of random search methods

A Lucchi, A Orvieto, A Solomou - Advances in Neural …, 2021 - proceedings.neurips.cc
We study the theoretical convergence properties of random-search methods when
optimizing non-convex objective functions without having access to derivatives. We prove …
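
For concreteness, here is a bare-bones version of the derivative-free random-search iteration studied in this line of work (not the exact variant analyzed in the paper): propose a Gaussian perturbation of the current point and keep it only if it lowers the objective. The Rosenbrock test function below is illustrative.

```python
import numpy as np

def random_search(f, x0, sigma=0.1, iters=2000, seed=0):
    """Basic random search: keep a perturbed point only when it improves f."""
    rng = np.random.default_rng(seed)
    x, fx = np.asarray(x0, dtype=float), f(x0)
    for _ in range(iters):
        cand = x + sigma * rng.standard_normal(x.shape)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

# Illustrative nonconvex objective (Rosenbrock).
rosen = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
print(random_search(rosen, [-1.0, 1.0]))
```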

An accelerated method for derivative-free smooth stochastic convex optimization

E Gorbunov, P Dvurechensky, A Gasnikov - arXiv preprint arXiv …, 2018 - arxiv.org
We consider an unconstrained problem of minimizing a smooth convex function which is
only available through noisy observations of its values, the noise consisting of two parts …

Safe zeroth-order convex optimization using quadratic local approximations

B Guo, Y Jiang, M Kamgarpour… - 2023 European …, 2023 - ieeexplore.ieee.org
We address black-box convex optimization problems, where the objective and constraint
functions are not explicitly known but can be sampled within the feasible set. The challenge …
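
Below is a hedged sketch of the core building block, fitting a local quadratic model to sampled function values and stepping toward its minimizer; the safe-set and constraint machinery from the paper is deliberately omitted, and the one-dimensional objective, sampling radius, and trust-region clipping are hypothetical.

```python
import numpy as np

def fit_local_quadratic(f, center, radius=0.5, n_samples=7, seed=0):
    """Fit q(x) = a*x**2 + b*x + c to samples of f near `center` by least squares."""
    rng = np.random.default_rng(seed)
    xs = center + radius * (2 * rng.random(n_samples) - 1)
    A = np.column_stack([xs ** 2, xs, np.ones_like(xs)])
    coef, *_ = np.linalg.lstsq(A, f(xs), rcond=None)
    return coef  # (a, b, c)

f = lambda x: np.exp(0.3 * x) + 0.5 * x ** 2      # illustrative black-box objective
x = 2.0
for _ in range(10):
    a, b, c = fit_local_quadratic(f, x)
    if a > 0:                                      # step to the model minimizer,
        x = float(np.clip(-b / (2 * a), x - 0.5, x + 0.5))  # clipped to a local region
print(x, f(x))
```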

Adaptive evolution strategies for stochastic zeroth-order optimization

X He, Z Zheng, Z Chen, Y Zhou - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
We consider solving a class of unconstrained optimization problems in which only stochastic
estimates of the objective functions are available. Existing stochastic optimization methods …
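
As a point of reference for this family of methods, the sketch below shows a (1+1) evolution strategy with the classical one-fifth success rule for step-size adaptation; the adaptive mechanisms proposed in the paper differ, and the sphere objective is purely illustrative.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma=1.0, iters=3000, seed=0):
    """(1+1)-ES: Gaussian mutation, greedy selection, 1/5 success-rule step-size control."""
    rng = np.random.default_rng(seed)
    x, fx = np.asarray(x0, dtype=float), f(x0)
    for _ in range(iters):
        cand = x + sigma * rng.standard_normal(x.shape)
        fc = f(cand)
        if fc < fx:                        # successful mutation: accept and enlarge sigma
            x, fx, sigma = cand, fc, sigma * np.exp(0.2)
        else:                              # unsuccessful mutation: shrink sigma
            sigma *= np.exp(-0.2 / 4)
    return x, fx, sigma

sphere = lambda z: float(np.sum(np.square(z)))     # illustrative objective
print(one_plus_one_es(sphere, np.ones(5)))
```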

Safe zeroth-order optimization using quadratic local approximations

B Guo, Y Jiang, G Ferrari-Trecate… - arXiv preprint arXiv …, 2023 - arxiv.org
This paper addresses black-box smooth optimization problems, where the objective and
constraint functions are not explicitly known but can be queried. The main goal of this work is …

On the global complexity of a derivative-free Levenberg-Marquardt algorithm via orthogonal spherical smoothing

X Chen, J Fan - Journal of Scientific Computing, 2024 - Springer
In this paper, we propose a derivative-free Levenberg-Marquardt algorithm for nonlinear
least squares problems, where the Jacobian matrices are approximated via orthogonal …
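
Here is a rough sketch of a derivative-free Levenberg-Marquardt step in which the Jacobian is approximated by forward differences along orthonormal coordinate directions (a simpler stand-in for the orthogonal spherical smoothing analyzed in the paper); the exponential-fit residuals and the damping schedule are hypothetical.

```python
import numpy as np

def fd_jacobian(r, x, h=1e-6):
    """Approximate the Jacobian of the residual map r at x by forward differences."""
    rx = r(x)
    J = np.empty((rx.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        J[:, j] = (r(x + e) - rx) / h
    return J, rx

def lm_step(r, x, lam):
    """One damped Gauss-Newton (Levenberg-Marquardt) step."""
    J, rx = fd_jacobian(r, x)
    A = J.T @ J + lam * np.eye(x.size)
    return x - np.linalg.solve(A, J.T @ rx)

# Hypothetical nonlinear least-squares problem: fit y = exp(a*t) + b.
t = np.linspace(0, 1, 20)
y = np.exp(0.7 * t) + 0.3
r = lambda p: np.exp(p[0] * t) + p[1] - y

x, lam = np.array([0.0, 0.0]), 1e-2
for _ in range(30):
    x_new = lm_step(r, x, lam)
    if np.sum(r(x_new) ** 2) < np.sum(r(x) ** 2):  # shrink damping on improvement
        x, lam = x_new, lam * 0.5
    else:                                          # grow damping otherwise
        lam *= 2.0
print(x)   # approaches (0.7, 0.3) on this synthetic fit
```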

Black-box reductions for zeroth-order gradient algorithms to achieve lower query complexity

B Gu, X Wei, S Gao, Z Xiong, C Deng… - Journal of Machine …, 2021 - jmlr.org
Zeroth-order (ZO) optimization has been the key technique for various machine learning
applications, especially black-box adversarial attacks, where models need to be learned in …

Obtaining Lower Query Complexities Through Lightweight Zeroth-Order Proximal Gradient Algorithms

B Gu, X Wei, H Zhang, Y Chang, H Huang - Neural Computation, 2024 - direct.mit.edu
Zeroth-order (ZO) optimization is one key technique for machine learning problems where
gradient calculation is expensive or impossible. Several variance-reduced ZO proximal …
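
Below is a minimal sketch of a zeroth-order proximal gradient iteration for an L1-regularized least-squares problem, without the variance reduction or lightweight estimators discussed in the paper; the data, regularization weight, and step size are illustrative.

```python
import numpy as np

def zo_grad(f, x, mu=1e-5, rng=None):
    """Two-point zeroth-order estimate of the gradient of the smooth part f."""
    if rng is None:
        rng = np.random.default_rng()
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

# Illustrative composite problem: min_x 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((30, 10)), rng.standard_normal(30), 0.1
smooth = lambda x: 0.5 * np.sum((A @ x - b) ** 2)

x, step = np.zeros(10), 1e-2
for _ in range(5000):
    x = soft_threshold(x - step * zo_grad(smooth, x, rng=rng), step * lam)
print(smooth(x) + lam * np.sum(np.abs(x)))
```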