Zeroth-order algorithms for stochastic distributed nonconvex optimization
In this paper, we consider a stochastic distributed nonconvex optimization problem with the
cost function being distributed over n agents having access only to zeroth-order (ZO) …
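The zeroth-order estimators that entries like this one rely on replace the true gradient with finite differences of function values along random directions. A minimal sketch of the standard two-point estimator is below; the function name and parameters are illustrative, not taken from any of the papers listed here.

```python
import numpy as np

def two_point_zo_gradient(f, x, mu=1e-4, num_dirs=10, rng=None):
    """Sketch of a two-point zeroth-order gradient estimate.

    Averages finite differences along random Gaussian directions u_i:
        g_hat = (1/m) * sum_i (f(x + mu*u_i) - f(x - mu*u_i)) / (2*mu) * u_i
    Since E[u u^T] = I for standard Gaussian u, g_hat is (for smooth f)
    an estimate of the gradient of a smoothed version of f.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.zeros_like(x, dtype=float)
    for _ in range(num_dirs):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / num_dirs
```

With many directions the estimate concentrates around the true gradient, at the cost of two function queries per direction; the query-complexity trade-off is exactly what several of the papers below analyze.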
An accelerated directional derivative method for smooth stochastic convex optimization
We consider smooth stochastic convex optimization problems in the context of algorithms
which are based on directional derivatives of the objective function. This context can be …
On the second-order convergence properties of random search methods
We study the theoretical convergence properties of random-search methods when
optimizing non-convex objective functions without having access to derivatives. We prove …
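The random-search methods analyzed in this entry need only function evaluations. A minimal derivative-free random search, with illustrative step-size and shrink parameters not drawn from the paper itself, looks like:

```python
import numpy as np

def random_search(f, x0, step=0.5, shrink=0.9, iters=500, rng=None):
    """Minimal random-search sketch: accept a random step if it decreases f,
    otherwise shrink the step size. No derivatives are used."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        u = rng.standard_normal(x.shape)
        u /= np.linalg.norm(u)          # random unit direction
        cand = x + step * u
        fc = f(cand)
        if fc < fx:                     # keep only improving moves
            x, fx = cand, fc
        else:
            step *= shrink              # back off when the move fails
    return x, fx
```

Schemes of this form can escape strict saddle points with probability one under suitable assumptions, which is the kind of second-order convergence property the entry above studies.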
An accelerated method for derivative-free smooth stochastic convex optimization
We consider an unconstrained problem of minimizing a smooth convex function which is
only available through noisy observations of its values, the noise consisting of two parts …
Safe zeroth-order convex optimization using quadratic local approximations
We address black-box convex optimization problems, where the objective and constraint
functions are not explicitly known but can be sampled within the feasible set. The challenge …
Adaptive evolution strategies for stochastic zeroth-order optimization
We consider solving a class of unconstrained optimization problems in which only stochastic
estimates of the objective functions are available. Existing stochastic optimization methods …
Safe zeroth-order optimization using quadratic local approximations
This paper addresses black-box smooth optimization problems, where the objective and
constraint functions are not explicitly known but can be queried. The main goal of this work is …
On the global complexity of a derivative-free Levenberg-Marquardt algorithm via orthogonal spherical smoothing
X Chen, J Fan - Journal of Scientific Computing, 2024 - Springer
In this paper, we propose a derivative-free Levenberg-Marquardt algorithm for nonlinear
least squares problems, where the Jacobian matrices are approximated via orthogonal …
Black-box reductions for zeroth-order gradient algorithms to achieve lower query complexity
Zeroth-order (ZO) optimization has been the key technique for various machine learning
applications especially for black-box adversarial attack, where models need to be learned in …
Obtaining Lower Query Complexities Through Lightweight Zeroth-Order Proximal Gradient Algorithms
Zeroth-order (ZO) optimization is one key technique for machine learning problems where
gradient calculation is expensive or impossible. Several variance-reduced ZO proximal …