Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
Nonconvex and nonsmooth optimization problems are frequently encountered in much of
statistics, business, science and engineering, but they are not yet widely recognized as a …
Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization
There has been much recent interest in finding unconstrained local minima of smooth
functions, due in part to the prevalence of such problems in machine learning and robust …
A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization
We consider minimization of a smooth nonconvex objective function using an iterative
algorithm based on Newton's method and the linear conjugate gradient algorithm, with …
[BOOK][B] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right.… In that …
Gradient regularization of Newton method with Bregman distances
N Doikov, Y Nesterov - Mathematical programming, 2024 - Springer
In this paper, we propose the first second-order scheme based on arbitrary non-Euclidean
norms, incorporated via Bregman distances. They are introduced directly in the Newton …
Escaping saddle points in constrained optimization
A Mokhtari, A Ozdaglar… - Advances in Neural …, 2018 - proceedings.neurips.cc
In this paper, we study the problem of escaping from saddle points in smooth nonconvex
optimization problems subject to a convex set $\mathcal{C}$. We propose a generic …
Regularized Newton methods for minimizing functions with Hölder continuous Hessians
GN Grapiglia, Y Nesterov - SIAM Journal on Optimization, 2017 - SIAM
In this paper, we study the regularized second-order methods for unconstrained
minimization of a twice-differentiable (convex or nonconvex) objective function. For the …
A Newton-CG based augmented Lagrangian method for finding a second-order stationary point of nonconvex equality constrained optimization with complexity …
In this paper we consider finding a second-order stationary point (SOSP) of nonconvex
equality constrained optimization when a nearly feasible point is known. In particular, we first …
Trust-region Newton-CG with strong second-order complexity guarantees for nonconvex optimization
Worst-case complexity guarantees for nonconvex optimization algorithms have been a topic
of growing interest. Multiple frameworks that achieve the best known complexity bounds …
Stochastic variance-reduced cubic regularized Newton methods
We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for
non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient …