Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis

B Jiang, T Lin, S Ma, S Zhang - Computational Optimization and …, 2019 - Springer
Nonconvex and nonsmooth optimization problems are frequently encountered in much of
statistics, business, science and engineering, but they are not yet widely recognized as a …

Complexity analysis of second-order line-search algorithms for smooth nonconvex optimization

CW Royer, SJ Wright - SIAM Journal on Optimization, 2018 - SIAM
There has been much recent interest in finding unconstrained local minima of smooth
functions, due in part to the prevalence of such problems in machine learning and robust …

A Newton-CG algorithm with complexity guarantees for smooth unconstrained optimization

CW Royer, M O'Neill, SJ Wright - Mathematical Programming, 2020 - Springer
We consider minimization of a smooth nonconvex objective function using an iterative
algorithm based on Newton's method and the linear conjugate gradient algorithm, with …
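The method sketched in this abstract couples Newton steps with linear conjugate gradient (CG) iterations while watching for negative curvature. Below is a minimal, hedged sketch of one inexact Newton step with such a CG inner loop; the damping parameter `eps`, the stopping rule, and the function names are illustrative assumptions, not the paper's exact capped-CG procedure.

```python
import numpy as np

def newton_cg_step(grad, hess_vec, eps=1e-6, max_iter=None):
    """One inexact Newton step: approximately solve
        (H + 2*eps*I) d = -grad
    with linear conjugate gradient, exiting early if a direction of
    (near-)negative curvature is encountered.

    grad     : gradient vector at the current iterate
    hess_vec : callable v -> H @ v (Hessian-vector product)
    Returns (direction, flag), flag in {"newton_like", "negative_curvature"}.
    """
    n = grad.shape[0]
    max_iter = max_iter or n
    d = np.zeros(n)
    r = grad.copy()          # residual of the regularized system at d = 0
    p = -r                   # first CG search direction
    for _ in range(max_iter):
        Hp = hess_vec(p) + 2.0 * eps * p
        curv = p @ Hp
        if curv <= eps * (p @ p):
            # (near-)negative curvature: return it, signed as a descent direction
            return (p if p @ grad < 0 else -p), "negative_curvature"
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r + alpha * Hp
        if np.linalg.norm(r_new) <= 0.5 * eps * np.linalg.norm(grad):
            break
        beta = (r_new @ r_new) / (r @ r)
        p = -r_new + beta * p
        r = r_new
    return d, "newton_like"
```

For an explicit Hessian matrix H one would pass `hess_vec = lambda v: H @ v`; an outer loop would then take the Newton-like step with a line search or move along the returned negative-curvature direction.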

[BOOK][B] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives

C Cartis, NIM Gould, PL Toint - 2022 - SIAM
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right. … In that …

Gradient regularization of Newton method with Bregman distances

N Doikov, Y Nesterov - Mathematical Programming, 2024 - Springer
In this paper, we propose the first second-order scheme based on arbitrary non-Euclidean
norms, incorporated via Bregman distances. They are introduced directly in the Newton …
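The snippet describes Newton steps regularized by the current gradient, with non-Euclidean geometry entering through a Bregman distance. A minimal sketch of the Euclidean special case is given below; the square-root scaling of the gradient norm and the constant L are assumptions for illustration, and the Bregman variant would replace the Euclidean norm and the identity matrix with a general scaling function.

```latex
% Euclidean-case sketch of a gradient-regularized Newton step
% (L is an assumed Lipschitz-type constant for the Hessian)
x_{k+1} = x_k - \bigl(\nabla^2 f(x_k) + \sqrt{L\,\|\nabla f(x_k)\|}\, I\bigr)^{-1} \nabla f(x_k)
```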

Escaping saddle points in constrained optimization

A Mokhtari, A Ozdaglar… - Advances in Neural …, 2018 - proceedings.neurips.cc
In this paper, we study the problem of escaping from saddle points in smooth nonconvex
optimization problems subject to a convex set $\mathcal{C}$. We propose a generic …

Regularized Newton methods for minimizing functions with Hölder continuous Hessians

GN Grapiglia, Y Nesterov - SIAM Journal on Optimization, 2017 - SIAM
In this paper, we study regularized second-order methods for unconstrained
minimization of a twice-differentiable (convex or nonconvex) objective function. For the …
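For a Hessian that is Hölder continuous with exponent ν ∈ (0, 1], the natural regularized second-order model penalizes trial steps with the power 2 + ν, recovering the cubic model when ν = 1. The following is a hedged sketch of such a model; the constant H_ν and the exact normalization are illustrative, not necessarily the paper's choice.

```latex
% regularized second-order model when ||∇²f(x) − ∇²f(y)|| ≤ H_ν ||x − y||^ν
m_k(h) = f(x_k) + \nabla f(x_k)^{\top} h + \tfrac{1}{2}\, h^{\top} \nabla^2 f(x_k)\, h
         + \frac{H_\nu}{(1+\nu)(2+\nu)} \,\|h\|^{2+\nu},
\qquad x_{k+1} = x_k + \arg\min_h m_k(h).
```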

A Newton-CG based augmented Lagrangian method for finding a second-order stationary point of nonconvex equality constrained optimization with complexity …

C He, Z Lu, TK Pong - SIAM Journal on Optimization, 2023 - SIAM
In this paper we consider finding a second-order stationary point (SOSP) of nonconvex
equality constrained optimization when a nearly feasible point is known. In particular, we first …

Trust-region Newton-CG with strong second-order complexity guarantees for nonconvex optimization

FE Curtis, DP Robinson, CW Royer, SJ Wright - SIAM Journal on Optimization, 2021 - SIAM
Worst-case complexity guarantees for nonconvex optimization algorithms have been a topic
of growing interest. Multiple frameworks that achieve the best known complexity bounds …

Stochastic variance-reduced cubic regularized Newton methods

D Zhou, P Xu, Q Gu - International Conference on Machine …, 2018 - proceedings.mlr.press
We propose a stochastic variance-reduced cubic regularized Newton method (SVRC) for
non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient …
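The abstract points to two ingredients: a cubic-regularized Newton step and a variance-reduced ("semi-stochastic") gradient estimator. The sketch below pairs an SVRG-style estimator with a naive gradient-descent solver for the cubic subproblem; both are simplifications assumed for illustration rather than the paper's exact construction.

```python
import numpy as np

def vr_gradient(x, x_ref, full_grad_ref, grad_i, batch):
    """SVRG-style variance-reduced gradient estimator (a simplified stand-in
    for the paper's semi-stochastic gradient, used here only as a sketch).

    grad_i(i, x) returns the gradient of the i-th component function at x;
    full_grad_ref is the full gradient at the reference point x_ref.
    """
    correction = np.mean([grad_i(i, x) - grad_i(i, x_ref) for i in batch], axis=0)
    return full_grad_ref + correction

def cubic_newton_step(g, H, M, n_steps=500, lr=1e-2):
    """Approximately minimize the cubic-regularized model
        m(h) = g.h + 0.5 * h.H.h + (M/6) * ||h||^3
    by plain gradient descent on m (a naive replacement for a real subsolver)."""
    h = np.zeros_like(g)
    for _ in range(n_steps):
        # gradient of the cubic term (M/6)*||h||^3 is 0.5 * M * ||h|| * h
        grad_m = g + H @ h + 0.5 * M * np.linalg.norm(h) * h
        h -= lr * grad_m
    return h
```

In an outer loop, the estimator `vr_gradient` would supply `g` (with `H` a sampled or exact Hessian), and the iterate would be updated by `x += cubic_newton_step(g, H, M)`.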