Recent advances in trust region algorithms

Y Yuan - Mathematical Programming, 2015 - Springer
Trust region methods are a class of numerical methods for optimization. Unlike line-search-type methods, where a line search is carried out in each iteration, trust region methods …
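For orientation (an illustration added here, not text from the survey), the classical trust-region step minimizes a quadratic model of f over a ball of radius \Delta_k and then updates the radius according to how well the model predicted the actual decrease:

\[
p_k \in \arg\min_{\|p\| \le \Delta_k} \; m_k(p) = f(x_k) + \nabla f(x_k)^\top p + \tfrac{1}{2}\, p^\top B_k p,
\qquad
\rho_k = \frac{f(x_k) - f(x_k + p_k)}{m_k(0) - m_k(p_k)},
\]

where the step is accepted and \Delta_k possibly enlarged when \rho_k is close to 1, and the step is rejected and \Delta_k shrunk when \rho_k is small; this radius update is what replaces the line search of line-search-type methods.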

[BOOK][B] An introduction to optimization on smooth manifolds

N Boumal - 2023 - books.google.com
Optimization on Riemannian manifolds, the result of smooth geometry and optimization merging into one elegant modern framework, spans many areas of science and engineering …
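A minimal sketch of the basic pattern this framework covers (my own example, not code from the book): Riemannian gradient descent on the unit sphere, with tangent-space projection and renormalization as the retraction.

```python
import numpy as np

def riemannian_gd_sphere(A, x0, step=0.05, iters=1000):
    """Minimize f(x) = x^T A x over the unit sphere by Riemannian gradient descent."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2 * A @ x                    # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
        x = x - step * rgrad                 # move in the ambient space
        x = x / np.linalg.norm(x)            # retract back onto the sphere
    return x

# The iterate approaches a unit eigenvector for the smallest eigenvalue of A.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M + M.T
x_min = riemannian_gd_sphere(A, rng.standard_normal(5))
```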

[BOOK][B] Lectures on convex optimization

Y Nesterov - 2018 - Springer
Second edition, published as volume 137 of the Springer Optimization and Its Applications series.

[BOOK][B] Nonlinear conjugate gradient methods for unconstrained optimization

N Andrei - 2020 - Springer
This book is on conjugate gradient methods for unconstrained optimization. The concept of
conjugacy was introduced by Magnus Hestenes and Garrett Birkhoff in 1936 in the context of …
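As background (a summary of standard definitions, not a quotation from the book), directions d_0, ..., d_{n-1} are conjugate with respect to a symmetric positive definite matrix A when d_i^\top A d_j = 0 for i \ne j; nonlinear conjugate gradient methods carry this idea to general smooth f by generating search directions

\[
d_{k+1} = -\nabla f(x_{k+1}) + \beta_k d_k,
\qquad
\beta_k^{\mathrm{FR}} = \frac{\|\nabla f(x_{k+1})\|^2}{\|\nabla f(x_k)\|^2},
\]

where the Fletcher–Reeves choice of \beta_k shown here is only one option; the alternative formulas (Polak–Ribière, Hestenes–Stiefel, Dai–Yuan, and hybrids) are what distinguish the many methods such a book catalogues.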

Newton-type methods for non-convex optimization under inexact Hessian information

P Xu, F Roosta, MW Mahoney - Mathematical Programming, 2020 - Springer
We consider variants of trust-region and adaptive cubic regularization methods for non-convex optimization, in which the Hessian matrix is approximated. Under certain conditions …
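The flavor of assumption used in such analyses (stated here as a generic condition for illustration, not quoted from the paper) is that the approximate Hessian H_k tracks the true one to a fixed accuracy at every iterate,

\[
\|H_k - \nabla^2 f(x_k)\| \le \epsilon,
\]

for instance with H_k obtained by subsampling the Hessian of a finite-sum objective, while the outer trust-region or cubic-regularization loop is left unchanged.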

Fast convergence to non-isolated minima: four equivalent conditions for functions

Q Rebjock, N Boumal - Mathematical Programming, 2024 - Springer
Optimization algorithms can see their local convergence rates deteriorate when the Hessian
at the optimum is singular. These singularities are inescapable when the optima are non …
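One classical condition in this family, given here only as an illustration of the genre and not as the paper's own list, is a local Polyak–Łojasiewicz inequality around the set of minimizers:

\[
f(x) - f^\star \le \frac{1}{2\mu}\, \|\nabla f(x)\|^2 ,
\]

which can hold with \mu > 0 near a continuum of non-isolated minimizers even though the Hessian is necessarily singular along that set.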

Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results

C Cartis, NIM Gould, PL Toint - Mathematical Programming, 2011 - Springer
An Adaptive Regularisation algorithm using Cubics (ARC) is proposed for
unconstrained optimization, generalizing at the same time an unpublished method due to …
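The cubic model at the heart of ARC, written here in its standard form rather than quoted from the paper, replaces the trust-region constraint with an adaptively weighted cubic penalty:

\[
m_k(s) = f(x_k) + \nabla f(x_k)^\top s + \tfrac{1}{2}\, s^\top B_k s + \tfrac{\sigma_k}{3}\, \|s\|^3 ,
\]

where B_k is an exact or approximate Hessian and \sigma_k > 0 acts roughly like the reciprocal of a trust-region radius: it is decreased after successful steps and increased after unsuccessful ones.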

Second-order optimization for non-convex machine learning: An empirical study

P Xu, F Roosta, MW Mahoney - Proceedings of the 2020 SIAM International …, 2020 - SIAM
While first-order optimization methods, such as SGD, are popular in machine learning (ML),
they come with well-known deficiencies, including relatively slow convergence, sensitivity to …

Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function-and derivative-evaluation complexity

C Cartis, NIM Gould, PL Toint - Mathematical Programming, 2011 - Springer
An Adaptive Regularisation framework using Cubics (ARC) was proposed for
unconstrained optimization and analysed in Cartis, Gould and Toint (Part I, Math Program …
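The headline bound from this line of work, with constants omitted and stated from the standard complexity literature rather than quoted, is that the cubic-regularization framework reaches an approximate first-order stationary point

\[
\|\nabla f(x_k)\| \le \epsilon \quad \text{within } O\!\left(\epsilon^{-3/2}\right) \text{ function and derivative evaluations},
\]

compared with the O(\epsilon^{-2}) worst-case bound for steepest descent and basic trust-region methods.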

Gradient descent finds the cubic-regularized nonconvex Newton step

Y Carmon, J Duchi - SIAM Journal on Optimization, 2019 - SIAM
We consider the minimization of a nonconvex quadratic form regularized by a cubic term,
which may exhibit saddle points and a suboptimal local minimum. Nonetheless, we prove …
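A minimal numerical sketch of the object being analysed (my own illustration; the paper's initialization and step-size choices are more careful than a fixed step from the origin):

```python
import numpy as np

def cubic_reg_gd(A, b, rho, step=1e-2, iters=5000):
    """Plain gradient descent on f(s) = b^T s + 0.5 s^T A s + (rho/3) ||s||^3.

    A is symmetric but may be indefinite, so f is non-convex; the gradient
    is b + A s + rho * ||s|| * s.
    """
    s = np.zeros_like(b)
    for _ in range(iters):
        grad = b + A @ s + rho * np.linalg.norm(s) * s
        s = s - step * grad
    return s

# Example with an indefinite A (one negative eigenvalue), as in the non-convex setting.
A = np.diag([2.0, 1.0, -1.0])
b = np.array([1.0, -0.5, 0.3])
s_star = cubic_reg_gd(A, b, rho=1.0)
```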