Recent advances in trust region algorithms
Y Yuan - Mathematical Programming, 2015 - Springer
Trust region methods are a class of numerical methods for optimization. Unlike line search
type methods, where a line search is carried out in each iteration, trust region methods …
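For orientation, the subproblem that defines a trust region step is standard across this literature (notation mine, not quoted from the survey): at iterate x_k one approximately solves

    \min_{\|s\| \le \Delta_k} \; m_k(s) = f(x_k) + \nabla f(x_k)^\top s + \tfrac{1}{2} s^\top B_k s,

where B_k is a (possibly approximate) Hessian and the radius \Delta_k is grown or shrunk depending on how well the model decrease m_k(0) - m_k(s_k) predicts the actual decrease f(x_k) - f(x_k + s_k).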
[BOOK][B] An introduction to optimization on smooth manifolds
N Boumal - 2023 - books.google.com
Optimization on Riemannian manifolds, the result of smooth geometry and optimization
merging into one elegant modern framework, spans many areas of science and engineering …
[BOOK][B] Lectures on convex optimization
Y Nesterov - 2018 - Springer
Lectures on Convex Optimization, Second Edition. Springer Optimization and Its Applications, vol. 137 …
[BOOK][B] Nonlinear conjugate gradient methods for unconstrained optimization
N Andrei - 2020 - Springer
This book is on conjugate gradient methods for unconstrained optimization. The concept of
conjugacy was introduced by Magnus Hestenes and Garrett Birkhoff in 1936 in the context of …
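For context, the standard definition behind the snippet (paraphrased, not quoted from the book): directions d_0, \dots, d_{n-1} are conjugate with respect to a symmetric positive definite matrix A if

    d_i^\top A d_j = 0 \quad \text{for all } i \ne j.

Minimizing the quadratic \tfrac{1}{2} x^\top A x - b^\top x along such directions in turn terminates in at most n exact line searches; nonlinear conjugate gradient methods carry this construction over to general smooth objectives.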
Newton-type methods for non-convex optimization under inexact Hessian information
We consider variants of trust-region and adaptive cubic regularization methods for non-
convex optimization, in which the Hessian matrix is approximated. Under certain conditions …
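A representative form of the Hessian-approximation condition used in this line of work (paraphrased; the exact constants and norms vary by paper) asks that the surrogate H_k satisfy

    \|H_k - \nabla^2 f(x_k)\| \le \epsilon

for a tolerance \epsilon > 0, which is weak enough to admit, for example, subsampled Hessians, yet strong enough for the trust-region and cubic-regularization variants to retain worst-case guarantees up to an accuracy floor set by \epsilon.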
Fast convergence to non-isolated minima: four equivalent conditions for functions
Optimization algorithms can see their local convergence rates deteriorate when the Hessian
at the optimum is singular. These singularities are inescapable when the optima are non-isolated …
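A minimal example of why the singularity is forced (mine, not from the paper): take f(x, y) = x^2, whose minimizers form the line \{(0, y) : y \in \mathbb{R}\}. Since f is constant along that line,

    \nabla^2 f(0, y) = \begin{pmatrix} 2 & 0 \\ 0 & 0 \end{pmatrix},

so the Hessian annihilates the tangent direction of the solution set; any non-isolated optimum exhibits the same degeneracy.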
Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results
Abstract An Adaptive Regularisation algorithm using Cubics (ARC) is proposed for
unconstrained optimization, generalizing at the same time an unpublished method due to …
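The cubic model at the core of ARC (standard form in this literature): each iteration approximately minimizes

    m_k(s) = f(x_k) + \nabla f(x_k)^\top s + \tfrac{1}{2} s^\top B_k s + \tfrac{\sigma_k}{3} \|s\|^3,

where B_k approximates the Hessian and the regularization weight \sigma_k is adapted across iterations in the role a trust region radius would otherwise play.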
Second-order optimization for non-convex machine learning: An empirical study
While first-order optimization methods such as SGD are popular in machine learning (ML),
they come with well-known deficiencies, including relatively slow convergence, sensitivity to …
Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity
Abstract An Adaptive Regularisation framework using Cubics (ARC) was proposed for
unconstrained optimization and analysed in Cartis, Gould and Toint (Part I, Math Program …
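The headline result of Part II, stated informally: under suitable smoothness assumptions (notably a Lipschitz continuous Hessian), a variant of ARC drives the gradient norm below \epsilon within

    O(\epsilon^{-3/2})

function and derivative evaluations, improving on the O(\epsilon^{-2}) worst-case bounds known for steepest descent and classical trust region methods.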
Gradient descent finds the cubic-regularized nonconvex Newton step
We consider the minimization of a nonconvex quadratic form regularized by a cubic term,
which may exhibit saddle points and a suboptimal local minimum. Nonetheless, we prove …
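A minimal sketch of the setting in this paper (illustrative only: the objective matches the abstract, but the initialization, step size, and variable names are my own choices, and plain fixed-step gradient descent stands in for the paper's precise scheme):

import numpy as np

def cubic_reg_gd(A, b, rho, lr=0.01, iters=20000):
    """Gradient descent on g(x) = 0.5*x'Ax + b'x + (rho/3)*||x||^3.

    grad g(x) = A x + b + rho * ||x|| * x.
    """
    x = np.zeros_like(b)  # simple initialization; the paper analyzes suitable starting points
    for _ in range(iters):
        grad = A @ x + b + rho * np.linalg.norm(x) * x
        x = x - lr * grad
    return x

# An indefinite A makes the unregularized quadratic unbounded below,
# yet the cubic term guarantees a global minimizer exists.
A = np.array([[1.0, 0.0], [0.0, -2.0]])
b = np.array([0.5, 0.5])
print(cubic_reg_gd(A, b, rho=1.0))

Despite the saddle points and suboptimal local minimum the abstract warns about, the paper's point is that this plain iteration still reaches the global minimizer (outside a degenerate hard case).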