Recent advances in trust region algorithms
Y Yuan - Mathematical Programming, 2015 - Springer
Trust region methods are a class of numerical methods for optimization. Unlike line search
type methods where a line search is carried out in each iteration, trust region methods …
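The contrast the snippet draws (trust region vs. line search) can be illustrated with a minimal trust-region loop using the simple Cauchy-point step. This is a hedged sketch, not the survey's algorithm: the function name `trust_region_minimize`, the acceptance threshold 0.1, and the radius-update constants are illustrative choices, and real solvers use more elaborate subproblem solvers (dogleg, Steihaug-CG).

```python
import numpy as np

def trust_region_minimize(f, grad, hess, x0, delta=1.0, tol=1e-8, max_iter=200):
    """Minimal trust-region loop (illustrative sketch, not a production solver).

    Instead of a line search along a fixed direction, each iteration minimizes
    a quadratic model of f inside a ball of radius delta, then adjusts delta
    based on how well the model predicted the actual decrease.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Cauchy point: minimize the quadratic model along -g within the ball.
        gHg = g @ H @ g
        tau = 1.0 if gHg <= 0 else min(1.0, gnorm**3 / (delta * gHg))
        p = -(tau * delta / gnorm) * g
        # Ratio of actual to predicted reduction decides acceptance and radius.
        pred = -(g @ p + 0.5 * p @ H @ p)
        rho = (f(x) - f(x + p)) / pred if pred > 0 else 0.0
        if rho > 0.1:          # accept the step
            x = x + p
        if rho < 0.25:         # poor model fit: shrink the region
            delta *= 0.25
        elif rho > 0.75:       # good fit: allow a larger region (capped)
            delta = min(2.0 * delta, 10.0)
    return x
```

On a convex quadratic such as f(x) = ||x - c||^2, the loop first takes boundary steps toward c and then, once the Cauchy step fits inside the region, lands on the minimizer.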
[BOOK][B] Nonlinear conjugate gradient methods for unconstrained optimization
N Andrei - 2020 - Springer
This book is on conjugate gradient methods for unconstrained optimization. The concept of
conjugacy was introduced by Magnus Hestenes and Garrett Birkhoff in 1936 in the context of …
Newton-type methods for non-convex optimization under inexact Hessian information
We consider variants of trust-region and adaptive cubic regularization methods for non-
convex optimization, in which the Hessian matrix is approximated. Under certain conditions …
Second-order optimization for non-convex machine learning: An empirical study
While first-order optimization methods, such as SGD are popular in machine learning (ML),
they come with well-known deficiencies, including relatively-slow convergence, sensitivity to …
Gradient descent finds the cubic-regularized nonconvex Newton step
We consider the minimization of a nonconvex quadratic form regularized by a cubic term,
which may exhibit saddle points and a suboptimal local minimum. Nonetheless, we prove …
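The objective described in the snippet is concrete enough to sketch: f(s) = b^T s + (1/2) s^T A s + (rho/3)||s||^3, where A may be indefinite, minimized by plain gradient descent. The helper name `cubic_reg_gd`, the matrix A, the vector b, and the step size below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def cubic_reg_gd(A, b, rho, step=0.05, iters=10_000):
    """Gradient descent on the cubic-regularized quadratic
    f(s) = b^T s + 0.5 * s^T A s + (rho/3) * ||s||^3  (illustrative sketch).

    The gradient of the cubic term (rho/3)||s||^3 is rho * ||s|| * s,
    so each step uses grad f(s) = b + A s + rho * ||s|| * s.
    """
    s = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        g = b + A @ s + rho * np.linalg.norm(s) * s
        s = s - step * g
    return s
```

A classical optimality condition for this subproblem is that the global minimizer s* satisfies (A + rho*||s*||*I) psd; with an indefinite A = diag(-1, 2) and rho = 1, the iterate returned above ends up with ||s|| > 1, so that certificate holds even though f has a saddle point.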
Distributed second order methods with fast rates and compressed communication
We develop several new communication-efficient second-order methods for distributed
optimization. Our first method, NEWTON-STAR, is a variant of Newton's method from which it …
[BOOK][B] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right.… In that …
QPLIB: a library of quadratic programming instances
This paper describes a new instance library for quadratic programming (QP), i.e., the family of
continuous and (mixed)-integer optimization problems where the objective function and/or …
Perseus: A simple and optimal high-order method for variational inequalities
This paper settles an open and challenging question pertaining to the design of simple and
optimal high-order methods for solving smooth and monotone variational inequalities (VIs) …
Stochastic subspace cubic Newton method
In this paper, we propose a new randomized second-order optimization algorithm—
Stochastic Subspace Cubic Newton (SSCN)—for minimizing a high dimensional convex …