Recent advances in trust region algorithms

Y Yuan - Mathematical Programming, 2015 - Springer
Trust region methods are a class of numerical methods for optimization. Unlike line-search-type
methods, where a line search is carried out in each iteration, trust region methods …
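For context on the snippet above: in the standard formulation (textbook notation, not quoted from this survey), each trust-region iteration minimizes a local model of the objective within a ball around the current iterate:

```latex
\min_{p \in \mathbb{R}^n} \; m_k(p) = f(x_k) + \nabla f(x_k)^{\top} p + \tfrac{1}{2}\, p^{\top} B_k p
\quad \text{s.t.} \quad \|p\| \le \Delta_k ,
```

where \(B_k\) approximates the Hessian and the radius \(\Delta_k\) is enlarged or shrunk depending on how well \(m_k\) predicted the actual reduction in \(f\).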

[BOOK][B] Nonlinear conjugate gradient methods for unconstrained optimization

N Andrei - 2020 - Springer
This book is on conjugate gradient methods for unconstrained optimization. The concept of
conjugacy was introduced by Magnus Hestenes and Garrett Birkhoff in 1936 in the context of …
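To illustrate the conjugacy concept this book builds on, here is a minimal sketch of the classic linear conjugate gradient iteration on a symmetric positive-definite quadratic (the function name and test values are illustrative, not taken from the book):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimize 0.5 x^T A x - b^T x (i.e. solve Ax = b) for SPD A.

    Successive search directions are A-conjugate: p_i^T A p_j = 0 for i != j,
    which in exact arithmetic gives termination in at most n steps.
    """
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x              # residual = negative gradient at x
    p = r.copy()               # first search direction is steepest descent
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next direction, A-conjugate to p
        rs_old = rs_new
    return x
```

The nonlinear methods the book surveys replace the exact step size with a line search and use formulas such as Fletcher–Reeves or Hestenes–Stiefel for the direction update.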

Newton-type methods for non-convex optimization under inexact Hessian information

P Xu, F Roosta, MW Mahoney - Mathematical Programming, 2020 - Springer
We consider variants of trust-region and adaptive cubic regularization methods for non-convex
optimization, in which the Hessian matrix is approximated. Under certain conditions …

Second-order optimization for non-convex machine learning: An empirical study

P Xu, F Roosta, MW Mahoney - Proceedings of the 2020 SIAM International …, 2020 - SIAM
While first-order optimization methods, such as SGD, are popular in machine learning (ML),
they come with well-known deficiencies, including relatively slow convergence, sensitivity to …

Gradient descent finds the cubic-regularized nonconvex Newton step

Y Carmon, J Duchi - SIAM Journal on Optimization, 2019 - SIAM
We consider the minimization of a nonconvex quadratic form regularized by a cubic term,
which may exhibit saddle points and a suboptimal local minimum. Nonetheless, we prove …
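The objective described in this abstract has the standard cubic-regularization form \(f(x) = \tfrac{1}{2} x^{\top} A x + b^{\top} x + \tfrac{\rho}{3}\|x\|^3\). A minimal gradient-descent sketch on such an objective (illustrative step size and parameters, not the authors' code or their initialization scheme):

```python
import numpy as np

def cubic_reg_gd(A, b, rho, step=0.05, iters=20000, tol=1e-6):
    """Gradient descent on f(x) = 0.5 x^T A x + b^T x + (rho/3) ||x||^3.

    A may be indefinite, so f can have saddle points, but the cubic
    term makes f coercive; small-step GD reaches a stationary point.
    """
    x = np.zeros(len(b))
    for _ in range(iters):
        g = A @ x + b + rho * np.linalg.norm(x) * x  # gradient of f
        if np.linalg.norm(g) < tol:
            break
        x -= step * g
    return x
```

The paper's result is stronger than this sketch suggests: despite the nonconvexity, gradient descent (suitably initialized) converges to the *global* minimizer of this subproblem.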

Distributed second order methods with fast rates and compressed communication

R Islamov, X Qian, P Richtárik - International conference on …, 2021 - proceedings.mlr.press
We develop several new communication-efficient second-order methods for distributed
optimization. Our first method, NEWTON-STAR, is a variant of Newton's method from which it …

[BOOK][B] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives

C Cartis, NIM Gould, PL Toint - 2022 - SIAM
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right.… In that …

QPLIB: a library of quadratic programming instances

F Furini, E Traversi, P Belotti, A Frangioni… - Mathematical …, 2019 - Springer
This paper describes a new instance library for quadratic programming (QP), i.e., the family of
continuous and (mixed)-integer optimization problems where the objective function and/or …

Perseus: A simple and optimal high-order method for variational inequalities

T Lin, MI Jordan - Mathematical Programming, 2024 - Springer
This paper settles an open and challenging question pertaining to the design of simple and
optimal high-order methods for solving smooth and monotone variational inequalities (VIs) …
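For context, the smooth and monotone VI problem mentioned in the abstract is, in its standard form (textbook definition, not quoted from the paper):

```latex
\text{find } x^{\star} \in \mathcal{X} \ \text{ such that } \
\langle F(x^{\star}),\, x - x^{\star} \rangle \ge 0 \quad \text{for all } x \in \mathcal{X},
```

where \(F\) is monotone, i.e. \(\langle F(x) - F(y),\, x - y \rangle \ge 0\) for all \(x, y \in \mathcal{X}\), and smoothness refers to Lipschitz continuity of the (higher-order) derivatives of \(F\) exploited by high-order methods.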

Stochastic subspace cubic Newton method

F Hanzely, N Doikov, Y Nesterov… - … on Machine Learning, 2020 - proceedings.mlr.press
In this paper, we propose a new randomized second-order optimization algorithm—
Stochastic Subspace Cubic Newton (SSCN)—for minimizing a high dimensional convex …