Randomized subspace regularized Newton method for unconstrained non-convex optimization
T Fuji, PL Poirion, A Takeda - arXiv preprint arXiv:2209.04170, 2022 - arxiv.org
While there already exist randomized subspace Newton methods that restrict the search
direction to a random subspace for a convex function, we propose a randomized subspace …
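The idea named in this abstract — restricting the Newton step to a randomly drawn low-dimensional subspace — can be illustrated on a convex quadratic. Everything below (problem data, subspace dimension `s`, the small damping term) is an illustrative choice, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
d, s = 100, 10                        # ambient and subspace dimensions (illustrative)

# Convex quadratic test problem: f(x) = 0.5 x^T A x - b^T x
M = rng.standard_normal((d, d))
A = M @ M.T / d + np.eye(d)           # positive definite Hessian
b = rng.standard_normal(d)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(d)
for _ in range(50):
    P = rng.standard_normal((d, s)) / np.sqrt(s)   # random subspace basis
    g_s = P.T @ grad(x)                            # projected gradient (s-vector)
    H_s = P.T @ A @ P                              # projected Hessian (s x s)
    # damped Newton step computed entirely inside the subspace
    step = np.linalg.solve(H_s + 1e-8 * np.eye(s), -g_s)
    x = x + P @ step
```

Each iteration only factors an `s × s` system instead of a `d × d` one, which is the cost saving these methods trade against extra iterations.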
A globally convergent gradient method with momentum
In this work, we consider smooth unconstrained optimization problems and we deal with the
class of gradient methods with momentum, i.e., descent algorithms where the search direction …
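The classical heavy-ball (Polyak momentum) iteration is a representative member of this class; the globalization safeguards the abstract refers to are omitted in this minimal sketch, and the step size and momentum weight are hand-tuned for the toy problem:

```python
import numpy as np

# Ill-conditioned convex quadratic: f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0])
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b

alpha, beta = 0.05, 0.9               # step size and momentum weight (hand-tuned)
x_prev = x = np.zeros(2)
for _ in range(500):
    # x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1})
    x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x

x_star = np.linalg.solve(A, b)        # exact minimizer for comparison
```

The momentum term `beta * (x - x_prev)` lets the iterate keep moving along slowly varying directions, which is what speeds this family up on ill-conditioned problems.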
Subspace Quasi-Newton Method with Gradient Approximation
T Miyaishi, R Nozawa, PL Poirion, A Takeda - arXiv preprint arXiv …, 2024 - arxiv.org
In recent years, various subspace algorithms have been developed to handle large-scale
optimization problems. Although existing subspace Newton methods require fewer iterations …
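The "gradient approximation" in the title can be read as estimating only the subspace-projected gradient by finite differences, which needs `s + 1` function evaluations instead of `d + 1`. The sketch below replaces the paper's quasi-Newton model with a plain gradient step in the subspace, so it shows the approximation idea only; the objective and all constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
d, s, h = 50, 5, 1e-6                 # dimension, subspace size, FD step (illustrative)

# Smooth convex test objective; only function values are used below.
f = lambda x: np.sum((x - 1.0) ** 2) + 0.1 * np.sum(x ** 4)

x = np.zeros(d)
for _ in range(200):
    P = rng.standard_normal((d, s)) / np.sqrt(s)   # random subspace basis
    fx = f(x)
    # Forward differences approximate the directional derivatives P^T grad f(x)
    # with s extra function evaluations, no analytic gradient required.
    g_s = np.array([(f(x + h * P[:, j]) - fx) / h for j in range(s)])
    x = x - 0.02 * (P @ g_s)                       # descent step within span(P)
```

A quasi-Newton variant would additionally maintain a curvature model of the `s × s` subspace Hessian; that machinery is omitted here.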
Limited-memory Common-directions Method With Subsampled Newton Directions for Large-scale Linear Classification
The common-directions method is an optimization method recently proposed to utilize
second-order information. It is especially efficient on large-scale linear classification …
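One simplified reading of a limited-memory common-directions scheme: keep a small pool of recent directions (here, the last few gradients) and exactly minimize a second-order model over their span. The quadratic below stands in for a regularized linear-classification loss; the memory size and damping are illustrative, and the paper's subsampled Newton directions are not modeled:

```python
import numpy as np

rng = np.random.default_rng(3)
d, m = 40, 5                          # problem size, number of stored directions

# Convex quadratic stand-in for a regularized linear-classification objective.
Mx = rng.standard_normal((d, d))
A = Mx @ Mx.T / d + np.eye(d)
b = rng.standard_normal(d)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x = np.zeros(d)
dirs = []
for _ in range(30):
    g = grad(x)
    dirs.append(g)
    dirs = dirs[-m:]                  # limited memory: keep only the last m directions
    P = np.stack(dirs, axis=1)        # d x m matrix of common directions
    # Minimize the quadratic model over span(P): (P^T A P) t = -P^T g.
    S = P.T @ A @ P
    t = np.linalg.solve(S + 1e-10 * np.eye(S.shape[0]), -P.T @ g)
    x = x + P @ t
```

The inner solve is over an `m × m` system, so the per-iteration cost beyond gradient evaluation stays small regardless of `d`.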
A Novel Fast Exact Subproblem Solver for Stochastic Quasi-Newton Cubic Regularized Optimization
J Forristal, J Griffin, W Zhou, S Yektamaram - arXiv preprint arXiv …, 2022 - arxiv.org
In this work we describe an Adaptive Regularization using Cubics (ARC) method for large-
scale nonconvex unconstrained optimization using Limited-memory Quasi-Newton (LQN) …
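The ARC subproblem min_s gᵀs + ½ sᵀHs + (σ/3)‖s‖³ can be solved exactly via a one-dimensional root-find once H is eigendecomposed: the minimizer satisfies (H + θI)s = -g with θ = σ‖s‖ and θ ≥ -λ_min(H). The dense-matrix bisection below illustrates that principle only; it is not the paper's limited-memory quasi-Newton solver, and it skips the degenerate "hard case":

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
M = rng.standard_normal((n, n))
H = (M + M.T) / 2                     # symmetric, possibly indefinite Hessian
g = rng.standard_normal(n)
sigma = 1.0                           # cubic regularization weight (illustrative)

lam, Q = np.linalg.eigh(H)            # eigenvalues ascending, lam[0] = lambda_min
gq = Q.T @ g

def norm_s(theta):
    # ||s(theta)|| for s(theta) = -(H + theta I)^{-1} g, via the eigenbasis
    return np.linalg.norm(gq / (lam + theta))

# phi(theta) = sigma * ||s(theta)|| - theta is strictly decreasing on the
# admissible interval, so bisection finds its unique root.
lo = max(0.0, -lam[0]) + 1e-10
hi = lo + 1.0
while sigma * norm_s(hi) > hi:        # grow the bracket until phi(hi) <= 0
    hi *= 2.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if sigma * norm_s(mid) > mid:
        lo = mid
    else:
        hi = mid
theta = 0.5 * (lo + hi)
s = Q @ (-gq / (lam + theta))         # exact cubic-regularized step
```

At the root, the secular condition θ = σ‖s‖ holds, and the step achieves a negative model value whenever g ≠ 0.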