RSN: randomized subspace Newton
We develop a randomized Newton method capable of solving learning problems with huge
dimensional feature spaces, which is a common setting in applications such as medical …
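The update the title refers to can be illustrated with a small sketch: restrict the Newton system to a random low-dimensional subspace so that only a small linear system is solved per iteration. The snippet below is a minimal rendering of that idea in plain NumPy, not the authors' implementation; the function name rsn_step, the Gaussian sketch matrix, and the toy quadratic objective are all illustrative assumptions.

import numpy as np

def rsn_step(x, grad, hess, sketch_dim, rng):
    """One randomized-subspace Newton step (illustrative sketch).

    Restricts the Newton system to a random subspace spanned by the columns
    of a Gaussian sketch matrix S, so only a sketch_dim x sketch_dim system
    is solved instead of a full n x n one.
    """
    n = x.size
    S = rng.standard_normal((n, sketch_dim))     # random subspace basis (assumption: Gaussian sketch)
    g = grad(x)                                  # full gradient
    reduced_H = S.T @ (hess(x) @ S)              # sketch_dim x sketch_dim reduced Hessian
    lam = np.linalg.solve(reduced_H, -S.T @ g)   # Newton direction inside the subspace
    return x + S @ lam

# Toy usage: strongly convex quadratic f(x) = 0.5*x'Ax - b'x (stand-in for a learning objective).
rng = np.random.default_rng(0)
n = 500
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
x = np.zeros(n)
for _ in range(200):
    x = rsn_step(x, lambda v: A @ v - b, lambda v: A, sketch_dim=20, rng=rng)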
SDNA: Stochastic dual Newton ascent for empirical risk minimization
We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual
Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random …
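To make the truncated description concrete, here is a minimal sketch of the core idea, an exact Newton update over a random block of coordinates, shown on a plain quadratic rather than the paper's dual of a regularized ERM problem; the function name and toy problem are assumptions for illustration.

import numpy as np

def block_newton_step(x, A, b, block_size, rng):
    """Exact Newton update over a random coordinate block (illustrative sketch).

    For f(x) = 0.5*x'Ax - b'x, samples a random block S of coordinates and
    minimizes f exactly over that block by solving the small system
    A[S,S] d = (b - Ax)[S].
    """
    n = x.size
    S = rng.choice(n, size=block_size, replace=False)
    residual = b - A @ x
    d = np.linalg.solve(A[np.ix_(S, S)], residual[S])
    x = x.copy()
    x[S] += d
    return x

# Toy usage on a random symmetric positive definite system.
rng = np.random.default_rng(1)
n = 200
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
x = np.zeros(n)
for _ in range(500):
    x = block_newton_step(x, A, b, block_size=10, rng=rng)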
Stochastic dual coordinate ascent with adaptive probabilities
D Csiba, Z Qu, P Richtárik - International Conference on …, 2015 - proceedings.mlr.press
This paper introduces AdaSDCA: an adaptive variant of stochastic dual coordinate ascent
(SDCA) for solving the regularized empirical risk minimization problems. Our modification …
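As a rough illustration of coordinate updates with adaptive, non-uniform sampling probabilities (a generic stand-in, not the authors' duality-gap-based rule, and applied to a primal least-squares toy problem rather than the SDCA dual):

import numpy as np

def adaptive_coordinate_step(x, A, b, col_norms_sq, rng):
    """Coordinate descent with adaptive sampling probabilities (illustrative sketch).

    Samples a coordinate with probability proportional to the magnitude of its
    partial gradient of 0.5*||Ax - b||^2, then takes an exact coordinate
    minimization step. The adaptive score is an assumption for illustration.
    """
    g = A.T @ (A @ x - b)                    # full gradient, used only to form the adaptive scores
    scores = np.abs(g)
    i = rng.choice(x.size, p=scores / scores.sum())
    x = x.copy()
    x[i] -= g[i] / col_norms_sq[i]           # exact minimization along coordinate i
    return x

# Toy usage on a random least-squares problem.
rng = np.random.default_rng(5)
m, n = 300, 80
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
col_norms_sq = (A * A).sum(axis=0)
x = np.zeros(n)
for _ in range(2000):
    x = adaptive_coordinate_step(x, A, b, col_norms_sq, rng)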
Let's Make Block Coordinate Descent Converge Faster: Faster Greedy Rules, Message-Passing, Active-Set Complexity, and Superlinear Convergence
Block coordinate descent (BCD) methods are widely used for large-scale numerical
optimization because of their cheap iteration costs, low memory requirements, amenability to …
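One of the greedy rules in the title can be sketched generically: pick the block whose partial gradient has the largest norm and update only those coordinates. The snippet below shows that selection rule on a least-squares toy objective; the block partition, step size, and function name are assumptions, and the paper's refined rules and superlinear local steps are not reproduced here.

import numpy as np

def greedy_block_gradient_step(x, grad, blocks, step_size):
    """Gauss-Southwell style greedy block step (illustrative sketch).

    Picks the block whose partial gradient has the largest norm and takes a
    gradient step only on those coordinates.
    """
    g = grad(x)
    norms = [np.linalg.norm(g[blk]) for blk in blocks]
    best = int(np.argmax(norms))             # greedy rule: block with largest gradient norm
    x = x.copy()
    x[blocks[best]] -= step_size * g[blocks[best]]
    return x

# Toy usage: least-squares objective split into 10 contiguous blocks.
rng = np.random.default_rng(2)
n, m = 100, 300
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
grad = lambda v: A.T @ (A @ v - b)
blocks = np.array_split(np.arange(n), 10)
x = np.zeros(n)
step = 1.0 / np.linalg.norm(A, 2) ** 2       # conservative 1/L step for the full problem
for _ in range(1000):
    x = greedy_block_gradient_step(x, grad, blocks, step)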
Globalized inexact proximal Newton-type methods for nonconvex composite functions
C Kanzow, T Lechner - Computational Optimization and Applications, 2021 - Springer
Optimization problems with composite functions consist of an objective function which is the
sum of a smooth and a (convex) nonsmooth term. This particular structure is exploited by the …
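The structure described, a smooth term plus a convex nonsmooth term, is exploited through the proximal operator of the nonsmooth part. As a minimal first-order sketch (the paper studies proximal Newton-type methods, which replace the gradient step with a second-order model but reuse the same prox machinery), here is a proximal gradient step for an L1-regularized least-squares toy problem; the names and toy data are assumptions.

import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_step(x, grad_smooth, step, lam):
    """One proximal-gradient step for f(x) = smooth(x) + lam*||x||_1.

    A first-order stand-in for the composite structure the abstract describes;
    not the proximal Newton method of the paper.
    """
    return soft_threshold(x - step * grad_smooth(x), step * lam)

# Toy usage: LASSO, smooth part 0.5*||Ax - b||^2.
rng = np.random.default_rng(3)
m, n = 60, 200
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(n)
for _ in range(500):
    x = proximal_gradient_step(x, lambda v: A.T @ (A @ v - b), step, lam)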
[BOOK][B] Lyapunov arguments in optimization
A Wilson - 2018 - search.proquest.com
Optimization is among the richest modeling languages in science. In statistics and machine
learning, for instance, inference is typically posed as an optimization problem. While there …
A Fast Active Set Block Coordinate Descent Algorithm for ℓ1-Regularized Least Squares
The problem of finding sparse solutions to underdetermined systems of linear equations
arises in several applications (eg, signal and image processing, compressive sensing …
Convergence analysis of inexact randomized iterative methods
N Loizou, P Richtárik - SIAM Journal on Scientific Computing, 2020 - SIAM
In this paper we present a convergence rate analysis of inexact variants of several
randomized iterative methods for solving three closely related problems: a convex stochastic …
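A rough sketch of the kind of method analysed here: a block Kaczmarz / sketch-and-project step for a linear system in which the small inner subproblem is solved only approximately by a few conjugate-gradient iterations, which makes the overall iteration inexact. The function name, block sampling, and toy system are assumptions; this is an illustrative member of the class, not the paper's exact setup.

import numpy as np

def inexact_block_kaczmarz_step(x, A, b, block_size, inner_iters, rng):
    """One inexact block Kaczmarz / sketch-and-project step (illustrative sketch).

    Samples a random block of rows of Ax = b and projects x onto their solution
    set, but solves the small inner system (A_S A_S^T) y = r_S only approximately
    with a fixed number of conjugate-gradient iterations.
    """
    m = A.shape[0]
    S = rng.choice(m, size=block_size, replace=False)
    A_S, r_S = A[S], b[S] - A[S] @ x
    G = A_S @ A_S.T
    y = np.zeros(block_size)
    res = r_S - G @ y
    p = res.copy()
    for _ in range(inner_iters):               # truncated CG = inexact inner solve
        rs = res @ res
        if rs < 1e-30:
            break
        Gp = G @ p
        alpha = rs / (p @ Gp)
        y += alpha * p
        res = res - alpha * Gp
        p = res + ((res @ res) / rs) * p
    return x + A_S.T @ y

# Toy usage on a consistent random system.
rng = np.random.default_rng(4)
m, n = 400, 100
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n)
x = np.zeros(n)
for _ in range(300):
    x = inexact_block_kaczmarz_step(x, A, b, block_size=20, inner_iters=3, rng=rng)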
Efficient regularized proximal quasi-Newton methods for large-scale nonconvex composite optimization problems
C Kanzow, T Lechner - arXiv preprint arXiv:2210.07644, 2022 - arxiv.org
Optimization problems with composite functions consist of an objective function which is the
sum of a smooth and a (convex) nonsmooth term. This particular structure is exploited by the …