A simple stochastic variance reduced algorithm with fast convergence rates
Recent years have witnessed exciting progress in the study of stochastic variance reduced
gradient methods (e.g., SVRG, SAGA), their accelerated variants (e.g., Katyusha) and their …
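The variance reduction idea behind SVRG, which several of the entries below build on, combines a stochastic gradient with a correction term evaluated at a periodically refreshed snapshot point. A minimal illustrative sketch (not any specific paper's algorithm; the names `grad_i`, `lr`, `epochs`, and `inner` are assumptions for this example):

```python
import numpy as np

def svrg(grad_i, w0, n, lr=0.05, epochs=30, inner=None, seed=0):
    """Minimal SVRG sketch (illustrative, not from any cited paper).

    grad_i(w, i): gradient of the i-th component function at w.
    n: number of component functions; inner: inner-loop length (default n).
    """
    rng = np.random.default_rng(seed)
    m = inner if inner is not None else n
    w = np.asarray(w0, dtype=float).copy()
    for _ in range(epochs):
        w_snap = w.copy()  # snapshot point
        # Full gradient at the snapshot, reused by every inner step.
        full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced estimate: unbiased, with variance that
            # shrinks as w approaches the snapshot.
            g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
            w -= lr * g
    return w
```

On a realizable least-squares problem, for instance, `grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]` recovers the true weights with a constant step size, which plain SGD cannot do without decaying the learning rate.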
VR-SGD: A simple stochastic variance reduction method for machine learning
In this paper, we propose a simple variant of the original SVRG, called variance reduced
stochastic gradient descent (VR-SGD). Unlike the choices of snapshot and starting points in …
Fastest rates for stochastic mirror descent methods
F Hanzely, P Richtárik - Computational Optimization and Applications, 2021 - Springer
Relative smoothness—a notion introduced in Birnbaum et al. (Proceedings of the 12th ACM
conference on electronic commerce, ACM, pp 127–136, 2011) and recently rediscovered in …
Accelerated variance reduced stochastic ADMM
Recently, many variance reduced stochastic alternating direction method of multipliers
(ADMM) methods (e.g., SAG-ADMM, SDCA-ADMM and SVRG-ADMM) have made exciting …
Accelerated variance reduction stochastic ADMM for large-scale machine learning
Recently, many stochastic variance reduced alternating direction methods of multipliers
(ADMMs) (e.g., SAG-ADMM and SVRG-ADMM) have made exciting progress such as linear …
ASVRG: Accelerated proximal SVRG
This paper proposes an accelerated proximal stochastic variance reduced gradient
(ASVRG) method, in which we design a simple and effective momentum acceleration trick …
Fast stochastic variance reduced admm for stochastic composition optimization
We consider the stochastic composition optimization problem proposed in
\cite{wang2017stochastic}, which has applications ranging from estimation to statistical and …
An inexact primal-dual smoothing framework for large-scale non-bilinear saddle point problems
We develop an inexact primal-dual first-order smoothing framework to solve a class of
non-bilinear saddle point problems with primal strong convexity. Compared with existing …
Exploring fast and communication-efficient algorithms in large-scale distributed networks
Communication overhead has become a significant bottleneck in data-parallel networks
as model sizes and data samples grow. In this work, we propose a new …
Fast stochastic variance reduced gradient method with momentum acceleration for machine learning
Recently, research on accelerated stochastic gradient descent methods (e.g., SVRG) has
made exciting progress (e.g., linear convergence for strongly convex problems). However, the …