SGD: General analysis and improved rates
We propose a general yet simple theorem describing the convergence of SGD under the
arbitrary sampling paradigm. Our theorem describes the convergence of an infinite array of …
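For reference, a minimal sketch of minibatch SGD on a finite-sum objective f(x) = (1/n) sum_i f_i(x), with the minibatch drawn by a pluggable sampling rule; the function names and the uniform without-replacement sampler are illustrative assumptions, not the paper's arbitrary sampling setup.

import numpy as np

def sgd(grad_fns, x0, stepsize=0.1, iters=1000, batch_size=1, rng=None):
    """Minibatch SGD on f(x) = (1/n) * sum_i f_i(x).

    grad_fns : list of callables, grad_fns[i](x) returns the gradient of f_i at x.
    """
    rng = rng or np.random.default_rng(0)
    n, x = len(grad_fns), np.array(x0, dtype=float)
    for _ in range(iters):
        batch = rng.choice(n, size=batch_size, replace=False)  # uniform sampling; other schemes plug in here
        g = sum(grad_fns[i](x) for i in batch) / batch_size     # minibatch gradient estimate
        x = x - stepsize * g                                    # gradient step
    return x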
IntraTomo: self-supervised learning-based tomography via sinogram synthesis and prediction
We propose IntraTomo, a powerful framework that combines the benefits of learning-based
and model-based approaches for solving highly ill-posed inverse problems in the Computed …
Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
N Loizou, P Richtárik - Computational Optimization and Applications, 2020 - Springer
In this paper we study several classes of stochastic optimization algorithms enriched with
heavy ball momentum. Among the methods studied are: stochastic gradient descent …
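For concreteness, a minimal sketch of stochastic gradient descent with heavy ball momentum, x_{k+1} = x_k - eta * g_k + beta * (x_k - x_{k-1}); the constants and the uniform single-sample selection are illustrative assumptions, not choices from the paper.

import numpy as np

def sgd_heavy_ball(grad_fns, x0, stepsize=0.05, beta=0.9, iters=1000, rng=None):
    """SGD with heavy ball momentum on f(x) = (1/n) * sum_i f_i(x)."""
    rng = rng or np.random.default_rng(0)
    x_prev = x = np.array(x0, dtype=float)
    for _ in range(iters):
        i = rng.integers(len(grad_fns))          # sample one component uniformly
        g = grad_fns[i](x)                       # stochastic gradient at the current point
        x, x_prev = x - stepsize * g + beta * (x - x_prev), x   # heavy ball update
    return x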
Convex optimization algorithms in medical image reconstruction—in the age of AI
The past decade has seen the rapid growth of model based image reconstruction (MBIR)
algorithms, which are often applications or adaptations of convex optimization algorithms …
ProxSARAH: An efficient algorithmic framework for stochastic composite nonconvex optimization
We propose a new stochastic first-order algorithmic framework to solve stochastic composite
nonconvex optimization problems that covers both finite-sum and expectation settings. Our …
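A rough sketch of a SARAH-style proximal gradient loop for the composite problem min_x (1/n) sum_i f_i(x) + g(x): a full gradient anchors each outer loop, inner iterations update the recursive SARAH estimator, and a proximal step handles the nonsmooth term. This is a generic sketch under those assumptions, not the ProxSARAH method itself (which additionally averages iterates and prescribes specific step sizes).

import numpy as np

def prox_sarah_sketch(grad_fns, prox_g, x0, stepsize=0.05, epochs=10, inner=50, rng=None):
    """Generic proximal SARAH-style loop for min_x (1/n) sum_i f_i(x) + g(x).

    prox_g(v, eta) should return argmin_u g(u) + ||u - v||^2 / (2 * eta).
    """
    rng = rng or np.random.default_rng(0)
    n, x = len(grad_fns), np.array(x0, dtype=float)
    for _ in range(epochs):
        v = sum(f(x) for f in grad_fns) / n                 # full gradient at the epoch start
        x_prev = x
        x = prox_g(x - stepsize * v, stepsize)
        for _ in range(inner):
            i = rng.integers(n)
            v = grad_fns[i](x) - grad_fns[i](x_prev) + v    # recursive SARAH estimator
            x_prev = x
            x = prox_g(x - stepsize * v, stepsize)          # proximal step on the nonsmooth term
    return x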
Practical large-scale linear programming using primal-dual hybrid gradient
We present PDLP, a practical first-order method for linear programming (LP) that can solve
to the high levels of accuracy that are expected in traditional LP applications. In addition, it …
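The primal-dual hybrid gradient (PDHG) iteration at the core of PDLP can be sketched for an equality-form LP, min c^T x subject to Ax = b, x >= 0; PDLP itself adds presolve, preconditioning, restarts and other enhancements that this plain sketch omits.

import numpy as np

def pdhg_lp(c, A, b, iters=5000, tau=None, sigma=None):
    """Basic PDHG for min c^T x  s.t.  Ax = b, x >= 0."""
    m, n = A.shape
    x, y = np.zeros(n), np.zeros(m)
    norm_A = np.linalg.norm(A, 2)          # step sizes chosen so that tau * sigma * ||A||^2 < 1
    tau = tau or 0.9 / norm_A
    sigma = sigma or 0.9 / norm_A
    for _ in range(iters):
        x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)   # projected primal step
        y = y + sigma * (b - A @ (2 * x_new - x))          # dual step with extrapolation
        x = x_new
    return x, y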
Proximal splitting algorithms for convex optimization: A tour of recent advances, with new twists
Convex nonsmooth optimization problems, whose solutions live in very high dimensional
spaces, have become ubiquitous. To solve them, the class of first-order algorithms known as …
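As one representative of this class, a minimal sketch of forward-backward (proximal gradient) splitting for min_x f(x) + g(x) with f smooth and g proximable, instantiated with an l1 penalty whose prox is soft-thresholding; the LASSO example data are synthetic and purely illustrative.

import numpy as np

def soft_threshold(v, t):
    """Prox of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, lam, x0, stepsize, iters=500):
    """Forward-backward splitting for min_x f(x) + lam * ||x||_1."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = soft_threshold(x - stepsize * grad_f(x), stepsize * lam)  # forward step, then prox
    return x

# Example: LASSO with f(x) = 0.5 * ||Ax - b||^2 on random data
rng = np.random.default_rng(0)
A, b = rng.standard_normal((50, 100)), rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_hat = proximal_gradient(lambda x: A.T @ (A @ x - b), lam=0.1, x0=np.zeros(100), stepsize=step)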
Sharper rates for separable minimax and finite sum optimization via primal-dual extragradient methods
We design accelerated algorithms with improved rates for several fundamental classes of
optimization problems. Our algorithms all build upon techniques related to the analysis of …
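A plain extragradient step for a smooth convex-concave saddle point min_x max_y f(x, y) shows the basic building block; the accelerated primal-dual variants analyzed in the paper layer further structure on top of it, which this sketch does not reproduce.

import numpy as np

def extragradient(grad_x, grad_y, x0, y0, stepsize=0.1, iters=1000):
    """Extragradient for min_x max_y f(x, y): probe at a half step, then update from it."""
    x, y = np.array(x0, dtype=float), np.array(y0, dtype=float)
    for _ in range(iters):
        x_half = x - stepsize * grad_x(x, y)          # descent half step in x
        y_half = y + stepsize * grad_y(x, y)          # ascent half step in y
        x = x - stepsize * grad_x(x_half, y_half)     # full step using half-step gradients
        y = y + stepsize * grad_y(x_half, y_half)
    return x, y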
Core Imaging Library - Part I: a versatile Python framework for tomographic imaging
JS Jørgensen, E Ametova, G Burca… - … of the Royal …, 2021 - royalsocietypublishing.org
We present the Core Imaging Library (CIL), an open-source Python framework for
tomographic imaging with particular emphasis on reconstruction of challenging datasets …
An inexact accelerated stochastic ADMM for separable convex optimization
An inexact accelerated stochastic Alternating Direction Method of Multipliers (AS-ADMM)
scheme is developed for solving structured separable convex optimization problems with …
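For context, a sketch of classical (exact, deterministic) ADMM for a separable problem min f(x) + g(z) subject to Ax + Bz = c; the paper's AS-ADMM replaces exact subproblem solves with inexact stochastic steps and adds acceleration, which are not reproduced here. The subproblem solvers argmin_x and argmin_z are user-supplied callables, an assumption of this sketch.

import numpy as np

def admm(argmin_x, argmin_z, A, B, c, z0, y0, rho=1.0, iters=200):
    """Classical ADMM for min f(x) + g(z)  s.t.  Ax + Bz = c.

    argmin_x(z, y) and argmin_z(x, y) solve the respective augmented-Lagrangian subproblems.
    """
    z, y = np.array(z0, dtype=float), np.array(y0, dtype=float)
    for _ in range(iters):
        x = argmin_x(z, y)                 # x-update: minimize L_rho over x with z, y fixed
        z = argmin_z(x, y)                 # z-update: minimize L_rho over z with x, y fixed
        y = y + rho * (A @ x + B @ z - c)  # dual ascent on the multiplier
    return x, z, y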