SGD: General analysis and improved rates

RM Gower, N Loizou, X Qian… - International …, 2019 - proceedings.mlr.press
We propose a general yet simple theorem describing the convergence of SGD under the
arbitrary sampling paradigm. Our theorem describes the convergence of an infinite array of …
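
As a point of reference for the kind of method analysed here, below is a minimal sketch of plain minibatch SGD on a toy finite-sum least-squares problem; uniform single-element sampling is just one instance of the arbitrary sampling paradigm the theorem covers, and the problem, step size and iteration count are illustrative assumptions, not the paper's setting.

```python
# Minimal SGD sketch (not the paper's analysis): f(x) = (1/n) * sum_i (a_i @ x - b_i)^2,
# one uniformly sampled index per step.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.01 * rng.standard_normal(n)

x = np.zeros(d)
step = 0.01
for t in range(5000):
    i = rng.integers(n)                      # uniform single-element sampling
    grad_i = 2.0 * (A[i] @ x - b[i]) * A[i]  # stochastic gradient of f_i
    x -= step * grad_i

print("final residual:", np.linalg.norm(A @ x - b) / np.sqrt(n))
```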

IntraTomo: self-supervised learning-based tomography via sinogram synthesis and prediction

G Zang, R Idoughi, R Li, P Wonka… - Proceedings of the …, 2021 - openaccess.thecvf.com
We propose IntraTomo, a powerful framework that combines the benefits of learning-based
and model-based approaches for solving highly ill-posed inverse problems in the Computed …


Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods

N Loizou, P Richtárik - Computational Optimization and Applications, 2020 - Springer
In this paper we study several classes of stochastic optimization algorithms enriched with
heavy ball momentum. Among the methods studied are: stochastic gradient descent …
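
A hedged sketch of the simplest method in this family, stochastic gradient descent with heavy ball momentum, is given below on a toy quadratic; the update is x_{k+1} = x_k - step * g_k + beta * (x_k - x_{k-1}), and the step size and momentum parameter are illustrative choices rather than the paper's tuned values.

```python
# SGD with heavy ball momentum on a toy finite-sum least-squares problem.
import numpy as np

rng = np.random.default_rng(1)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)

x, x_prev = np.zeros(d), np.zeros(d)
step, beta = 0.01, 0.9
for t in range(5000):
    i = rng.integers(n)
    g = 2.0 * (A[i] @ x - b[i]) * A[i]              # stochastic gradient
    x, x_prev = x - step * g + beta * (x - x_prev), x  # heavy ball update
print("residual:", np.linalg.norm(A @ x - b) / np.sqrt(n))
```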

Convex optimization algorithms in medical image reconstruction—in the age of AI

J Xu, F Noo - Physics in Medicine & Biology, 2022 - iopscience.iop.org
The past decade has seen the rapid growth of model-based image reconstruction (MBIR)
algorithms, which are often applications or adaptations of convex optimization algorithms …

ProxSARAH: An efficient algorithmic framework for stochastic composite nonconvex optimization

NH Pham, LM Nguyen, DT Phan… - Journal of Machine …, 2020 - jmlr.org
We propose a new stochastic first-order algorithmic framework to solve stochastic composite
nonconvex optimization problems that covers both finite-sum and expectation settings. Our …
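
A rough sketch of a ProxSARAH-style step on a composite objective f(x) + lam*||x||_1 is shown below: a SARAH-type recursive gradient estimator followed by a proximal (soft-thresholding) step. The epoch structure, step sizes and averaging of the actual ProxSARAH framework are simplified assumptions here.

```python
# Simplified SARAH-type recursive estimator + proximal step for lasso-like objectives.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(2)
n, d, lam, step = 200, 10, 0.01, 0.05
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d)

grad_full = lambda x: 2.0 * A.T @ (A @ x - b) / n
grad_i = lambda x, i: 2.0 * (A[i] @ x - b[i]) * A[i]

x = np.zeros(d)
for epoch in range(20):
    v = grad_full(x)                                 # full gradient at the snapshot
    x_prev = x.copy()
    x = soft_threshold(x - step * v, step * lam)
    for t in range(n):
        i = rng.integers(n)
        v = grad_i(x, i) - grad_i(x_prev, i) + v     # SARAH recursion
        x_prev = x.copy()
        x = soft_threshold(x - step * v, step * lam) # proximal step
print("objective:", np.mean((A @ x - b) ** 2) + lam * np.abs(x).sum())
```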

Practical large-scale linear programming using primal-dual hybrid gradient

D Applegate, M Díaz, O Hinder, H Lu… - Advances in …, 2021 - proceedings.neurips.cc
We present PDLP, a practical first-order method for linear programming (LP) that can solve
to the high levels of accuracy that are expected in traditional LP applications. In addition, it …
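
The basic building block that PDLP enhances (with restarts, preconditioning, presolve, and more) is the primal-dual hybrid gradient iteration; a bare-bones version on a randomly generated equality-form LP is sketched below, with step sizes chosen only to satisfy the usual tau*sigma*||A||^2 < 1 condition.

```python
# Plain PDHG for  min c^T x  s.t.  A x = b, x >= 0  (toy data, no PDLP enhancements).
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 12
A = rng.standard_normal((m, n))
x_feas = rng.random(n)                     # make the LP feasible by construction
b = A @ x_feas
c = rng.random(n)

tau = sigma = 0.9 / np.linalg.norm(A, 2)   # tau * sigma * ||A||^2 < 1
x, y = np.zeros(n), np.zeros(m)
for k in range(20000):
    x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)   # projected primal step
    y = y + sigma * (b - A @ (2 * x_new - x))          # dual step with extrapolation
    x = x_new

print("primal infeasibility:", np.linalg.norm(A @ x - b))
print("objective:", c @ x)
```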

Proximal splitting algorithms for convex optimization: A tour of recent advances, with new twists

L Condat, D Kitahara, A Contreras, A Hirabayashi - SIAM Review, 2023 - SIAM
Convex nonsmooth optimization problems, whose solutions live in very high-dimensional
spaces, have become ubiquitous. To solve them, the class of first-order algorithms known as …
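
One representative member of the proximal splitting family surveyed here is forward-backward splitting (proximal gradient) applied to the lasso, sketched below on synthetic data; it is shown only as a minimal instance of the framework, not as any of the paper's new variants.

```python
# Forward-backward splitting (ISTA) for  min_x 0.5*||Ax - b||^2 + lam*||x||_1.
import numpy as np

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(4)
m, n, lam = 60, 100, 0.1
A = rng.standard_normal((m, n))
x_true = np.zeros(n); x_true[:5] = 1.0
b = A @ x_true

step = 1.0 / np.linalg.norm(A, 2) ** 2     # 1/L for the smooth part
x = np.zeros(n)
for k in range(2000):
    x = prox_l1(x - step * A.T @ (A @ x - b), step * lam)   # gradient step + prox
print("recovered support:", np.flatnonzero(np.abs(x) > 0.1))
```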

Sharper rates for separable minimax and finite sum optimization via primal-dual extragradient methods

Y Jin, A Sidford, K Tian - Conference on Learning Theory, 2022 - proceedings.mlr.press
We design accelerated algorithms with improved rates for several fundamental classes of
optimization problems. Our algorithms all build upon techniques related to the analysis of …
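
For orientation, the sketch below runs the plain (non-accelerated) extragradient method on a bilinearly coupled saddle point min_x max_y 0.5*||x||^2 + x^T B y - 0.5*||y||^2, the kind of separable minimax structure this paper obtains sharper rates for; the improved primal-dual schemes and rates themselves are not reproduced.

```python
# Plain extragradient on a strongly-convex-strongly-concave bilinear saddle point.
import numpy as np

rng = np.random.default_rng(5)
n = 8
B = rng.standard_normal((n, n))

def grads(x, y):
    # gradients of 0.5*||x||^2 + x^T B y - 0.5*||y||^2 in x and y
    return x + B @ y, B.T @ x - y

x, y = rng.standard_normal(n), rng.standard_normal(n)
eta = 0.5 / (1.0 + np.linalg.norm(B, 2))            # conservative step size
for k in range(2000):
    gx, gy = grads(x, y)
    x_half, y_half = x - eta * gx, y + eta * gy     # extrapolation step
    gx, gy = grads(x_half, y_half)
    x, y = x - eta * gx, y + eta * gy               # update with midpoint gradient
print("distance to saddle point (0, 0):", np.linalg.norm(x) + np.linalg.norm(y))
```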

Core Imaging Library-Part I: a versatile Python framework for tomographic imaging

JS Jørgensen, E Ametova, G Burca… - … of the Royal …, 2021 - royalsocietypublishing.org
We present the Core Imaging Library (CIL), an open-source Python framework for
tomographic imaging with particular emphasis on reconstruction of challenging datasets …
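
The sketch below is deliberately not the CIL API: it is a plain-NumPy projected gradient (Landweber-type) loop on a toy linear inverse problem, meant only to illustrate the kind of iterative reconstruction that frameworks like CIL implement with real projection operators, acquisition geometries and regularisers.

```python
# Toy stand-in for iterative tomographic reconstruction (NOT the CIL API).
import numpy as np

rng = np.random.default_rng(7)
m, n = 300, 100                          # "measurements" x "image pixels" (toy sizes)
P = rng.random((m, n))                   # stand-in for a projection operator
x_true = rng.random(n)
sino = P @ x_true + 0.01 * rng.standard_normal(m)   # noisy "sinogram"

step = 1.0 / np.linalg.norm(P, 2) ** 2
x = np.zeros(n)
for k in range(500):
    x = x - step * P.T @ (P @ x - sino)  # gradient step on 0.5*||Px - sino||^2
    x = np.maximum(x, 0.0)               # simple non-negativity constraint
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```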

An inexact accelerated stochastic ADMM for separable convex optimization

J Bai, WW Hager, H Zhang - Computational Optimization and Applications, 2022 - Springer
An inexact accelerated stochastic Alternating Direction Method of Multipliers (AS-ADMM)
scheme is developed for solving structured separable convex optimization problems with …
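
To fix notation for the structured separable problems AS-ADMM targets, the sketch below runs plain (exact, non-accelerated, deterministic) scaled-form ADMM on a toy lasso split; the inexact subproblem solves, acceleration and stochastic gradients of the paper are not reproduced.

```python
# Plain ADMM for  min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x - z = 0.
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(6)
m, n, lam, rho = 80, 40, 0.1, 1.0
A = rng.standard_normal((m, n))
b = A @ rng.standard_normal(n)

AtA, Atb = A.T @ A, A.T @ b
x = z = u = np.zeros(n)                    # u is the scaled dual variable
for k in range(200):
    x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))  # x-update
    z = soft_threshold(x + u, lam / rho)                             # z-update
    u = u + x - z                                                    # dual update
print("constraint violation:", np.linalg.norm(x - z))
```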