Distributed optimization with arbitrary local solvers

C Ma, J Konečný, M Jaggi, V Smith… - Optimization Methods …, 2017 - Taylor & Francis
With the growth of data and the necessity for distributed optimization methods, solvers that work
well on a single machine must be redesigned to leverage distributed computation. Recent …

Distributed coordinate descent method for learning with big data

P Richtárik, M Takáč - Journal of Machine Learning Research, 2016 - jmlr.org
In this paper we develop and analyze Hydra: HYbriD cooRdinAte descent method for
solving loss minimization problems with big data. We initially partition the coordinates …

A decentralized second-order method with exact linear convergence rate for consensus optimization

A Mokhtari, W Shi, Q Ling… - IEEE Transactions on …, 2016 - ieeexplore.ieee.org
This paper considers decentralized consensus optimization problems where different
summands of a global objective function are available at nodes of a network that can …

Coordinate descent with arbitrary sampling I: Algorithms and complexity

Z Qu, P Richtárik - Optimization Methods and Software, 2016 - Taylor & Francis
We study the problem of minimizing the sum of a smooth convex function and a convex
block-separable regularizer and propose a new randomized coordinate descent method …
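The abstract describes the standard composite setting: a smooth convex term plus a separable regularizer, minimized by updating one randomly sampled coordinate at a time. As a minimal illustration of that generic technique (not the paper's algorithm or sampling scheme), the sketch below runs randomized proximal coordinate descent on a lasso objective; `rcd_lasso` and its parameters are hypothetical names for this example:

```python
import numpy as np

def soft_threshold(v, t):
    # prox operator of t*|.|, applied to a scalar or array
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def rcd_lasso(A, b, lam, iters=20000, seed=0):
    """Randomized proximal coordinate descent (illustrative sketch) for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1  (smooth + separable regularizer)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    L = (A ** 2).sum(axis=0)            # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    r = A @ x - b                       # residual, maintained incrementally
    for _ in range(iters):
        i = rng.integers(n)             # uniform sampling of one coordinate
        g = A[:, i] @ r                 # i-th partial gradient
        xi_new = soft_threshold(x[i] - g / L[i], lam / L[i])
        r += A[:, i] * (xi_new - x[i])  # keep the residual in sync
        x[i] = xi_new
    return x
```

Each iteration touches a single column of `A`, which is what makes per-step cost independent of the number of coordinates.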

Quartz: Randomized dual coordinate ascent with arbitrary sampling

Z Qu, P Richtárik, T Zhang - Advances in neural information …, 2015 - proceedings.neurips.cc
We study the problem of minimizing the average of a large number of smooth convex
functions penalized with a strongly convex regularizer. We propose and analyze a novel …

SDNA: Stochastic dual Newton ascent for empirical risk minimization

Z Qu, P Richtárik, M Takáč… - … Conference on Machine …, 2016 - proceedings.mlr.press
We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual
Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random …

On optimal probabilities in stochastic coordinate descent methods

P Richtárik, M Takáč - Optimization Letters, 2016 - Springer
We propose and analyze a new parallel coordinate descent method—NSync—in which at
each iteration a random subset of coordinates is updated, in parallel, allowing for the …
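The snippet above emphasizes that the sampled coordinates can follow nonuniform probabilities. As a hedged illustration in that spirit (a single-coordinate special case, not NSync itself, whose analysis covers parallel subset updates), the sketch below runs coordinate descent on a quadratic with importance sampling proportional to the coordinate-wise Lipschitz constants; `nsync_style_cd` is a hypothetical name:

```python
import numpy as np

def nsync_style_cd(A, b, probs=None, iters=3000, seed=0):
    """Coordinate descent with nonuniform coordinate sampling (sketch)
    on f(x) = 0.5 * x^T A x - b^T x, with A symmetric positive definite."""
    rng = np.random.default_rng(seed)
    n = len(b)
    L = np.diag(A)                  # coordinate-wise Lipschitz constants
    if probs is None:
        probs = L / L.sum()         # importance sampling: p_i proportional to L_i
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(n, p=probs)
        g = A[i] @ x - b[i]         # i-th partial derivative
        x[i] -= g / L[i]            # exact coordinate-wise step
    return x
```

Because each step uses the exact coordinate-wise stepsize, every update decreases the objective, so any strictly positive probability vector yields convergence; the choice of probabilities only changes the rate.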

Stochastic dual coordinate ascent with adaptive probabilities

D Csiba, Z Qu, P Richtárik - International Conference on …, 2015 - proceedings.mlr.press
This paper introduces AdaSDCA: an adaptive variant of stochastic dual coordinate ascent
(SDCA) for solving regularized empirical risk minimization problems. Our modification …
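For reference, plain SDCA with uniform probabilities (the baseline AdaSDCA modifies, not AdaSDCA's adaptive rule) can be sketched for ridge regression, where the dual coordinate maximization has a closed form; `sdca_ridge` is a hypothetical name for this example:

```python
import numpy as np

def sdca_ridge(A, y, lam, iters=5000, seed=0):
    """SDCA sketch for ridge regression:
      min_w (1/n) * sum_i 0.5*(a_i^T w - y_i)^2 + (lam/2)*||w||^2.
    Maintains dual variables alpha and the primal w = A^T alpha / (lam*n)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)   # uniform sampling; AdaSDCA adapts these probabilities
        # closed-form maximization of the dual in coordinate i (squared loss)
        delta = (y[i] - A[i] @ w - alpha[i]) / (1.0 + (A[i] @ A[i]) / (lam * n))
        alpha[i] += delta
        w += delta * A[i] / (lam * n)  # keep primal consistent with the dual
    return w
```

The per-iteration cost is one row of `A`, and the primal iterate is always the image of the current dual point, which is what makes duality-gap-based adaptive probabilities computable.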

Applications of Lagrangian relaxation-based algorithms to industrial scheduling problems, especially in production workshop scenarios: A review

L Sun, R Yang, J Feng, G Guo - Journal of Process Control, 2024 - Elsevier
Industrial scheduling problems (ISPs), especially industrial production workshop scheduling
problems (IPWSPs) in sectors such as manufacturing and power, require allocating …

Coordinate descent with arbitrary sampling II: Expected separable overapproximation

Z Qu, P Richtárik - Optimization Methods and Software, 2016 - Taylor & Francis
The design and complexity analysis of randomized coordinate descent methods, and in
particular of variants which update a random subset (sampling) of coordinates in each …