CoCoA: A general framework for communication-efficient distributed optimization

V Smith, S Forte, C Ma, M Takáč, MI Jordan… - Journal of Machine …, 2018 - jmlr.org
The scale of modern datasets necessitates the development of efficient distributed
optimization methods for machine learning. We present a general-purpose framework for …

Solving empirical risk minimization in the current matrix multiplication time

YT Lee, Z Song, Q Zhang - Conference on Learning Theory, 2019 - proceedings.mlr.press
Many convex problems in machine learning and computer science share the same form: $\min_{x} \sum_{i} f_i(A_i x + b_i)$, where $f_i$ are convex …
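
As a concrete illustration of this ERM form, the following is a minimal sketch that evaluates $\min_x \sum_i f_i(A_i x + b_i)$ with a logistic choice of $f_i$; the data, shapes, and loss are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch of the ERM form min_x sum_i f_i(A_i x + b_i) from the abstract,
# instantiated with the logistic loss f_i(z) = sum_j log(1 + exp(z_j)).
# Block sizes and data are assumptions chosen for illustration.

def erm_objective(x, A_blocks, b_blocks):
    total = 0.0
    for A_i, b_i in zip(A_blocks, b_blocks):
        z = A_i @ x + b_i
        total += np.sum(np.logaddexp(0.0, z))  # stable log(1 + e^z)
    return total

rng = np.random.default_rng(0)
A_blocks = [rng.standard_normal((5, 3)) for _ in range(4)]
b_blocks = [rng.standard_normal(5) for _ in range(4)]
print(erm_objective(np.zeros(3), A_blocks, b_blocks))
```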

Giant: Globally improved approximate newton method for distributed optimization

S Wang, F Roosta, P Xu… - Advances in Neural …, 2018 - proceedings.neurips.cc
For distributed computing environments, we consider the empirical risk minimization problem and propose a distributed and communication-efficient Newton-type optimization method. At …
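
To make the one-line description above concrete, here is a hedged, single-process sketch of a GIANT-style iteration for ridge regression: each worker applies the inverse of its local Hessian to the global gradient, and the resulting approximate Newton directions are averaged. The shard sizes, regularizer, and in-process "workers" are assumptions, not the paper's setup.

```python
import numpy as np

# GIANT-style iteration (sketch): local Hessians, global gradient,
# averaged Newton directions. Shown here for ridge regression.
lam = 0.1
rng = np.random.default_rng(1)
shards = [(rng.standard_normal((50, 4)), rng.standard_normal(50))
          for _ in range(4)]                      # one (A_i, y_i) per worker
n = sum(A.shape[0] for A, _ in shards)

x = np.zeros(4)
for _ in range(10):
    # One communication round: workers send local gradients, the driver
    # aggregates them into the global gradient.
    g = sum(A.T @ (A @ x - y) for A, y in shards) / n + lam * x
    # Each worker solves its local Newton system; the driver averages.
    dirs = [np.linalg.solve(A.T @ A / A.shape[0] + lam * np.eye(4), g)
            for A, _ in shards]
    x -= np.mean(dirs, axis=0)
print(x)
```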

Inexact successive quadratic approximation for regularized optimization

C Lee, SJ Wright - Computational Optimization and Applications, 2019 - Springer
Successive quadratic approximations, or second-order proximal methods, are useful for
minimizing functions that are a sum of a smooth part and a convex, possibly nonsmooth part …
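
For context, the step such methods take has the standard form below, written for $f = g + h$ with $g$ smooth and $h$ convex and possibly nonsmooth; the notation here is assumed rather than taken from the paper, and "inexact" refers to solving this subproblem only approximately.

```latex
% Successive quadratic approximation (proximal Newton) step for f = g + h,
% where H_k approximates \nabla^2 g(x_k); notation assumed, not the paper's.
x_{k+1} \approx \operatorname*{arg\,min}_{x}\;
  \nabla g(x_k)^\top (x - x_k)
  + \tfrac{1}{2}\,(x - x_k)^\top H_k (x - x_k)
  + h(x)
```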

A distributed second-order algorithm you can trust

C Dünner, A Lucchi, M Gargiani… - International …, 2018 - proceedings.mlr.press
Due to the rapid growth of data and computational resources, distributed optimization has
become an active research area in recent years. While first-order methods seem to dominate …

Why dataset properties bound the scalability of parallel machine learning training algorithms

D Cheng, S Li, H Zhang, F Xia… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
As the training dataset size and the model size of machine learning increase rapidly, more computing resources are consumed to speed up the training process. However, the …

Federated Empirical Risk Minimization via Second-Order Method

S Bian, Z Song, J Yin - arXiv preprint arXiv:2305.17482, 2023 - arxiv.org
Many convex optimization problems with important applications in machine learning are
formulated as empirical risk minimization (ERM). There are several examples: linear and …

L1-regularized distributed optimization: A communication-efficient primal-dual framework

V Smith, S Forte, MI Jordan, M Jaggi - arXiv preprint arXiv:1512.04011, 2015 - arxiv.org
Despite the importance of sparsity in many large-scale applications, there are few methods
for distributed optimization of sparsity-inducing objectives. In this paper, we present a …
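
For a sense of the objectives involved, below is a minimal single-machine sketch of a sparsity-inducing problem of the kind the paper targets, $\min_x \tfrac{1}{2}\|Ax - y\|^2 + \lambda\|x\|_1$, solved with plain proximal gradient (ISTA) as a generic baseline; this is not the paper's primal-dual framework, and all problem data are assumptions.

```python
import numpy as np

# ISTA on min_x 0.5*||Ax - y||^2 + lam*||x||_1 (generic baseline sketch).

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(2)
A = rng.standard_normal((60, 20))
y = A @ (rng.standard_normal(20) * (rng.random(20) < 0.2))  # sparse truth
lam = 0.1 * np.max(np.abs(A.T @ y))          # keeps the solution sparse
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1 / Lipschitz constant

x = np.zeros(20)
for _ in range(300):
    x = soft_threshold(x - step * (A.T @ (A @ x - y)), step * lam)
print(np.count_nonzero(x), "nonzeros out of", x.size)
```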

Distributed block-diagonal approximation methods for regularized empirical risk minimization

C Lee, KW Chang - Machine Learning, 2020 - Springer
In recent years, there has been a growing need to train machine learning models on a huge volume of data. Therefore, designing efficient distributed optimization algorithms for empirical risk …
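
The following is a hedged sketch of the block-diagonal idea named in the title: split the coordinates across machines and let each machine take a Newton-like step using only its own diagonal block of the Hessian, so no cross-block second-order information is exchanged. The quadratic objective, the two-way split, and the sequential sweep are illustrative assumptions; the paper's distributed scheme updates blocks in parallel and differs in its details.

```python
import numpy as np

# Block-diagonal Hessian approximation (sketch) on the quadratic
# f(x) = 0.5 x^T H x - c^T x: each "machine" owns a slice of x and
# inverts only its diagonal block H[blk, blk].
rng = np.random.default_rng(3)
M = rng.standard_normal((8, 8))
H = M @ M.T + np.eye(8)                  # symmetric positive definite
c = rng.standard_normal(8)
blocks = [slice(0, 4), slice(4, 8)]      # coordinates per machine

x = np.zeros(8)
for _ in range(50):
    for blk in blocks:                   # sequential sweep for simplicity;
        g = H @ x - c                    # a distributed run would update
        x[blk] -= np.linalg.solve(H[blk, blk], g[blk])  # blocks in parallel
print("gradient norm:", np.linalg.norm(H @ x - c))
```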

Manifold identification for ultimately communication-efficient distributed optimization

YS Li, WL Chiang, C Lee - International Conference on …, 2020 - proceedings.mlr.press
This work proposes a progressive manifold identification approach for distributed
optimization with sound theoretical justifications to greatly reduce both the rounds of …
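
As a rough illustration of manifold identification (not the paper's algorithm), the sketch below solves an L1-regularized least-squares problem in two phases: a proximal-gradient phase that runs until it has identified the support of the solution (the active manifold), followed by a single Newton step restricted to that support, where the objective is smooth. All problem data and the phase lengths are assumptions.

```python
import numpy as np

# Two-phase sketch: identify the active manifold (support) with ISTA,
# then solve the smooth restricted problem on it in one Newton step.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(4)
A = rng.standard_normal((80, 30))
y = A @ (rng.standard_normal(30) * (rng.random(30) < 0.2))
lam = 0.1 * np.max(np.abs(A.T @ y))
step = 1.0 / np.linalg.norm(A, 2) ** 2

x = np.zeros(30)
for _ in range(300):                     # identification phase
    x = soft_threshold(x - step * (A.T @ (A @ x - y)), step * lam)
S = np.flatnonzero(x)                    # identified manifold: fixed support

# On the manifold the signs are fixed, so the restricted objective
# 0.5*||A_S x_S - y||^2 + lam * sign(x_S)^T x_S is a smooth quadratic
# and one Newton step solves it exactly.
s = np.sign(x[S])
x[S] = np.linalg.solve(A[:, S].T @ A[:, S], A[:, S].T @ y - lam * s)
print("identified support size:", S.size)
```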