CoCoA: A general framework for communication-efficient distributed optimization
The scale of modern datasets necessitates the development of efficient distributed
optimization methods for machine learning. We present a general-purpose framework for …
Solving empirical risk minimization in the current matrix multiplication time
Many convex problems in machine learning and computer science share the same
form: \begin{align*}\min_{x}\sum_{i} f_i(A_i x + b_i),\end{align*} where $f_i$ are convex …
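As a concrete instance of this template (our illustration, not drawn from the abstract): unregularized logistic regression on examples $(a_i, y_i)$ with $y_i \in \{-1, +1\}$ fits the form via
\begin{align*}
\min_{x}\ \sum_{i} \log\bigl(1 + \exp(-y_i\, a_i^\top x)\bigr),
\qquad f_i(z) = \log(1 + e^{z}),\quad A_i = -y_i\, a_i^\top,\quad b_i = 0.
\end{align*}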
GIANT: Globally improved approximate Newton method for distributed optimization
For distributed computing environments, we consider the empirical risk minimization problem
and propose a distributed, communication-efficient Newton-type optimization method. At …
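One common pattern for such Newton-type distributed methods (a general sketch under assumed notation, not a statement of this paper's exact algorithm): each of $m$ workers solves a Newton system against its local Hessian $H_j$ using the globally aggregated gradient, and the resulting directions are averaged,
\begin{align*}
p_j = H_j^{-1} \nabla F(x^k)\ \ \text{(worker } j\text{)},\qquad
x^{k+1} = x^k - \frac{1}{m}\sum_{j=1}^{m} p_j,
\end{align*}
so that only $d$-dimensional gradients and directions cross the network, never Hessians.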
Inexact successive quadratic approximation for regularized optimization
Successive quadratic approximations, or second-order proximal methods, are useful for
minimizing functions that are a sum of a smooth part and a convex, possibly nonsmooth part …
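For concreteness, a standard statement of this setting (notation assumed here, not quoted from the paper): to minimize $F(x) = f(x) + \psi(x)$ with $f$ smooth and $\psi$ convex and possibly nonsmooth, each iteration approximately minimizes a quadratic model of $f$ plus the untouched $\psi$,
\begin{align*}
x^{k+1} \approx \arg\min_{x}\ \nabla f(x^k)^\top (x - x^k) + \tfrac{1}{2} (x - x^k)^\top H_k (x - x^k) + \psi(x),
\end{align*}
where $H_k$ approximates $\nabla^2 f(x^k)$ and "inexact" refers to solving this subproblem only approximately.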
A distributed second-order algorithm you can trust
Due to the rapid growth of data and computational resources, distributed optimization has
become an active research area in recent years. While first-order methods seem to dominate …
Why dataset properties bound the scalability of parallel machine learning training algorithms
As the training dataset size and model size of machine learning increase rapidly, more
computing resources are consumed to speed up the training process. However, the …
Federated Empirical Risk Minimization via Second-Order Method
Many convex optimization problems with important applications in machine learning are
formulated as empirical risk minimization (ERM). There are several examples: linear and …
L1-regularized distributed optimization: A communication-efficient primal-dual framework
Despite the importance of sparsity in many large-scale applications, there are few methods
for distributed optimization of sparsity-inducing objectives. In this paper, we present a …
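The canonical shape of such sparsity-inducing objectives (our illustration of the problem class, not a formula quoted from the paper) is
\begin{align*}
\min_{x}\ \frac{1}{n}\sum_{i=1}^{n} \ell_i(a_i^\top x) + \lambda \|x\|_1,
\end{align*}
where the $\ell_1$ penalty drives many coordinates of $x$ exactly to zero, which is what makes the iterates cheap to communicate and store at scale.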
Distributed block-diagonal approximation methods for regularized empirical risk minimization
In recent years, there has been a growing need to train machine learning models on huge
volumes of data. Therefore, designing efficient distributed optimization algorithms for empirical risk …
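The device named in the title (our reading, sketched under assumed notation rather than quoted from the paper) is to approximate the Hessian of the objective by the block-diagonal part aligned with the partition of the problem across $m$ machines,
\begin{align*}
H \approx \mathrm{diag}(H_1, \ldots, H_m),
\end{align*}
so that machine $j$ can update its own block of variables using only $H_j$, with no cross-block communication inside a subproblem solve.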
Manifold identification for ultimately communication-efficient distributed optimization
This work proposes a progressive manifold identification approach for distributed
optimization, with sound theoretical justification, that greatly reduces both the rounds of …