Inexact variable metric stochastic block-coordinate descent for regularized optimization

C Lee, SJ Wright - Journal of Optimization Theory and Applications, 2020 - Springer
Block-coordinate descent is a popular framework for large-scale regularized optimization
problems with block-separable structure. Existing methods have several limitations. They …
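The snippet above describes block-coordinate descent for regularized problems with block-separable structure. As a rough illustration of that general framework (not the paper's inexact variable-metric method, which uses approximate quadratic models per block), here is a minimal sketch of randomized proximal block-coordinate descent for the lasso, min_x 0.5*||Ax - b||^2 + lam*||x||_1; the function names, block partition, and the scalar step size 1/L_B are illustrative assumptions, not taken from the paper.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_cd_lasso(A, b, lam, block_size=10, n_epochs=50, seed=0):
    # Randomized proximal block-coordinate descent for
    # min_x 0.5*||Ax - b||^2 + lam*||x||_1 (block-separable regularizer).
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    x = np.zeros(n)
    r = A @ x - b  # maintained residual Ax - b
    blocks = [np.arange(i, min(i + block_size, n))
              for i in range(0, n, block_size)]
    for _ in range(n_epochs):
        for idx in rng.permutation(len(blocks)):
            B = blocks[idx]
            A_B = A[:, B]
            # Block Lipschitz constant of the partial gradient: ||A_B||_2^2.
            L_B = np.linalg.norm(A_B, 2) ** 2
            if L_B == 0.0:
                continue
            g_B = A_B.T @ r  # block gradient of the smooth part
            x_new = soft_threshold(x[B] - g_B / L_B, lam / L_B)
            r += A_B @ (x_new - x[B])  # incremental residual update
            x[B] = x_new
    return x

Each iteration touches only one block of coordinates, which is what makes the scheme attractive at large scale; replacing the exact prox step with an approximate subproblem solve and 1/L_B with a block-specific metric is the direction the paper pursues.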

Accelerated methods for distributed optimization

H Hendrikx - 2021 - theses.hal.science
In order to make meaningful predictions, modern machine learning models require huge
amounts of data, and are generally trained in a distributed way, i.e., using many computing …

Distributed Machine Learning Framework: New Algorithms and Theoretical Foundation

Z Huo - 2020 - search.proquest.com
Abstract: Machine learning is gaining fresh momentum, and has helped us enhance not
only many industrial and professional processes but also our everyday lives. The recent …

[PDF] Distributed Algorithms in Large-scaled Empirical Risk Minimization: Non-convexity, Adaptive Sampling, and Matrix-free Second-order Methods

X He - 2019 - core.ac.uk
Abstract: (table-of-contents excerpt)
1 Dual Free Adaptive Mini-batch SDCA for Empirical Risk Minimization
1.1 Introduction
1.1.1 Contributions
1.1.2 …