Distributed artificial intelligence empowered by end-edge-cloud computing: A survey
As the computing paradigm shifts from cloud computing to end-edge-cloud computing, it
also enables artificial intelligence to evolve from a centralized paradigm to a distributed one …
Sustainable AI: Environmental implications, challenges and opportunities
CJ Wu, R Raghavendra, U Gupta… - Proceedings of …, 2022 - proceedings.mlsys.org
This paper explores the environmental impact of the super-linear growth trends for AI from a
holistic perspective, spanning Data, Algorithms, and System Hardware. We characterize the …
High-resolution de novo structure prediction from primary sequence
Recent breakthroughs have used deep learning to exploit evolutionary information in
multiple sequence alignments (MSAs) to accurately predict protein structures. However …
Zero-shot text-to-image generation
Text-to-image generation has traditionally focused on finding better modeling assumptions
for training on a fixed dataset. These assumptions might involve complex architectures …
CocktailSGD: Fine-tuning foundation models over 500Mbps networks
Distributed training of foundation models, especially large language models (LLMs), is
communication-intensive and so has heavily relied on centralized data centers with fast …
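As a rough illustration of the kind of communication compression that makes fine-tuning over slow links feasible, the Python sketch below combines top-k sparsification with int8 quantization of gradients before they are exchanged. It is a hypothetical example of the general technique, not CocktailSGD's actual scheme.

import numpy as np

def compress(grad: np.ndarray, k: int):
    """Keep the k largest-magnitude gradient entries and quantize them to int8."""
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]        # indices of the top-k entries
    vals = flat[idx]
    scale = np.abs(vals).max() / 127.0 + 1e-12          # int8 quantization scale
    q = np.clip(np.round(vals / scale), -127, 127).astype(np.int8)
    return idx, q, scale

def decompress(idx, q, scale, shape):
    """Rebuild a dense gradient from the sparse, quantized representation."""
    flat = np.zeros(int(np.prod(shape)), dtype=np.float32)
    flat[idx] = q.astype(np.float32) * scale
    return flat.reshape(shape)

g = np.random.randn(10_000).astype(np.float32)
idx, q, s = compress(g, k=100)                          # ~1% of the entries survive
g_hat = decompress(idx, q, s, g.shape)

In a distributed fine-tuning setup only (idx, q, scale) would travel over the network, shrinking the payload by roughly two orders of magnitude in this toy configuration.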
Communication-efficient federated learning
Federated learning (FL) enables edge devices, such as Internet of Things devices (e.g.,
sensors), servers, and institutions (e.g., hospitals), to collaboratively train a machine learning …
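For orientation, the minimal FedAvg-style sketch below shows the collaborative training loop such methods build on: clients run local SGD on private data and a server averages their models, weighted by local dataset size. The linear model and toy data are placeholders, not taken from the cited work.

import numpy as np

def local_sgd(w, X, y, lr=0.1, epochs=1):
    """One client's local training of a linear model with squared loss."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

def fedavg_round(global_w, clients):
    """clients: list of (X, y) pairs; returns the data-size-weighted average model."""
    total = sum(len(y) for _, y in clients)
    return sum((len(y) / total) * local_sgd(global_w, X, y) for X, y in clients)

rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(5)
for _ in range(10):                                     # 10 communication rounds
    w = fedavg_round(w, clients)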
Advances and open problems in federated learning
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …
Sharper convergence guarantees for asynchronous SGD for distributed and federated learning
We study the asynchronous stochastic gradient descent algorithm for distributed training
over $n$ workers that might be heterogeneous. In this algorithm, workers compute …
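A single-process simulation can make the asynchronous pattern concrete: workers compute gradients on possibly stale parameter snapshots and the server applies them as they arrive. This is an illustrative sketch only, not the algorithm variant or analysis of the cited paper.

import random
import numpy as np

def grad(w, X, y):
    """Gradient of the mean-squared error of a linear model."""
    return X.T @ (X @ w - y) / len(y)

rng = np.random.default_rng(0)
workers = [(rng.normal(size=(30, 4)), rng.normal(size=30)) for _ in range(4)]

w, lr = np.zeros(4), 0.05
inflight = []        # (worker_id, parameter snapshot the gradient job started from)

for _ in range(200):
    if inflight and random.random() < 0.7:
        # Some worker finishes: its gradient was computed at a stale snapshot.
        wid, w_stale = inflight.pop(random.randrange(len(inflight)))
        X, y = workers[wid]
        w -= lr * grad(w_stale, X, y)
    # Another worker starts a new gradient computation from the current parameters.
    inflight.append((random.randrange(len(workers)), w.copy()))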
Mime: Mimicking centralized stochastic algorithms in federated learning
Federated learning (FL) is a challenging setting for optimization due to the heterogeneity of
the data across different clients, which gives rise to the client-drift phenomenon. In fact …
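As a loose sketch of one way to counter client drift, the snippet below freezes a server-level momentum statistic during each client's local steps and refreshes it only at the server. It is inspired by the Mime recipe but simplified (the variance-reduction correction is omitted), so treat it as an assumption-laden illustration rather than the paper's algorithm.

import numpy as np

def local_steps(w, server_momentum, X, y, lr=0.05, beta=0.9, steps=5):
    """Local SGD in which the momentum buffer stays fixed at its server value."""
    w = w.copy()
    for _ in range(steps):
        g = X.T @ (X @ w - y) / len(y)
        w -= lr * ((1 - beta) * g + beta * server_momentum)   # momentum not updated locally
    return w

def server_round(w, momentum, clients, beta=0.9):
    """Average the locally trained models, then refresh the momentum at the server."""
    new_w = np.mean([local_steps(w, momentum, X, y, beta=beta) for X, y in clients], axis=0)
    full_grad = np.mean([X.T @ (X @ w - y) / len(y) for X, y in clients], axis=0)
    return new_w, (1 - beta) * full_grad + beta * momentum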
Optimal client sampling for federated learning
It is well understood that client-master communication can be a primary bottleneck in
Federated Learning. In this work, we address this issue with a novel client subsampling …
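The sketch below conveys the general flavor of importance-based client subsampling: a fixed per-round budget of m clients is sampled with probability proportional to the norm of their updates, and the aggregate is reweighted so it stays unbiased for the full average. The probabilities and estimator here are illustrative assumptions, not the paper's optimal sampling rule.

import numpy as np

def sample_clients(update_norms, m, rng):
    """Sample m client indices (with replacement), proportional to update norms."""
    p = np.asarray(update_norms, dtype=float)
    p = p / p.sum()
    return rng.choice(len(p), size=m, replace=True, p=p), p

def aggregate(updates, chosen, p):
    """Importance-weighted estimate of the average update: scaling each sampled
    update by 1 / (n * m * p_i) makes the estimator unbiased for the mean."""
    n, m = len(updates), len(chosen)
    est = np.zeros_like(updates[0])
    for i in chosen:
        est += updates[i] / (n * m * p[i])
    return est

rng = np.random.default_rng(0)
updates = [rng.normal(size=5) for _ in range(100)]
norms = [np.linalg.norm(u) for u in updates]
chosen, p = sample_clients(norms, m=10, rng=rng)
approx_avg = aggregate(updates, chosen, p)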