Cooperative fixed-time/finite-time distributed robust optimization of multi-agent systems

M Firouzbahrami, A Nobakhti - Automatica, 2022 - Elsevier
A new robust continuous-time optimization algorithm for distributed problems is presented
that guarantees fixed-time convergence. The algorithm is based on a Lyapunov function …

AFLGuard: Byzantine-robust asynchronous federated learning

M Fang, J Liu, NZ Gong, ES Bentley - Proceedings of the 38th Annual …, 2022 - dl.acm.org
Federated learning (FL) is an emerging machine learning paradigm, in which clients jointly
learn a model with the help of a cloud server. A fundamental challenge of FL is that the …
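The federated learning paradigm described in this abstract — clients jointly training a model coordinated by a server — can be sketched with a minimal federated-averaging round. This is a generic illustration, not the AFLGuard algorithm; the function names and the toy local update rule are invented for the example.

```python
import numpy as np

def local_step(w, data, lr=0.5):
    # Toy client update: one gradient step of a quadratic loss pulling
    # the weights toward the mean of the client's local data.
    return w - lr * (w - data.mean(axis=0))

def fedavg_round(global_w, client_datasets):
    # Each client refines the current global model on its own data;
    # the server then averages the returned client models.
    client_models = [local_step(global_w.copy(), data)
                     for data in client_datasets]
    return np.mean(client_models, axis=0)

# Two clients with different local data (the heterogeneity FL must handle).
clients = [np.array([[1.0, 1.0]]), np.array([[3.0, 3.0]])]
w = np.zeros(2)
for _ in range(20):
    w = fedavg_round(w, clients)
# w approaches [2., 2.], the average of the client means
```

The key property this sketch shows is that raw client data never leaves the clients; only model parameters are exchanged.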

Online distributed nonconvex optimization with stochastic objective functions: High probability bound analysis of dynamic regrets

H Xu, K Lu, YL Wang - Automatica, 2024 - Elsevier
In this paper, the problem of online distributed optimization with stochastic and nonconvex
objective functions is studied by employing a multi-agent system. When making decisions …

SF-CABD: Secure Byzantine fault tolerance federated learning on Non-IID data

X Lin, Y Li, X Xie, Y Ding, X Wu, C Ge - Knowledge-Based Systems, 2024 - Elsevier
Federated learning facilitates collaborative learning among multiple parties while ensuring
client privacy. The vulnerability of federated learning to diverse Byzantine attacks stems from …

Secure distributed optimization under gradient attacks

S Yu, S Kar - IEEE Transactions on Signal Processing, 2023 - ieeexplore.ieee.org
In this article, we study secure distributed optimization against arbitrary gradient attacks in
multi-agent networks. In distributed optimization, there is no central server to coordinate …

Online distributed optimization with strongly pseudoconvex-sum cost functions and coupled inequality constraints

K Lu, H Xu - Automatica, 2023 - Elsevier
In this paper, the problem of online distributed optimization with coupled inequality
constraints is studied by employing multi-agent systems. Each agent only has access to the …

Byzantine-resilient Federated Learning Employing Normalized Gradients on Non-IID Datasets

S Zuo, X Yan, R Fan, L Shen, P Zhao, J Xu… - arXiv preprint arXiv …, 2024 - arxiv.org
In practical federated learning (FL) systems, the presence of malicious Byzantine attacks
and data heterogeneity often introduces biases into the learning process. However, existing …
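The defense named in this title — employing normalized gradients — can be illustrated with a generic aggregation rule that scales each client update to unit norm before averaging, so a single Byzantine client cannot dominate the average by sending an arbitrarily large update. This is a hedged sketch of the general idea, not the paper's exact method; all names here are illustrative.

```python
import numpy as np

def normalized_aggregate(updates, eps=1e-12):
    # Scale every client update to unit norm, then average.
    # A magnitude-based attack is thus bounded to the same influence
    # as any honest client.
    normed = [u / (np.linalg.norm(u) + eps) for u in updates]
    return np.mean(normed, axis=0)

honest = [np.array([1.0, 0.0]), np.array([0.9, 0.1])]
attacker = [np.array([-1e6, 0.0])]  # huge-magnitude Byzantine update

agg = normalized_aggregate(honest + attacker)
# The aggregate still points in the honest direction (positive first
# coordinate); a plain mean would be swamped by the attacker.
```

Without normalization, `np.mean(honest + attacker, axis=0)` would be dominated by the `-1e6` term, flipping the update direction entirely.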

Distributed Active Client Selection With Noisy Clients Using Model Association Scores

KI Kim - European Conference on Computer Vision, 2025 - Springer
Active client selection (ACS) strategically identifies clients for model updates during each
training round of federated learning. In scenarios with limited communication resources …

Communication-efficient federated learning using censored heavy ball descent

Y Chen, RS Blum, BM Sadler - IEEE Transactions on Signal …, 2022 - ieeexplore.ieee.org
Distributed machine learning enables scalability and computational offloading, but requires
significant levels of communication. Consequently, communication efficiency in distributed …
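The censoring idea behind communication-efficient schemes like the one in this title can be sketched generically: a client transmits an update only when it differs sufficiently from the last transmitted one, and the server otherwise reuses its cached copy. This is an assumed, simplified illustration of update censoring, not the paper's censored heavy-ball algorithm; the threshold rule and names are invented for the example.

```python
import numpy as np

def censored_updates(gradients, threshold):
    # Transmit a gradient only if it moved more than `threshold` away
    # from the last transmitted one; otherwise the server reuses its cache.
    last_sent = None
    transmissions = 0
    received = []
    for g in gradients:
        if last_sent is None or np.linalg.norm(g - last_sent) > threshold:
            last_sent = g
            transmissions += 1
        received.append(last_sent)
    return received, transmissions

# Three nearly identical gradients followed by a large change:
grads = [np.array([1.0]), np.array([1.01]), np.array([1.02]), np.array([2.0])]
received, sent = censored_updates(grads, threshold=0.1)
# Only 2 of 4 rounds trigger a transmission; the small changes are censored.
```

The trade-off is staleness: between transmissions the server optimizes against a slightly outdated gradient, which such methods must show does not destroy convergence.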

Online Optimization Under Randomly Corrupted Attacks

Z Qu, X Li, L Li, X Yi - IEEE Transactions on Signal Processing, 2024 - ieeexplore.ieee.org
Existing algorithms in online optimization usually rely on trustworthy information, e.g., reliable
knowledge of gradients, which makes them vulnerable to attacks. To take into account the …