A review of distributed statistical inference

Y Gao, W Liu, H Wang, X Wang, Y Yan… - Statistical Theory and …, 2022 - Taylor & Francis
The rapid emergence of massive datasets in various fields poses a serious challenge to
traditional statistical methods. Meanwhile, it provides opportunities for researchers to …

Learning coefficient heterogeneity over networks: A distributed spanning-tree-based fused-lasso regression

X Zhang, J Liu, Z Zhu - Journal of the American Statistical …, 2024 - Taylor & Francis
Identifying the latent cluster structure based on model heterogeneity is a fundamental but
challenging task arising in many machine learning applications. In this article, we study the …

New scalable and efficient online pairwise learning algorithm

B Gu, R Bao, C Zhang, H Huang - IEEE Transactions on Neural …, 2023 - ieeexplore.ieee.org
Pairwise learning is an important machine-learning topic with many practical applications.
An online algorithm is the first choice for processing streaming data and is preferred for …

Optimal convergence rates for distributed Nyström approximation

J Li, Y Liu, W Wang - Journal of Machine Learning Research, 2023 - jmlr.org
The distributed kernel ridge regression (DKRR) has shown great potential in processing
complicated tasks. However, DKRR makes use only of the local samples, which fail to capture …
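The divide-and-conquer idea behind DKRR can be sketched as follows: each machine fits kernel ridge regression on its local shard and the final predictor averages the local predictions. This is a minimal illustration, not the authors' exact algorithm; the Gaussian-kernel bandwidth, regularization, and synthetic data are illustrative assumptions.

```python
import numpy as np

def krr_fit(X, y, lam=0.01, gamma=5.0):
    # Gaussian-kernel ridge regression on one local shard (illustrative parameters).
    K = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    alpha = np.linalg.solve(K + lam * len(X) * np.eye(len(X)), y)
    return X, alpha, gamma

def krr_predict(model, Xtest):
    X, alpha, gamma = model
    K = np.exp(-gamma * ((Xtest[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    return K @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Split the data across m "machines", fit KRR locally, average the predictions.
m = 4
shards = np.array_split(rng.permutation(200), m)
models = [krr_fit(X[idx], y[idx]) for idx in shards]
Xtest = np.linspace(-1, 1, 50)[:, None]
pred = np.mean([krr_predict(mod, Xtest) for mod in models], axis=0)
print(pred.shape)  # (50,)
```

The averaging step is what makes the scheme communication-efficient: only predictions (or local coefficient summaries), not raw samples, need to be exchanged.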

Optimal convergence rates for agnostic Nyström kernel learning

J Li, Y Liu, W Wang - International Conference on Machine …, 2023 - proceedings.mlr.press
Nyström low-rank approximation has shown great potential in processing large-scale kernel
matrices and neural networks. However, a unified analysis is lacking for Nyström …
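The Nyström method referenced above approximates an n-by-n kernel matrix from a small set of landmark columns, K ≈ C W⁺ Cᵀ. A minimal sketch under assumed illustrative parameters (Gaussian kernel, uniformly sampled landmarks):

```python
import numpy as np

def nystrom(X, landmarks, gamma=1.0):
    # Rank-|landmarks| Nystrom approximation: K ≈ C @ pinv(W) @ C.T
    def gauss(A, B):
        return np.exp(-gamma * ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1))
    C = gauss(X, X[landmarks])             # n x m cross-kernel block
    W = gauss(X[landmarks], X[landmarks])  # m x m landmark kernel block
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 2))
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))  # exact kernel matrix
K_ny = nystrom(X, rng.choice(300, 60, replace=False))
rel_err = np.linalg.norm(K - K_ny) / np.linalg.norm(K)
print(rel_err)
```

Because only the n-by-m block C and the m-by-m block W are formed, the cost drops from O(n²) kernel evaluations to O(nm), which is what makes the approximation attractive at scale.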

Towards understanding ensemble distillation in federated learning

S Park, K Hong, G Hwang - International Conference on …, 2023 - proceedings.mlr.press
Federated Learning (FL) is a collaborative machine learning paradigm for data privacy
preservation. Recently, a knowledge distillation (KD) based information sharing approach in …

Effective distributed learning with random features: Improved bounds and algorithms

Y Liu, J Liu, S Wang - International Conference on Learning …, 2021 - openreview.net
In this paper, we study the statistical properties of distributed kernel ridge regression
together with random features (DKRR-RF), and obtain optimal generalization bounds under …
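The random features in DKRR-RF are typically random Fourier features, which map inputs to a low-dimensional space whose inner products approximate the Gaussian kernel. A minimal sketch of that standard construction (the bandwidth and feature count here are illustrative, not from the paper):

```python
import numpy as np

def rff_map(X, D=2000, gamma=1.0, seed=0):
    # Random Fourier features for the kernel k(x, y) = exp(-gamma * ||x - y||^2):
    # z(x) @ z(y) ≈ k(x, y), with frequencies drawn from N(0, 2 * gamma * I).
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(X.shape[1], D))
    b = rng.uniform(0, 2 * np.pi, D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))
Z = rff_map(X)
K_true = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
max_err = np.abs(K_true - Z @ Z.T).max()
print(max_err)
```

In a distributed setting, each machine can then run plain linear ridge regression on Z, so both computation and communication scale with the feature dimension D rather than the sample size.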

Distributed Nyström kernel learning with communications

R Yin, W Wang, D Meng - International Conference on …, 2021 - proceedings.mlr.press
We study the statistical performance of distributed kernel ridge regression with Nyström
(DKRR-NY) and with Nyström and iterative solvers (DKRR-NY-PCG), and successfully derive …

Optimal Rates for Agnostic Distributed Learning

J Li, Y Liu, W Wang - IEEE Transactions on Information Theory, 2023 - ieeexplore.ieee.org
The existing optimal rates for distributed kernel ridge regression (DKRR) often rely on the strict
assumption that the true concept belongs to the hypothesis space. However …

Communication-Efficient Nonparametric Quantile Regression via Random Features

C Wang, T Li, X Zhang, X Feng, X He - Journal of Computational …, 2024 - Taylor & Francis
This article introduces a refined algorithm designed for distributed nonparametric quantile
regression in a reproducing kernel Hilbert space (RKHS). Unlike existing nonparametric …