Towards convergence rate analysis of random forests for classification
Random forests have been one of the successful ensemble algorithms in machine learning,
and the basic idea is to construct a large number of random trees individually and make …
[PDF][PDF] Adaptivity to noise parameters in nonparametric active learning
A Locatelli, A Carpentier… - Proceedings of the 2017 …, 2017 - proceedings.mlr.press
Proceedings of Machine Learning Research vol 65:1–34, 2017 …
Recovery guarantees for polynomial coefficients from weakly dependent data with outliers
Learning non-linear systems from noisy, limited, and/or dependent data is an important task
across various scientific fields including statistics, engineering, computer science …
Fast learning rates with heavy-tailed losses
We study fast learning rates when the losses are not necessarily bounded and may have a
distribution with heavy tails. To enable such analyses, we introduce two new conditions: (i) …
Interclass interference suppression in multi-class problems
J Liu, M Bai, N Jiang, R Cheng, X Li, Y Wang, D Yu - Applied Sciences, 2021 - mdpi.com
Multi-classifiers are widely applied in many practical problems. However, the features that can
significantly discriminate a certain class from others are often deleted in the feature selection …
A Generalization Bound of Deep Neural Networks for Dependent Data
QH Do, BT Nguyen, LST Ho - arXiv preprint arXiv:2310.05892, 2023 - arxiv.org
Existing generalization bounds for deep neural networks require data to be independent
and identically distributed (iid). This assumption may not hold in real-life applications such …
An adaptive multiclass nearest neighbor classifier
N Puchkin, V Spokoiny - ESAIM: Probability and Statistics, 2020 - esaim-ps.org
We consider a problem of multiclass classification, where the training sample S_n = {(X_i, Y_i)}_{i=1}^n
is generated from the model P(Y = m | X = x) = η_m(x), 1 ≤ m ≤ M, and η_1(x), …, η_M(x) …
Joint Dataset Reconstruction and Power Control for Distributed Training in D2D Edge Network
The intrinsic nature of non-independent and identically distributed datasets on
heterogeneous devices slows down the distributed model training process and reduces the …
Adaptive group Lasso neural network models for functions of few variables and time-dependent data
Learning nonlinear functions from time-varying measurements is always difficult due to the
high correlation among observations. This task is more challenging when the target function …