Manifold learning: What, how, and why
Manifold learning (ML), also known as nonlinear dimension reduction, is a set of methods to
find the low-dimensional structure of data. Dimension reduction for large, high-dimensional …
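For concreteness, here is a minimal sketch of one standard manifold-learning method, Laplacian Eigenmaps, via scikit-learn's SpectralEmbedding on a synthetic swiss roll; the dataset, neighbor count, and target dimension are illustrative choices, not taken from the paper.

# Sketch: nonlinear dimension reduction of a 3-d point cloud to 2-d
# coordinates with Laplacian Eigenmaps (illustrative parameters).
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import SpectralEmbedding

X, t = make_swiss_roll(n_samples=2000, noise=0.05, random_state=0)
emb = SpectralEmbedding(n_components=2, n_neighbors=10)
Y = emb.fit_transform(X)          # eigenvectors of a k-NN graph Laplacian
print(Y.shape)                    # (2000, 2): recovered low-dim coordinates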
Improved spectral convergence rates for graph Laplacians on ε-graphs and k-NN graphs
J Calder, NG Trillos - Applied and Computational Harmonic Analysis, 2022 - Elsevier
In this paper we improve the spectral convergence rates for graph-based approximations of
weighted Laplace-Beltrami operators constructed from random data. We utilize regularity of …
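As a hedged illustration of the objects involved (not the paper's proof technique), the sketch below builds an ε-graph Laplacian from random samples on the unit circle and computes its lowest eigenvalues, which approximate the Laplace-Beltrami spectrum of S^1 up to a kernel- and density-dependent constant; all scalings here are illustrative.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, eps = 1000, 0.3
theta = rng.uniform(0, 2 * np.pi, n)
X = np.c_[np.cos(theta), np.sin(theta)]          # random data on the manifold S^1

W = (squareform(pdist(X)) < eps) / (n * eps**3)  # eps-graph, 1/(n*eps^(d+2)) scaling, d = 1
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W                   # unnormalized graph Laplacian

vals = eigh(L, eigvals_only=True)[:6]
# Laplace-Beltrami eigenvalues on S^1 are 0, 1, 1, 4, 4, 9, ...; the graph
# eigenvalues should reproduce this pattern up to a constant factor.
print(vals)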
Lipschitz regularity of graph Laplacians on random data clouds
In this paper we study Lipschitz regularity of elliptic PDEs on geometric graphs, constructed
from random data points. The data points are sampled from a distribution supported on a …
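A small sketch of the setting (illustrative, not from the paper): solve a graph Poisson equation L u = f on an ε-graph over a random data cloud, then evaluate a discrete Lipschitz quotient of the solution over graph edges.

import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
n, eps = 500, 0.2
X = rng.uniform(0, 1, size=(n, 2))             # random data cloud in [0, 1]^2
W = (squareform(pdist(X)) < eps).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

f = X[:, 0] - X[:, 0].mean()                   # zero-mean source term
u, *_ = np.linalg.lstsq(L, f, rcond=None)      # minimum-norm solution of L u = f
i, j = np.nonzero(W)                           # graph edges
lip = np.max(np.abs(u[i] - u[j]) / np.linalg.norm(X[i] - X[j], axis=1))
print(lip)                                     # discrete Lipschitz constant over edges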
Asymptotic frequentist coverage properties of Bayesian credible sets for sieve priors
J Rousseau, B Szabo - The Annals of Statistics, 2020 - JSTOR
We investigate the frequentist coverage properties of (certain) Bayesian credible sets in a
general, adaptive, nonparametric framework. It is well known that the construction of …
Minimax optimal regression over Sobolev spaces via Laplacian regularization on neighborhood graphs
A Green, S Balakrishnan… - … Conference on Artificial …, 2021 - proceedings.mlr.press
In this paper we study the statistical properties of Laplacian smoothing, a graph-based
approach to nonparametric regression. Under standard regularity conditions, we establish …
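The estimator itself is a single linear solve. Below is a minimal sketch, assuming an ε-neighborhood graph on synthetic 1-d data; the bandwidth and penalty lam are illustrative, with no claim of the tuning the paper analyzes.

import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(2)
n = 400
X = rng.uniform(0, 1, size=(n, 1))
y = np.sin(4 * np.pi * X[:, 0]) + 0.3 * rng.standard_normal(n)   # noisy labels

W = (squareform(pdist(X)) < 0.05).astype(float)  # eps-neighborhood graph
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

lam = 0.1
# Laplacian smoothing: argmin_f ||f - y||^2 + lam * f^T L f  =>  (I + lam*L) f = y
f_hat = np.linalg.solve(np.eye(n) + lam * L, y)
print(np.mean((f_hat - np.sin(4 * np.pi * X[:, 0])) ** 2))       # error vs. truth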
Bayesian inference in high-dimensional models
Models whose dimension exceeds the available sample size are now commonly used in
various applications. Sensible inference is possible by exploiting a lower-dimensional structure. In …
Minimax optimal regression over Sobolev spaces via Laplacian Eigenmaps on neighbourhood graphs
A Green, S Balakrishnan… - Information and Inference …, 2023 - academic.oup.com
In this paper, we study the statistical properties of Principal Components Regression with
Laplacian Eigenmaps (PCR-LE), a method for non-parametric regression based on …
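PCR-LE replaces the penalty with a projection: regress y onto the K lowest-frequency Laplacian eigenvectors. A minimal sketch under the same illustrative setup as above (the graph, K, and data are assumptions, not the paper's choices):

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.linalg import eigh

rng = np.random.default_rng(3)
n = 400
X = rng.uniform(0, 1, size=(n, 1))
y = np.sin(4 * np.pi * X[:, 0]) + 0.3 * rng.standard_normal(n)

W = (squareform(pdist(X)) < 0.05).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

K = 15
vals, vecs = eigh(L)             # eigenpairs sorted by increasing eigenvalue
V = vecs[:, :K]                  # Laplacian Eigenmaps basis
f_hat = V @ (V.T @ y)            # least-squares fit = orthogonal projection
print(np.mean((f_hat - np.sin(4 * np.pi * X[:, 0])) ** 2))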
Bayesian spiked Laplacian graphs
LL Duan, G Michailidis, M Ding - Journal of Machine Learning Research, 2023 - jmlr.org
In network analysis, it is common to work with a collection of graphs that exhibit
heterogeneity. For example, neuroimaging data from patient cohorts are increasingly …
Posterior consistency of semi-supervised regression on graphs
Graph-based semi-supervised regression (SSR) involves estimating the value of a function
on a weighted graph from its values (labels) on a small subset of the vertices; it can be …
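A standard concrete instance is harmonic extension: fix the labeled values and minimize the Dirichlet energy f^T L f over the remaining vertices. A minimal sketch, with the graph, label set, and target function as illustrative assumptions (and assuming the graph is connected so the subsystem is solvable):

import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(4)
n = 300
X = rng.uniform(0, 1, size=(n, 2))
W = (squareform(pdist(X)) < 0.15).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

labeled = rng.choice(n, size=20, replace=False)        # small labeled subset
unlabeled = np.setdiff1d(np.arange(n), labeled)
y_l = X[labeled, 0]                                    # labels: first coordinate

# Harmonic extension: solve L_uu f_u = -L_ul y_l for the unlabeled values
f = np.empty(n)
f[labeled] = y_l
f[unlabeled] = np.linalg.solve(
    L[np.ix_(unlabeled, unlabeled)],
    -L[np.ix_(unlabeled, labeled)] @ y_l,
)
print(np.max(np.abs(f[unlabeled] - X[unlabeled, 0])))  # worst-case error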
A maximum principle argument for the uniform convergence of graph Laplacian regressors
N Garcia Trillos, RW Murray - SIAM Journal on Mathematics of Data Science, 2020 - SIAM
This paper investigates the use of methods from partial differential equations and the
calculus of variations to study learning problems that are regularized using graph …
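The discrete maximum principle at the heart of such arguments is easy to check numerically: the Laplacian-regularized estimate solving (I + lam*L) f = y never leaves [min y, max y]. A small sketch on an illustrative random geometric graph:

import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(5)
n = 300
X = rng.uniform(0, 1, size=(n, 2))
W = (squareform(pdist(X)) < 0.15).astype(float)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

y = rng.standard_normal(n)
f = np.linalg.solve(np.eye(n) + L, y)            # regressor with lam = 1
print(y.min() <= f.min() and f.max() <= y.max()) # True: maximum principle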