DiffusionNet: Discretization agnostic learning on surfaces
We introduce a new general-purpose approach to deep learning on three-dimensional
surfaces based on the insight that a simple diffusion layer is highly effective for spatial …
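Such a diffusion layer is often realized spectrally. The following is a minimal sketch (not the authors' code), assuming precomputed Laplace-Beltrami eigenvalues `evals`, mass-orthonormal eigenvectors `evecs`, and lumped mass weights `mass`; each feature channel is diffused for its own learned time `t` via the heat-kernel expansion.

```python
import numpy as np

def spectral_diffusion(x, evals, evecs, mass, t):
    """Diffuse vertex features x (V, C) for per-channel times t (C,).

    Illustrative sketch: project onto the Laplace-Beltrami eigenbasis,
    attenuate each mode by exp(-lambda * t), and project back.
    Assumes `evecs` (V, K) are orthonormal w.r.t. the mass weights `mass` (V,).
    """
    coeffs = evecs.T @ (mass[:, None] * x)          # (K, C) spectral coefficients
    decay = np.exp(-evals[:, None] * t[None, :])    # (K, C) heat-kernel attenuation
    return evecs @ (decay * coeffs)                 # (V, C) diffused features
```

Because the operator is expressed in the eigenbasis rather than on a fixed stencil, the same layer applies to meshes and point clouds of different resolutions, which is the sense in which the construction is discretization agnostic.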
Derivative reproducing properties for kernel methods in learning theory
DX Zhou - Journal of Computational and Applied Mathematics, 2008 - Elsevier
The regularity of functions from reproducing kernel Hilbert spaces (RKHSs) is studied in the
setting of learning theory. We provide a reproducing property for partial derivatives up to …
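The reproducing property referred to here extends point evaluation to derivatives: for a kernel that is smooth enough, partial derivatives of RKHS functions are themselves given by inner products against derivatives of the kernel. Schematically (notation illustrative),
\[
  f(x) = \langle f,\, K_x \rangle_K,
  \qquad
  (\partial^{\alpha} f)(x) = \bigl\langle f,\; \partial^{\alpha}_{x} K_x \bigr\rangle_K ,
\]
where $K_x = K(x,\cdot)$ and $\alpha$ ranges over multi-indices up to the order permitted by the smoothness of $K$.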
Modeling interactive components by coordinate kernel polynomial models
We propose the use of coordinate kernel polynomials in kernel regression. This new
approach, called coordinate kernel polynomial regression, can simultaneously identify …
Agnostically learning multi-index models with queries
We study the power of query access for the fundamental task of agnostic learning under the
Gaussian distribution. In the agnostic model, no assumptions are made on the labels of the …
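For context, the agnostic guarantee being sought is the standard one: with no assumptions on how labels are generated, the learner must output a hypothesis whose loss is within $\epsilon$ of the best function in the reference class $\mathcal{C}$, with the expectation taken over the (here Gaussian) data distribution:
\[
  \mathbb{E}\bigl[\ell(h(x), y)\bigr] \;\le\; \min_{c \in \mathcal{C}} \mathbb{E}\bigl[\ell(c(x), y)\bigr] + \epsilon .
\]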
Unregularized online learning algorithms with general loss functions
In this paper, we consider unregularized online learning algorithms in a Reproducing Kernel
Hilbert Space (RKHS). Firstly, we derive explicit convergence rates of the unregularized …
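The unregularized algorithms in question are typically plain stochastic gradient steps taken directly in the RKHS. A generic update of that kind (illustrative notation, not necessarily the paper's exact scheme) is
\[
  f_{t+1} = f_t - \eta_t\, \ell'\bigl(f_t(x_t), y_t\bigr)\, K(x_t, \cdot),
\]
where $\eta_t$ is the step size and $\ell'$ is the derivative of the loss with respect to the prediction.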
Stability and differential privacy of stochastic gradient descent for pairwise learning with non-smooth loss
Pairwise learning has recently received increasing attention since it subsumes many
important machine learning tasks (e.g., AUC maximization and metric learning) into a unifying …
DNNR: Differential nearest neighbors regression
K-nearest neighbors (KNN) is one of the earliest and most established algorithms in
machine learning. For regression tasks, KNN averages the targets within a neighborhood …
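The snippet describes the plain KNN average; the "differential" variant in the title corrects each neighbor's target with a locally estimated gradient before averaging. Below is a hedged sketch of both, with the gradient fit by least squares over each neighbor's own neighborhood (an illustration of the idea, not the authors' implementation).

```python
import numpy as np

def knn_regress(X_train, y_train, x, k=5):
    """Plain KNN regression: average the targets of the k nearest neighbors."""
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return y_train[idx].mean()

def knn_regress_differential(X_train, y_train, x, k=5):
    """Sketch of a first-order ('differential') correction: each neighbor's target is
    extrapolated toward x with a locally estimated gradient before averaging."""
    idx = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    preds = []
    for i in idx:
        # Estimate a local gradient at neighbor i by least squares over its own neighbors.
        nbrs = np.argsort(np.linalg.norm(X_train - X_train[i], axis=1))[1:k + 1]
        A = X_train[nbrs] - X_train[i]
        b = y_train[nbrs] - y_train[i]
        grad, *_ = np.linalg.lstsq(A, b, rcond=None)
        preds.append(y_train[i] + grad @ (x - X_train[i]))
    return np.mean(preds)
```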
Online pairwise learning algorithms
Pairwise learning usually refers to a learning task that involves a loss function depending on
pairs of examples, among which the most notable ones are bipartite ranking, metric learning …
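A concrete example of such a pairwise loss is the AUC-style hinge on score differences. The sketch below is illustrative only (a linear scorer and a naive pass over all earlier opposite-label examples), not the algorithm analyzed in the paper.

```python
import numpy as np

def online_pairwise_hinge(X, y, eta=0.1):
    """Online pairwise learning sketch with a linear scorer f(x) = w @ x.

    At step t, the new example is paired with each earlier example of the opposite
    label, and w takes a gradient step on the pairwise hinge loss
    max(0, 1 - (f(x_pos) - f(x_neg))), as used in AUC maximization.
    """
    w = np.zeros(X.shape[1])
    for t in range(1, len(X)):
        for s in range(t):
            if y[t] == y[s]:
                continue
            pos, neg = (X[t], X[s]) if y[t] > y[s] else (X[s], X[t])
            if 1.0 - w @ (pos - neg) > 0.0:          # hinge is active
                w += eta * (pos - neg)               # gradient step on the pair
    return w
```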
Modeling cancer progression via pathway dependencies
Cancer is a heterogeneous disease often requiring a complexity of alterations to drive a
normal cell to a malignancy and ultimately to a metastatic state. Certain genetic …
Approximating gradients for meshes and point clouds via diffusion metric
The gradient of a function defined on a manifold is perhaps one of the most important
differential objects in data analysis. Most often in practice, the input function is available only …
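A common baseline for this task fits a linear model over each point's neighborhood and takes its slope as the gradient estimate. The sketch below shows that least-squares baseline; it is not the diffusion-metric construction of the paper, and the tangent-plane projection needed on curved surfaces is omitted.

```python
import numpy as np

def local_gradient(points, values, i, k=10):
    """Estimate the gradient of a sampled function at points[i].

    Least-squares baseline: fit values[j] ~ values[i] + g @ (points[j] - points[i])
    over the k nearest neighbors and return g.
    """
    d = np.linalg.norm(points - points[i], axis=1)
    nbrs = np.argsort(d)[1:k + 1]                    # skip the point itself
    A = points[nbrs] - points[i]
    b = values[nbrs] - values[i]
    g, *_ = np.linalg.lstsq(A, b, rcond=None)
    return g
```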