Artificial neural networks for photonic applications—from algorithms to implementation: tutorial
This tutorial–review on applications of artificial neural networks in photonics targets a broad
audience, ranging from optical research and engineering communities to computer science …
Deep learning and generative methods in cheminformatics and chemical biology: navigating small molecule space intelligently
The number of 'small' molecules that may be of interest to chemical biologists—chemical
space—is enormous, but the fraction that have ever been made is tiny. Most strategies are …
Evaluation of neural architectures trained with square loss vs cross-entropy in classification tasks
Modern neural architectures for classification tasks are trained using the cross-entropy loss,
which is widely believed to be empirically superior to the square loss. In this work we …
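A minimal sketch of the two training objectives being compared, written in PyTorch; the model, data, and hyperparameters below are placeholders, not the paper's experimental setup:

```python
# Hypothetical side-by-side of cross-entropy vs square loss on the
# same classifier; names and dimensions are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

def train_step(model, x, y, loss_name, num_classes, opt):
    """One optimization step under either loss. y holds integer labels."""
    logits = model(x)
    if loss_name == "cross_entropy":
        loss = F.cross_entropy(logits, y)
    else:  # square loss: regress the logits onto one-hot targets
        targets = F.one_hot(y, num_classes).float()
        loss = F.mse_loss(logits, targets)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Toy usage with a linear classifier on random data
model = nn.Linear(20, 5)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(32, 20)
y = torch.randint(0, 5, (32,))
for name in ("cross_entropy", "square"):
    print(name, train_step(model, x, y, name, 5, opt))
```

Variants in the literature rescale the one-hot targets for the square loss; the paper's exact protocol may differ from this sketch.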
Latent space oddity: on the curvature of deep generative models
Deep generative models provide a systematic way to learn nonlinear data distributions,
through a set of latent variables and a nonlinear "generator" function that maps latent points …
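One concrete object behind such curvature analyses is the pullback metric that the generator's Jacobian induces on the latent space; the sketch below computes it for a stand-in generator (the network and dimensions are illustrative, not the paper's):

```python
# Pullback (Riemannian) metric induced by a generator g: latent -> data,
# M(z) = J_g(z)^T J_g(z). The generator here is a toy stand-in.
import torch
from torch.autograd.functional import jacobian

g = torch.nn.Sequential(
    torch.nn.Linear(2, 16), torch.nn.Tanh(), torch.nn.Linear(16, 10)
)

def pullback_metric(z):
    """Metric tensor at latent point z (shape: latent_dim x latent_dim)."""
    J = jacobian(g, z)  # shape: (10, 2) for a 2-D latent, 10-D output
    return J.T @ J

z = torch.zeros(2)
M = pullback_metric(z)
print(M)  # latent curve lengths are measured by sqrt(dz^T M dz)
```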
Classification vs regression in overparameterized regimes: Does the loss function matter?
We compare classification and regression tasks in an overparameterized linear model with
Gaussian features. On the one hand, we show that with sufficient overparameterization all …
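A toy illustration of the setting, assuming a minimum-l2-norm interpolator (via the pseudoinverse) used for both tasks: fit real-valued +/-1 targets by least squares, then classify with the sign of the prediction. All dimensions are arbitrary choices:

```python
# Overparameterized linear model with Gaussian features: more features
# than samples, fit by the minimum-norm interpolator.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 500                       # n samples, d >> n features
X = rng.standard_normal((n, d))
w_star = rng.standard_normal(d) / np.sqrt(d)
y = np.sign(X @ w_star)              # binary labels from a linear rule

# Minimum-norm solution interpolating the +/-1 labels (square loss)
w_hat = np.linalg.pinv(X) @ y
print("train labels interpolated:", np.allclose(X @ w_hat, y))

# Classification accuracy of the regression solution on fresh data
X_test = rng.standard_normal((2000, d))
acc = np.mean(np.sign(X_test @ w_hat) == np.sign(X_test @ w_star))
print("test accuracy of sign(X w_hat):", acc)
```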
Understanding neural networks with reproducing kernel Banach spaces
Characterizing the function spaces corresponding to neural networks can provide a way to
understand their properties. In this paper we discuss how the theory of reproducing kernel …
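As a rough sketch of the kind of object such analyses work with (the notation here is generic, not necessarily the paper's exact setup): an infinite-width shallow network can be written as a superposition of neurons weighted by a signed measure, with a total-variation-type norm:

```latex
% Generic integral representation behind RKBS analyses of shallow
% networks; sigma is the activation, mu a signed measure over neurons.
\[
  f_\mu(x) \;=\; \int_{\mathbb{S}^{d-1}\times\mathbb{R}}
     \sigma(\langle w, x\rangle + b)\, \mathrm{d}\mu(w,b),
  \qquad
  \|f\| \;=\; \inf_{\mu \,:\, f_\mu = f} \|\mu\|_{\mathrm{TV}}.
\]
```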
Geometrically enriched latent spaces
A common assumption in generative models is that the generator immerses the latent space
into a Euclidean ambient space. Instead, we consider the ambient space to be a …
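The assumption change can be expressed as a pullback metric; in generic notation (not necessarily the paper's), if the ambient space carries a Riemannian metric $M_X$ instead of the Euclidean one, the generator $g$ induces on the latent space:

```latex
% Pullback of a non-Euclidean ambient metric through the generator g.
\[
  M_Z(z) \;=\; J_g(z)^{\top}\, M_X\!\big(g(z)\big)\, J_g(z),
  \qquad J_g(z) = \frac{\partial g}{\partial z}(z),
\]
% which reduces to the usual J_g^T J_g when M_X is the identity
% (the Euclidean ambient-space case).
```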
Toward large kernel models
A Abedsoltan, M Belkin… - … Conference on Machine …, 2023 - proceedings.mlr.press
Recent studies indicate that kernel machines can often perform similarly or better than deep
neural networks (DNNs) on small datasets. The interest in kernel machines has been …
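For reference, the baseline object being scaled is a kernel machine such as kernel ridge regression; a small-data sketch with scikit-learn follows (the paper's methods target much larger datasets than this):

```python
# Small-data kernel machine: kernel ridge regression with an RBF kernel.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.5)
model.fit(X, y)
print("train MSE:", np.mean((model.predict(X) - y) ** 2))
```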
Domain adaptation by joint distribution invariant projections
Domain adaptation addresses the learning problem where the training data are sampled
from a source joint distribution (source domain), while the test data are sampled from a …
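A heavily simplified sketch of the general idea: search for a linear projection under which source and target samples have small maximum mean discrepancy (MMD). This toy version aligns marginal feature distributions only, whereas the paper's criterion involves the joint distribution, so it is not the authors' algorithm:

```python
# Toy distribution-invariant projection via random search over
# orthonormal maps; aligns marginals, not joints.
import numpy as np

def rbf_mmd2(A, B, gamma=1.0):
    """Squared MMD between samples A and B under an RBF kernel."""
    def k(X, Y):
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(A, A).mean() + k(B, B).mean() - 2 * k(A, B).mean()

rng = np.random.default_rng(0)
Xs = rng.standard_normal((100, 5)) + 1.0   # source features (shifted)
Xt = rng.standard_normal((100, 5))         # target features

best_W, best_mmd = None, np.inf
for _ in range(200):
    W, _ = np.linalg.qr(rng.standard_normal((5, 2)))  # orthonormal columns
    m = rbf_mmd2(Xs @ W, Xt @ W)
    if m < best_mmd:
        best_W, best_mmd = W, m
print("best projected MMD^2:", best_mmd)
```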
Domain generalization by joint-product distribution alignment
In this work, we address the problem of domain generalization for classification, where the
goal is to learn a classification model on a set of source domains and generalize it to a target …
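In generic form (illustrative only; the paper's specific joint-product construction may differ), multi-source alignment objectives of this kind combine a task loss with a distance between the source domains' joint feature-label distributions:

```latex
% Generic multi-source alignment objective; P_1..P_m are source
% domains, h a shared feature map, c a classifier, D a distribution
% distance such as MMD. Not the paper's exact formulation.
\[
  \min_{h,\,c}\;
  \sum_{i=1}^{m} \mathbb{E}_{(x,y)\sim P_i}\big[\ell(c(h(x)), y)\big]
  \;+\; \lambda \sum_{i<j} D\big(P_i^{(h(X),\,Y)},\, P_j^{(h(X),\,Y)}\big).
\]
```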