Artificial neural networks for photonic applications—from algorithms to implementation: tutorial

P Freire, E Manuylovich, JE Prilepsky… - Advances in Optics and …, 2023 - opg.optica.org
This tutorial–review on applications of artificial neural networks in photonics targets a broad
audience, ranging from optical research and engineering communities to computer science …

Deep learning and generative methods in cheminformatics and chemical biology: navigating small molecule space intelligently

DB Kell, S Samanta, N Swainston - Biochemical Journal, 2020 - portlandpress.com
The number of 'small' molecules that may be of interest to chemical biologists—chemical
space—is enormous, but the fraction that has ever been made is tiny. Most strategies are …

Evaluation of neural architectures trained with square loss vs cross-entropy in classification tasks

L Hui, M Belkin - arXiv preprint arXiv:2006.07322, 2020 - arxiv.org
Modern neural architectures for classification tasks are trained using the cross-entropy loss,
which is widely believed to be empirically superior to the square loss. In this work we …
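A minimal sketch of the comparison the abstract describes, assuming PyTorch; the toy data, model, and hyperparameters are illustrative and not taken from the paper. The only difference between the two runs is the loss function:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 10)            # toy features
y = torch.randint(0, 3, (256,))     # toy labels for a 3-class problem

def train(loss_name):
    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(200):
        logits = model(X)
        if loss_name == "cross_entropy":
            loss = nn.functional.cross_entropy(logits, y)
        else:  # square loss against one-hot targets, the alternative studied here
            loss = nn.functional.mse_loss(logits, nn.functional.one_hot(y, 3).float())
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (model(X).argmax(1) == y).float().mean().item()

print(train("cross_entropy"), train("square"))
```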

Latent space oddity: on the curvature of deep generative models

G Arvanitidis, LK Hansen, S Hauberg - arXiv preprint arXiv:1710.11379, 2017 - arxiv.org
Deep generative models provide a systematic way to learn nonlinear data distributions,
through a set of latent variables and a nonlinear "generator" function that maps latent points …
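In the deterministic case, the geometry studied here is the pullback of the Euclidean ambient metric through the generator $g$ (a standard construction; notation ours, not verbatim from the paper):

$$ M(z) \;=\; J_g(z)^\top J_g(z), \qquad J_g(z) = \frac{\partial g(z)}{\partial z}, $$

so the length of a latent curve $z(t)$ is $\int_0^1 \sqrt{\dot z(t)^\top M(z(t))\, \dot z(t)}\, dt$, and shortest paths (geodesics) under $M$ need not be straight lines in the latent space.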

Classification vs regression in overparameterized regimes: Does the loss function matter?

V Muthukumar, A Narang, V Subramanian… - Journal of Machine …, 2021 - jmlr.org
We compare classification and regression tasks in an overparameterized linear model with
Gaussian features. On the one hand, we show that with sufficient overparameterization all …
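Both tasks can be phrased around the same minimum-$\ell_2$-norm interpolating solution, which is the object such overparameterized analyses typically compare (notation ours):

$$ \hat{w} \;=\; \arg\min_{w} \|w\|_2 \ \ \text{s.t.}\ \ Xw = y \;\;\Longrightarrow\;\; \hat{w} = X^\top (XX^\top)^{-1} y \quad (n < d,\ \operatorname{rank}(X) = n), $$

with real-valued $y$ for regression, and $y_i \in \{\pm 1\}$ with prediction $\operatorname{sign}(x^\top \hat{w})$ for classification.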

Understanding neural networks with reproducing kernel Banach spaces

F Bartolucci, E De Vito, L Rosasco… - Applied and Computational …, 2023 - Elsevier
Characterizing the function spaces corresponding to neural networks can provide a way to
understand their properties. In this paper we discuss how the theory of reproducing kernel …
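One standard infinite-width form behind such characterizations writes a shallow network as an integral over its units, with the space's norm taken as the cheapest representing measure (a sketch of the setup only; the paper's precise norm, built via a Radon-domain construction for ReLU, is in the text):

$$ f(x) = \int \sigma(\langle w, x \rangle - b)\, d\mu(w, b), \qquad \|f\| = \inf_{\mu :\, f = f_\mu} \|\mu\|_{TV}. $$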

Geometrically enriched latent spaces

G Arvanitidis, S Hauberg, B Schölkopf - arXiv preprint arXiv:2008.00565, 2020 - arxiv.org
A common assumption in generative models is that the generator immerses the latent space
into a Euclidean ambient space. Instead, we consider the ambient space to be a …
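Concretely, if the ambient space carries a Riemannian metric $G$ rather than the Euclidean one, the pullback metric on the latent space becomes (notation ours)

$$ M(z) \;=\; J_g(z)^\top\, G\big(g(z)\big)\, J_g(z), $$

which reduces to the Euclidean pullback $J_g^\top J_g$ when $G = I$; choosing $G$ is how the latent geometry is enriched.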

Toward large kernel models

A Abedsoltan, M Belkin… - … Conference on Machine …, 2023 - proceedings.mlr.press
Recent studies indicate that kernel machines can often perform similarly to, or better than,
deep neural networks (DNNs) on small datasets. The interest in kernel machines has been …
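For reference, here is the basic kernel machine being scaled, in its small-data form (a sketch assuming NumPy; kernel, bandwidth, and data are illustrative). The direct solve costs O(n^3) time and O(n^2) memory, which is the bottleneck motivating the paper's large-scale methods (not implemented here):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))          # toy training inputs
y = np.sin(X).sum(axis=1)              # toy regression targets

def gaussian_kernel(A, B, bandwidth=1.0):
    """Gram matrix k(a, b) = exp(-||a - b||^2 / (2 * bandwidth^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth**2))

lam = 1e-3                                             # ridge regularization
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # O(n^3) direct solve

X_new = rng.normal(size=(10, 5))
y_hat = gaussian_kernel(X_new, X) @ alpha              # prediction on new points
```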

Domain adaptation by joint distribution invariant projections

S Chen, M Harandi, X Jin… - IEEE Transactions on Image …, 2020 - ieeexplore.ieee.org
Domain adaptation addresses the learning problem where the training data are sampled
from a source joint distribution (source domain), while the test data are sampled from a …
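The paper's projections are learned against a discrepancy between joint distributions over features and labels; as a minimal sketch of the generic ingredient, assuming NumPy, here is an MMD-style discrepancy between linearly projected source and target features (the projection below is random where the method would optimize it, and the handling of labels is omitted):

```python
import numpy as np

def mmd2(A, B, bandwidth=1.0):
    """Biased estimate of squared MMD under a Gaussian kernel."""
    def gram(U, V):
        d2 = ((U[:, None, :] - V[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth**2))
    return gram(A, A).mean() + gram(B, B).mean() - 2 * gram(A, B).mean()

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, size=(200, 10))      # source-domain features
Xt = rng.normal(0.5, 1.2, size=(200, 10))      # shifted target-domain features
W = np.linalg.qr(rng.normal(size=(10, 3)))[0]  # an (unoptimized) projection
print(mmd2(Xs @ W, Xt @ W))  # the quantity such methods minimize over W
```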

Domain generalization by joint-product distribution alignment

S Chen, L Wang, Z Hong, X Yang - Pattern Recognition, 2023 - Elsevier
In this work, we address the problem of domain generalization for classification, where the
goal is to learn a classification model on a set of source domains and generalize it to a target …
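A generic form of such alignment-regularized objectives (not the paper's exact formulation, which aligns joint and product distributions; notation ours) is

$$ \min_{f} \ \frac{1}{m} \sum_{i=1}^{m} R_i(f) \;+\; \lambda \sum_{i < j} D\big(P_i^{f}, P_j^{f}\big), $$

where $R_i$ is the empirical risk on source domain $i$, $P_i^{f}$ the joint distribution over representations and labels it induces, and $D$ a distributional discrepancy such as the MMD used in the sketch above.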