Module-wise training of neural networks via the minimizing movement scheme

S Karkar, I Ayed, E de Bézenac… - Advances in Neural …, 2024 - proceedings.neurips.cc
Greedy layer-wise or module-wise training of neural networks is compelling in constrained
and on-device settings where memory is limited, as it circumvents a number of problems of …

Adaptable Hamiltonian neural networks

CD Han, B Glaz, M Haile, YC Lai - Physical Review Research, 2021 - APS
The rapid growth of research in exploiting machine learning to predict chaotic systems has
revived a recent interest in Hamiltonian neural networks (HNNs) with physical constraints …

Mapping conditional distributions for domain adaptation under generalized target shift

M Kirchmeyer, A Rakotomamonjy… - arXiv preprint arXiv …, 2021 - arxiv.org
We consider the problem of unsupervised domain adaptation (UDA) between a source and
a target domain under conditional and label shift aka Generalized Target Shift (GeTarS) …

A neuronal least-action principle for real-time learning in cortical circuits

W Senn, D Dold, AF Kungl, B Ellenberger, J Jordan… - BioRxiv, 2023 - biorxiv.org
One of the most fundamental laws of physics is the principle of least action. Motivated by its
predictive power, we introduce a neuronal least-action principle for cortical processing of …

Turning Normalizing Flows into Monge Maps with Geodesic Gaussian Preserving Flows

G Morel, L Drumetz, S Benaïchouche, N Courty… - arXiv preprint arXiv …, 2022 - arxiv.org
Normalizing Flows (NF) are powerful likelihood-based generative models that are able to
trade off between expressivity and tractability to model complex densities. A now well …

Adversarial sample detection through neural network transport dynamics

S Karkar, P Gallinari, A Rakotomamonjy - Joint European Conference on …, 2023 - Springer
We propose a detector of adversarial samples that is based on the view of neural networks
as discrete dynamic systems. The detector tells clean inputs from abnormal ones by …

LaCoOT: Layer Collapse through Optimal Transport

V Quétu, N Hezbri, E Tartaglione - arXiv preprint arXiv:2406.08933, 2024 - arxiv.org
Although deep neural networks are well-known for their remarkable performance in tackling
complex tasks, their hunger for computational resources remains a significant hurdle, posing …

Out-of-distribution Generalization in Deep Learning: Classification and Spatiotemporal Forecasting

M Kirchmeyer - 2023 - theses.hal.science
Deep learning has emerged as a powerful approach for modelling static data like images
and more recently for modelling dynamical systems like those underlying time series …

Block-wise Training of Residual Networks via the Minimizing Movement Scheme

S Karkar, I Ayed, E de Bézenac… - … Learning in the Wild at 26th …, 2022 - hal.science
End-to-end backpropagation has a few shortcomings: it requires loading the entire model
during training, which can be impossible in constrained settings, and suffers from three …

Deep learning based physical-statistics modeling of ocean dynamics

M Déchelle-Marquet - 2023 - theses.hal.science
The modeling of dynamical phenomena in geophysics and climate is based on a deep
understanding of the underlying physics, described in the form of PDEs, and on their …