Riemannian flow matching on general geometries
We propose Riemannian Flow Matching (RFM), a simple yet powerful framework for training
continuous normalizing flows on manifolds. Existing methods for generative modeling on …
Deep Laplacian-based options for temporally-extended exploration
M Klissarov, MC Machado - arXiv preprint arXiv:2301.11181, 2023 - arxiv.org
Selecting exploratory actions that generate a rich stream of experience for better learning is
a fundamental challenge in reinforcement learning (RL). An approach to tackle this problem …
Contrastive learning is spectral clustering on similarity graph
Contrastive learning is a powerful self-supervised learning method, but we have a limited
theoretical understanding of how it works and why it works. In this paper, we prove that …
Learning neural eigenfunctions for unsupervised semantic segmentation
Unsupervised semantic segmentation is a long-standing challenge in computer vision with
great significance. Spectral clustering is a theoretically grounded solution to it where the …
Improved operator learning by orthogonal attention
Neural operators, as an efficient surrogate model for learning the solutions of PDEs, have
received extensive attention in the field of scientific machine learning. Among them, attention …
The edge of orthogonality: A simple view of what makes BYOL tick
Self-predictive unsupervised learning methods such as BYOL or SimSiam have shown
impressive results, and counter-intuitively, do not collapse to trivial representations. In this …
Neural harmonics: bridging spectral embedding and matrix completion in self-supervised learning
M Munkhoeva, I Oseledets - Advances in Neural …, 2024 - proceedings.neurips.cc
Self-supervised methods received tremendous attention thanks to their seemingly heuristic
approach to learning representations that respect the semantics of the data without any …
Contrastive Learning as Kernel Approximation
KC Tsiolis - arXiv preprint arXiv:2309.02651, 2023 - arxiv.org
In standard supervised machine learning, it is necessary to provide a label for every input in
the data. While raw data in many application domains is easily obtainable on the Internet …
Spectral representation learning for conditional moment models
Many problems in causal inference and economics can be formulated in the framework of
conditional moment models, which characterize the target function through a collection of …
Addressing Sample Inefficiency in Multi-View Representation Learning
Non-contrastive self-supervised learning (NC-SSL) methods like BarlowTwins and VICReg
have shown great promise for label-free representation learning in computer vision. Despite …