Riemannian flow matching on general geometries

RTQ Chen, Y Lipman - arXiv preprint arXiv:2302.03660, 2023 - arxiv.org
We propose Riemannian Flow Matching (RFM), a simple yet powerful framework for training
continuous normalizing flows on manifolds. Existing methods for generative modeling on …

Deep Laplacian-based options for temporally-extended exploration

M Klissarov, MC Machado - arXiv preprint arXiv:2301.11181, 2023 - arxiv.org
Selecting exploratory actions that generate a rich stream of experience for better learning is
a fundamental challenge in reinforcement learning (RL). An approach to tackle this problem …

Contrastive learning is spectral clustering on similarity graph

Z Tan, Y Zhang, J Yang, Y Yuan - arXiv preprint arXiv:2303.15103, 2023 - arxiv.org
Contrastive learning is a powerful self-supervised learning method, but we have a limited
theoretical understanding of how it works and why it works. In this paper, we prove that …

Learning neural eigenfunctions for unsupervised semantic segmentation

Z Deng, Y Luo - Proceedings of the IEEE/CVF International …, 2023 - openaccess.thecvf.com
Unsupervised semantic segmentation is a long-standing challenge in computer vision with
great significance. Spectral clustering is a theoretically grounded solution to it where the …

Improved operator learning by orthogonal attention

Z Xiao, Z Hao, B Lin, Z Deng, H Su - arXiv preprint arXiv:2310.12487, 2023 - arxiv.org
Neural operators, as an efficient surrogate model for learning the solutions of PDEs, have
received extensive attention in the field of scientific machine learning. Among them, attention …

The edge of orthogonality: A simple view of what makes BYOL tick

PH Richemond, A Tam, Y Tang… - International …, 2023 - proceedings.mlr.press
Self-predictive unsupervised learning methods such as BYOL or SimSiam have shown
impressive results, and counter-intuitively, do not collapse to trivial representations. In this …

Neural harmonics: bridging spectral embedding and matrix completion in self-supervised learning

M Munkhoeva, I Oseledets - Advances in Neural …, 2024 - proceedings.neurips.cc
Self-supervised methods received tremendous attention thanks to their seemingly heuristic
approach to learning representations that respect the semantics of the data without any …

Contrastive Learning as Kernel Approximation

KC Tsiolis - arXiv preprint arXiv:2309.02651, 2023 - arxiv.org
In standard supervised machine learning, it is necessary to provide a label for every input in
the data. While raw data in many application domains is easily obtainable on the Internet …

Spectral representation learning for conditional moment models

Z Wang, Y Luo, Y Li, J Zhu, B Schölkopf - arXiv preprint arXiv:2210.16525, 2022 - arxiv.org
Many problems in causal inference and economics can be formulated in the framework of
conditional moment models, which characterize the target function through a collection of …

Addressing Sample Inefficiency in Multi-View Representation Learning

KK Agrawal, A Ghosh, A Oberman… - arXiv preprint arXiv …, 2023 - arxiv.org
Non-contrastive self-supervised learning (NC-SSL) methods like BarlowTwins and VICReg
have shown great promise for label-free representation learning in computer vision. Despite …