From Bricks to Bridges: Product of Invariances to Enhance Latent Space Communication
It has been observed that representations learned by distinct neural networks conceal
structural similarities when the models are trained under similar inductive biases. From a …
Latent Functional Map
Neural models learn data representations that lie on low-dimensional manifolds, yet
modeling the relation between these representational spaces is an ongoing challenge. By …
Scalable unsupervised alignment of general metric and non-metric structures
Aligning data from different domains is a fundamental problem in machine learning with
broad applications across very different areas, most notably aligning experimental readouts …
Intrinsic Dimension Correlation: uncovering nonlinear connections in multimodal representations
To gain insight into the mechanisms behind machine learning methods, it is crucial to
establish connections among the features describing data points. However, these …
On the direct alignment of latent spaces
With the wide adoption of deep learning and pre-trained models rises the question of how to
effectively reuse existing latent spaces for new applications. One important question is how …