Bayesian statistics and modelling
Bayesian statistics is an approach to data analysis based on Bayes' theorem, where
available knowledge about parameters in a statistical model is updated with the information …
Priors in Bayesian deep learning: A review
V Fortuin - International Statistical Review, 2022 - Wiley Online Library
While the choice of prior is one of the most critical parts of the Bayesian inference workflow,
recent Bayesian deep learning models have often fallen back on vague priors, such as …
What can transformers learn in-context? A case study of simple function classes
In-context learning is the ability of a model to condition on a prompt sequence consisting of
in-context examples (input-output pairs corresponding to some task) along with a new query …
Diffusion with forward models: Solving stochastic inverse problems without direct supervision
Denoising diffusion models are a powerful type of generative models used to capture
complex distributions of real-world signals. However, their applicability is limited to …
Implicit neural representations with periodic activation functions
Implicitly defined, continuous, differentiable signal representations parameterized by neural
networks have emerged as a powerful paradigm, offering many possible benefits over …
From data to functa: Your data point is a function and you can treat it like one
It is common practice in deep learning to represent a measurement of the world on a
discrete grid, e.g. a 2D grid of pixels. However, the underlying signal represented by these …
Set transformer: A framework for attention-based permutation-invariant neural networks
Many machine learning tasks such as multiple instance learning, 3D shape recognition, and
few-shot image classification are defined on sets of instances. Since solutions to such …
CARD: Classification and regression diffusion models
Learning the distribution of a continuous or categorical response variable y given its
covariates x is a fundamental problem in statistics and machine learning. Deep neural …
Transformer neural processes: Uncertainty-aware meta learning via sequence modeling
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …
MetaSDF: Meta-learning signed distance functions
Neural implicit shape representations are an emerging paradigm that offers many potential
benefits over conventional discrete representations, including memory efficiency at a high …