Transformer neural processes: Uncertainty-aware meta learning via sequence modeling
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …
Convolutional conditional neural processes
We introduce the Convolutional Conditional Neural Process (ConvCNP), a new member of
the Neural Process family that models translation equivariance in the data. Translation …
Meta-learning stationary stochastic process prediction with convolutional neural processes
Stationary stochastic processes (SPs) are a key component of many probabilistic models,
such as those for off-the-grid spatio-temporal data. They enable the statistical symmetry of …
Neural ODE processes
Neural Ordinary Differential Equations (NODEs) use a neural network to model the
instantaneous rate of change in the state of a system. However, despite their apparent …
The neural process family: Survey, applications and perspectives
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their abilities to learn meta representations and …
Bootstrapping neural processes
Unlike traditional statistical modeling, for which a user typically hand-specifies a prior,
Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural …
Affective processes: stochastic modelling of temporal context for emotion and facial expression recognition
E Sanchez, MK Tellamekala… - Proceedings of the …, 2021 - openaccess.thecvf.com
Temporal context is key to the recognition of expressions of emotion. Existing methods, which
rely on recurrent or self-attention models to enforce temporal consistency, work on the …
Contrastive conditional neural processes
Conditional Neural Processes (CNPs) bridge neural networks with probabilistic
inference to approximate functions of Stochastic Processes under meta-learning settings …
Evidential conditional neural processes
The Conditional Neural Process (CNP) family of models offers a promising direction
to tackle few-shot problems by achieving better scalability and competitive predictive …
Bayesian Context Aggregation for Neural Processes
ICLR 2021. Michael Volpp, Fabian Flürenbrock, Lukas Grossberger …