An introduction to neural data compression
Neural compression is the application of neural networks and other machine learning
methods to data compression. Recent advances in statistical machine learning have opened …
Semantic probabilistic layers for neuro-symbolic learning
We design a predictive layer for structured-output prediction (SOP) that can be plugged into
any neural network, guaranteeing that its predictions are consistent with a set of predefined …
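As a rough sketch of the general idea only (not the paper's actual construction, which builds on probabilistic circuits), the snippet below enumerates joint assignments of a few binary output variables, drops those that violate a constraint, and renormalizes; the function name, toy constraint, and scores are invented for illustration.

```python
import itertools
import numpy as np

def constrained_output_distribution(unary_logits, constraint):
    """Toy constraint-consistent output layer over n binary variables.

    unary_logits: shape (n,) scores from an upstream network.
    constraint:   callable mapping an assignment tuple to True/False.
    Returns a dict {assignment: probability} with zero mass on invalid assignments.
    """
    scores = {}
    for y in itertools.product([0, 1], repeat=len(unary_logits)):
        if not constraint(y):
            continue  # assignments violating the constraint get exactly zero probability
        scores[y] = float(np.exp(np.dot(unary_logits, y)))  # fully factorized score
    z = sum(scores.values())
    return {y: s / z for y, s in scores.items()}

# Toy "exactly-one" constraint: all probability mass lands on one-hot assignments.
dist = constrained_output_distribution(
    np.array([0.3, 1.2, -0.5]), constraint=lambda y: sum(y) == 1
)
print(dist)
```

Brute-force enumeration is exponential in the number of output variables; the point of a semantic probabilistic layer is to represent the same constraint-consistent distribution compactly so that normalization stays tractable.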
Tractable control for autoregressive language generation
Despite the success of autoregressive large language models in text generation, it remains
a major challenge to generate text that satisfies complex constraints: sampling from the …
Semantic strengthening of neuro-symbolic learning
Numerous neuro-symbolic approaches have recently been proposed, typically with the goal
of adding symbolic knowledge to the output layer of a neural network. Ideally, such losses …
A survey of sum–product networks structural learning
R Xia, Y Zhang, X Liu, B Yang - Neural Networks, 2023 - Elsevier
Sum–product networks (SPNs), a family of deep probabilistic models, have made great progress in
computer vision, robotics, neuro-symbolic artificial intelligence, natural language …
Continuous mixtures of tractable probabilistic models
Probabilistic models based on continuous latent spaces, such as variational autoencoders,
can be understood as uncountable mixture models where components depend continuously …
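In symbols (notation mine, not taken from the paper), the picture is the standard latent-variable integral and its finite approximation:

```latex
% A continuous latent-variable model is an uncountable mixture: each latent
% value z indexes one component p(x | z).
p(x) \;=\; \int p(x \mid z)\, p(z)\, \mathrm{d}z
% Approximating the integral by numerical integration (quadrature or Monte
% Carlo) gives a finite mixture with fixed components, hence a tractable model:
p(x) \;\approx\; \sum_{i=1}^{K} w_i\, p(x \mid z_i),
\qquad w_i \ge 0,\quad \sum_{i=1}^{K} w_i = 1 .
```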
Neuro-symbolic entropy regularization
In structured output prediction, the goal is to jointly predict several output variables that
together encode a structured object: a path in a graph, an entity-relation triple, or an …
Understanding the distillation process from deep generative models to tractable probabilistic circuits
Probabilistic Circuits (PCs) are a general and unified computational framework for
tractable probabilistic models that support efficient computation of various inference tasks …
Sum-product networks: A survey
R Sánchez-Cauce, I París… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
A sum-product network (SPN) is a probabilistic model, based on a rooted acyclic directed
graph, in which terminal nodes represent probability distributions and non-terminal nodes …
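To make that structure concrete, here is a minimal, self-contained sketch of how such a graph is evaluated bottom-up: leaves return (log-)probabilities of single variables, product nodes factorize over disjoint variables, and sum nodes mix their children. The node classes, toy structure, and parameters are all invented for this example.

```python
import math

class Leaf:                                      # Bernoulli leaf over one binary variable
    def __init__(self, var, p_one):
        self.var, self.p_one = var, p_one
    def log_prob(self, x):
        p = self.p_one if x[self.var] == 1 else 1.0 - self.p_one
        return math.log(p)

class Product:                                   # children over disjoint sets of variables
    def __init__(self, children):
        self.children = children
    def log_prob(self, x):
        return sum(c.log_prob(x) for c in self.children)

class Sum:                                       # weighted mixture of children over the same variables
    def __init__(self, weights, children):
        self.weights, self.children = weights, children
    def log_prob(self, x):
        return math.log(sum(w * math.exp(c.log_prob(x))
                            for w, c in zip(self.weights, self.children)))

# A tiny SPN over two binary variables X0 and X1.
spn = Sum([0.4, 0.6], [
    Product([Leaf(0, 0.9), Leaf(1, 0.2)]),
    Product([Leaf(0, 0.1), Leaf(1, 0.7)]),
])
print(math.exp(spn.log_prob({0: 1, 1: 0})))      # exact likelihood in one bottom-up pass
```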
Sparse probabilistic circuits via pruning and growing
Probabilistic circuits (PCs) are a tractable representation of probability distributions allowing
for exact and efficient computation of likelihoods and marginals. There has been significant …
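The "exact marginals" part can be illustrated with the same kind of toy circuit: letting a leaf whose variable is unobserved evaluate to 1 turns the usual bottom-up likelihood pass into an exact marginalization pass. Everything below (the functional encoding, structure, and parameters) is made up for illustration and is not the paper's code.

```python
import math

def bernoulli_leaf(var, p_one):
    def ev(x):                       # x maps var -> {0, 1}; omitted variables are summed out
        if var not in x:
            return 1.0               # sum_v p(v) = 1 for a marginalized leaf
        return p_one if x[var] == 1 else 1.0 - p_one
    return ev

def product(*children):
    return lambda x: math.prod(c(x) for c in children)

def mixture(weights, children):
    return lambda x: sum(w * c(x) for w, c in zip(weights, children))

# Toy circuit over binary X0 and X1.
pc = mixture([0.3, 0.7], [
    product(bernoulli_leaf(0, 0.8), bernoulli_leaf(1, 0.1)),
    product(bernoulli_leaf(0, 0.2), bernoulli_leaf(1, 0.6)),
])
print(pc({0: 1, 1: 0}))                          # joint likelihood p(X0=1, X1=0)
print(pc({0: 1}))                                # exact marginal p(X0=1), single pass
print(pc({0: 1, 1: 0}) + pc({0: 1, 1: 1}))       # same marginal by explicit summation
```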