An introduction to neural data compression

Y Yang, S Mandt, L Theis - Foundations and Trends® in …, 2023 - nowpublishers.com
Neural compression is the application of neural networks and other machine learning
methods to data compression. Recent advances in statistical machine learning have opened …

Semantic probabilistic layers for neuro-symbolic learning

K Ahmed, S Teso, KW Chang… - Advances in …, 2022 - proceedings.neurips.cc
We design a predictive layer for structured-output prediction (SOP) that can be plugged into
any neural network, guaranteeing that its predictions are consistent with a set of predefined …

Tractable control for autoregressive language generation

H Zhang, M Dang, N Peng… - … on Machine Learning, 2023 - proceedings.mlr.press
Despite the success of autoregressive large language models in text generation, it remains
a major challenge to generate text that satisfies complex constraints: sampling from the …

Semantic strengthening of neuro-symbolic learning

K Ahmed, KW Chang… - … Conference on Artificial …, 2023 - proceedings.mlr.press
Numerous neuro-symbolic approaches have recently been proposed, typically with the goal
of adding symbolic knowledge to the output layer of a neural network. Ideally, such losses …

A survey of sum–product networks structural learning

R Xia, Y Zhang, X Liu, B Yang - Neural Networks, 2023 - Elsevier
Sum–product networks (SPNs) in deep probabilistic models have made great progress in
computer vision, robotics, neuro-symbolic artificial intelligence, natural language …

Continuous mixtures of tractable probabilistic models

AHC Correia, G Gala, E Quaeghebeur… - Proceedings of the …, 2023 - ojs.aaai.org
Probabilistic models based on continuous latent spaces, such as variational autoencoders,
can be understood as uncountable mixture models where components depend continuously …

Neuro-symbolic entropy regularization

K Ahmed, E Wang, KW Chang… - Uncertainty in …, 2022 - proceedings.mlr.press
In structured output prediction, the goal is to jointly predict several output variables that
together encode a structured object: a path in a graph, an entity-relation triple, or an …

Understanding the distillation process from deep generative models to tractable probabilistic circuits

X Liu, A Liu, G Van den Broeck… - … Conference on Machine …, 2023 - proceedings.mlr.press
Probabilistic Circuits (PCs) are a general and unified computational framework for
tractable probabilistic models that support efficient computation of various inference tasks …

Sum-product networks: A survey

R Sánchez-Cauce, I París… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
A sum-product network (SPN) is a probabilistic model, based on a rooted acyclic directed
graph, in which terminal nodes represent probability distributions and non-terminal nodes …

Sparse probabilistic circuits via pruning and growing

M Dang, A Liu… - Advances in Neural …, 2022 - proceedings.neurips.cc
Probabilistic circuits (PCs) are a tractable representation of probability distributions allowing
for exact and efficient computation of likelihoods and marginals. There has been significant …