Complexity matters: Rethinking the latent space for generative modeling

T Hu, F Chen, H Wang, J Li… - Advances in Neural …, 2024 - proceedings.neurips.cc
In generative modeling, numerous successful approaches leverage a low-dimensional
latent space, e.g., Stable Diffusion models the latent space induced by an encoder and …

1‐Lipschitz Neural Distance Fields

G Coiffier, L Béthune - Computer Graphics Forum, 2024 - Wiley Online Library
Neural implicit surfaces are a promising tool for geometry processing that represent a solid
object as the zero level set of a neural network. Usually trained to approximate a signed …
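The snippet's core idea, representing a solid as the zero level set of a (1-Lipschitz) scalar field, can be sketched with a hypothetical stand-in for a trained network: the exact signed distance to the unit circle, f(x) = ||x|| - 1, which is 1-Lipschitz. This is an illustrative toy, not the paper's architecture.

```python
import numpy as np

# Hypothetical stand-in for a trained network: the exact signed
# distance to the unit circle in R^2, f(x) = ||x|| - 1.
# The solid disk is {x : f(x) <= 0}; its boundary is the zero level set.
def f(x):
    return np.linalg.norm(x, axis=-1) - 1.0

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 2))
y = rng.normal(size=(1000, 2))

# 1-Lipschitz check on sampled pairs: |f(x) - f(y)| <= ||x - y||.
lip = np.abs(f(x) - f(y)) <= np.linalg.norm(x - y, axis=-1) + 1e-12
print(lip.all())

# The sign of f separates inside (< 0) from outside (> 0).
print(f(np.array([0.5, 0.0])) < 0, f(np.array([2.0, 0.0])) > 0)
```

The Lipschitz bound is what makes the field's value a usable distance estimate everywhere, not just near the surface.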

The Real Tropical Geometry of Neural Networks

MC Brandenburg, G Loho, G Montúfar - arXiv preprint arXiv:2403.11871, 2024 - arxiv.org
We consider a binary classifier defined as the sign of a tropical rational function, that is, as
the difference of two convex piecewise linear functions. The parameter space of ReLU …
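The classifier described in the snippet can be made concrete: a tropical polynomial is a max of affine maps (convex and piecewise linear), a tropical rational function is the difference of two such polynomials, and the class label is the sign of that difference. The coefficients below are illustrative, not taken from the paper.

```python
import numpy as np

# Tropical polynomial: max over terms of <a_i, x> + b_i
# (a convex, piecewise linear function of x).
def tropical_poly(x, A, b):
    return np.max(x @ A.T + b, axis=-1)

# Illustrative coefficients for the two polynomials p and q.
A_p = np.array([[1.0, 0.0], [0.0, 1.0]])
b_p = np.array([0.0, 0.0])
A_q = np.array([[-1.0, -1.0]])
b_q = np.array([0.5])

def classify(x):
    # Sign of the tropical rational function p(x) - q(x):
    # the decision boundary is piecewise linear.
    return np.sign(tropical_poly(x, A_p, b_p) - tropical_poly(x, A_q, b_q))

print(classify(np.array([1.0, 1.0])))    # one side of the boundary
print(classify(np.array([-1.0, -1.0])))  # the other side
```

Since a ReLU network also computes a difference of convex piecewise linear functions, this is exactly the function class the snippet connects to ReLU parameter spaces.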

Defining Neural Network Architecture through Polytope Structures of Dataset

S Lee, A Mammadov, JC Ye - arXiv preprint arXiv:2402.02407, 2024 - arxiv.org
Current theoretical and empirical research in neural networks suggests that complex
datasets require large network architectures for thorough classification, yet the precise …

Implicit Hypersurface Approximation Capacity in Deep ReLU Networks

J Vallin, K Larsson, MG Larson - arXiv preprint arXiv:2407.03851, 2024 - arxiv.org
We develop a geometric approximation theory for deep feed-forward neural networks with
ReLU activations. Given a $d$-dimensional hypersurface in $\mathbb{R}^{d+1}$ …
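The setting can be illustrated with a minimal example: a one-hidden-layer ReLU network is continuous and piecewise linear, so its zero level set is a piecewise linear hypersurface in the ambient space. The weights below are chosen by hand to compute $f(x) = |x_1| + |x_2| - 1$, whose zero set is the unit "diamond" in $\mathbb{R}^2$; they are illustrative only.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# One-hidden-layer ReLU network computing f(x) = |x1| + |x2| - 1.
# The output is continuous piecewise linear, so the zero level set
# {x : f(x) = 0} is a piecewise linear curve (the unit diamond),
# i.e. a hypersurface of dimension d = 1 in R^{d+1} = R^2.
W1 = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
w2 = np.array([1.0, 1.0, 1.0, 1.0])

def net(x):
    return w2 @ relu(W1 @ x) - 1.0

print(net(np.array([0.5, 0.5])))  # 0.0: lies on the hypersurface
print(net(np.array([0.0, 0.0])))  # -1.0: strictly inside
```

Approximating a smooth hypersurface then amounts to choosing weights so that this piecewise linear zero set tracks the target surface closely.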