Complexity matters: Rethinking the latent space for generative modeling
In generative modeling, numerous successful approaches leverage a low-dimensional
latent space, e.g., Stable Diffusion models the latent space induced by an encoder and …
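As a minimal illustration of modeling in a low-dimensional latent space (a PCA projection standing in for a learned encoder, and a single Gaussian standing in for the diffusion prior; all names here are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data in R^10 that actually lies on a 2-D subspace.
Z_true = rng.normal(size=(500, 2))
A = rng.normal(size=(2, 10))
X = Z_true @ A

# "Encoder": project onto the top-2 principal directions
# (a linear stand-in for a learned autoencoder).
U, S, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
E = Vt[:2]                       # encoder map: R^10 -> R^2
Z = (X - X.mean(0)) @ E.T        # latent codes

# Generative model fitted in the latent space: a single Gaussian.
mu, cov = Z.mean(0), np.cov(Z.T)
z_new = rng.multivariate_normal(mu, cov, size=4)
x_new = z_new @ E + X.mean(0)    # "decode" back to data space
```

The point of the construction: the generative step happens entirely in the 2-D latent space, and the decoder maps samples back to the 10-D data space.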
1-Lipschitz Neural Distance Fields
G Coiffier, L Béthune - Computer Graphics Forum, 2024 - Wiley Online Library
Neural implicit surfaces are a promising tool for geometry processing that represent a solid
object as the zero level set of a neural network. Usually trained to approximate a signed …
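A minimal numpy sketch of the 1-Lipschitz idea the title refers to: constraining each layer's spectral norm to at most 1 (with a 1-Lipschitz activation) makes the whole network 1-Lipschitz, so its values cannot change faster than the distance to the zero level set. This is an assumed illustration, not the paper's training procedure:

```python
import numpy as np

def project_unit_spectral(W):
    # Rescale so the spectral norm is at most 1,
    # making the linear layer 1-Lipschitz.
    s = np.linalg.norm(W, 2)
    return W / max(s, 1.0)

def lipschitz_mlp(x, weights, biases):
    # A composition of 1-Lipschitz maps is 1-Lipschitz:
    # |f(x) - f(y)| <= ||x - y|| for all x, y.
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.abs(project_unit_spectral(W) @ h + b)  # |.| is 1-Lipschitz
    return project_unit_spectral(weights[-1]) @ h + biases[-1]
```

Trained to fit a signed distance, such a network's zero level set represents the surface while the Lipschitz bound keeps the field metrically meaningful away from it.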
The Real Tropical Geometry of Neural Networks
We consider a binary classifier defined as the sign of a tropical rational function, that is, as
the difference of two convex piecewise linear functions. The parameter space of ReLU …
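The classifier described here is easy to make concrete: each convex piecewise-linear function is a max over affine pieces (a tropical polynomial), and the classifier is the sign of their difference. A minimal numpy sketch, with illustrative names:

```python
import numpy as np

def max_affine(x, W, b):
    # Convex piecewise-linear function: max over affine pieces,
    # i.e. a tropical polynomial in the (max, +) semiring.
    return np.max(W @ x + b)

def tropical_classifier(x, Wp, bp, Wn, bn):
    # Sign of a tropical rational function: the difference of
    # two convex piecewise-linear (max-affine) functions.
    return np.sign(max_affine(x, Wp, bp) - max_affine(x, Wn, bn))
```

ReLU networks fit this picture because ReLU(t) = max(t, 0) is itself tropical, which is what lets their parameter spaces be studied with tropical geometry.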
Defining Neural Network Architecture through Polytope Structures of Dataset
Current theoretical and empirical research in neural networks suggests that complex
datasets require large network architectures for thorough classification, yet the precise …
Implicit Hypersurface Approximation Capacity in Deep ReLU Networks
We develop a geometric approximation theory for deep feed-forward neural networks with
ReLU activations. Given a $d$-dimensional hypersurface in $\mathbb{R}^{d+1}$ …