Pixelated butterfly: Simple and efficient sparse training for neural network models
Overparameterized neural networks generalize well but are expensive to train. Ideally, one
would like to reduce their computational cost while retaining their generalization benefits …
Tunable efficient unitary neural networks (EUNN) and their application to RNNs
Using unitary (instead of general) matrices in artificial neural networks (ANNs) is a promising
way to solve the gradient explosion/vanishing problem, as well as to enable ANNs to learn …
Matrix optimization on universal unitary photonic devices
Universal unitary photonic devices can apply arbitrary unitary transformations to a vector of
input modes and provide a promising hardware platform for fast and energy-efficient …
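A minimal sketch of the structure behind such devices (our own illustration, not the paper's code): meshes of Mach-Zehnder-style 2x2 blocks, as in Reck/Clements-type layouts, compose into larger transforms that are always unitary. The names `mzi` and `embed`, the parametrisation convention, and the mesh layout are all assumptions made for this example.

```python
import numpy as np

def mzi(theta, phi):
    # One common MZI transfer-matrix parametrisation (a 2x2 unitary).
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.sin(theta)],
        [np.exp(1j * phi) * np.sin(theta),  np.cos(theta)],
    ])

def embed(u2, i, n):
    # Act with a 2x2 block on adjacent modes (i, i+1) of an n-mode system.
    U = np.eye(n, dtype=complex)
    U[i:i + 2, i:i + 2] = u2
    return U

n = 4
rng = np.random.default_rng(0)
U = np.eye(n, dtype=complex)
for i in (0, 1, 2, 1, 0, 1):  # a small illustrative mesh layout
    U = embed(mzi(rng.uniform(0, np.pi), rng.uniform(0, 2 * np.pi)), i, n) @ U

# Any product of embedded 2x2 unitaries is itself unitary.
assert np.allclose(U.conj().T @ U, np.eye(n))
```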
Evolution strategies for continuous optimization: A survey of the state-of-the-art
Evolution strategies are a class of evolutionary algorithms for black-box optimization and
achieve state-of-the-art performance on many benchmarks and real-world applications …
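To make the black-box setting concrete, here is a minimal (1, lambda) evolution-strategy sketch with a fixed step size on the sphere function. Practical ES variants surveyed in such work (e.g. CMA-ES) adapt the step size and covariance; the function name `es_minimize` and all parameter values below are illustrative assumptions.

```python
import numpy as np

def es_minimize(f, x0, sigma=0.3, lam=20, iters=200, seed=0):
    # Simple (1, lambda)-ES: sample offspring, keep the best (comma
    # selection discards the parent each generation).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        offspring = x + sigma * rng.standard_normal((lam, x.size))
        x = min(offspring, key=f)
    return x

# Minimise f(x) = ||x||^2 starting away from the optimum at the origin.
best = es_minimize(lambda v: float(v @ v), x0=[3.0, -2.0, 1.0])
```

With a fixed step size the iterate hovers near the optimum at a scale set by `sigma` rather than converging exactly, which is precisely why adaptive variants exist.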
Learning quantum data with the quantum earth mover's distance
Quantifying how far the output of a learning algorithm is from its target is an essential task in
machine learning. However, in quantum settings, the loss landscapes of commonly used …
Kaleidoscope: An efficient, learnable representation for all structured linear maps
Modern neural network architectures use structured linear transformations, such as low-rank
matrices, sparse matrices, permutations, and the Fourier transform, to improve inference …
Exploring shallow-depth boson sampling: Toward a scalable quantum advantage
Boson sampling is a sampling task proven to be hard to simulate efficiently using classical
computers under plausible assumptions, which makes it an appealing candidate for …
Efficient identification of butterfly sparse matrix factorizations
Fast transforms correspond to factorizations of the form Z = X^(1) ... X^(J), where each factor is sparse and
possibly structured. This paper investigates essential uniqueness of such factorizations, i.e. …
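A small sketch of what such a sparse factorization looks like (our own illustration, not the paper's construction): the 4-point DFT matrix factors into one butterfly factor, a block-diagonal factor, and an even/odd permutation — the radix-2 Cooley-Tukey structure.

```python
import numpy as np

N = 4
w = np.exp(-2j * np.pi / N)

# Dense DFT matrix, for reference.
F4 = np.array([[w ** (j * k) for k in range(N)] for j in range(N)])

# Even/odd (decimation-in-time) permutation of the inputs.
P = np.zeros((N, N))
P[0, 0] = P[1, 2] = P[2, 1] = P[3, 3] = 1

# Two independent 2-point DFTs on the permuted halves.
F2 = np.array([[1, 1], [1, -1]], dtype=complex)
block = np.block([[F2, np.zeros((2, 2))], [np.zeros((2, 2)), F2]])

# Butterfly factor recombining the halves with twiddle factors diag(1, w).
D = np.diag([1, w])
I = np.eye(2)
B = np.block([[I, D], [I, -D]])

# The sparse factors reproduce the dense transform exactly.
assert np.allclose(B @ block @ P, F4)
```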
Butterfly transform: An efficient FFT-based neural architecture design
In this paper, we show that extending the butterfly operations from the FFT algorithm to a
general Butterfly Transform (BFT) can be beneficial in building an efficient block structure for …