Pixelated butterfly: Simple and efficient sparse training for neural network models

T Dao, B Chen, K Liang, J Yang, Z Song… - arXiv preprint arXiv …, 2021 - arxiv.org
Overparameterized neural networks generalize well but are expensive to train. Ideally, one
would like to reduce their computational cost while retaining their generalization benefits …

Tunable efficient unitary neural networks (EUNN) and their application to RNNs

L Jing, Y Shen, T Dubcek, J Peurifoy… - International …, 2017 - proceedings.mlr.press
Using unitary (instead of general) matrices in artificial neural networks (ANNs) is a promising
way to solve the gradient explosion/vanishing problem, as well as to enable ANNs to learn …
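The mechanism behind this entry is that a unitary (or real orthogonal) recurrent matrix preserves vector norms, so repeated application can neither explode nor vanish the signal. A minimal sketch, using a simple product-of-Givens-rotations parameterization (EUNN composes such elementary rotations, though this is not the paper's exact scheme):

```python
import numpy as np

def givens(n, i, j, theta):
    """n x n rotation acting on coordinates (i, j); orthogonal by construction."""
    G = np.eye(n)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i], G[j, j] = c, c
    G[i, j], G[j, i] = -s, s
    return G

rng = np.random.default_rng(0)
n = 8

# A product of rotations is still orthogonal (real-unitary).
U = np.eye(n)
for i in range(n - 1):
    U = givens(n, i, i + 1, rng.uniform(0, 2 * np.pi)) @ U

# Apply the recurrent matrix many times, as an RNN would:
x = rng.standard_normal(n)
h = x.copy()
for _ in range(1000):
    h = U @ h

# The norm is preserved after 1000 steps: no explosion, no vanishing.
assert np.isclose(np.linalg.norm(h), np.linalg.norm(x))
```

With a general (non-unitary) weight matrix, the same 1000-step loop would scale the norm by the spectral radius to the 1000th power, which is exactly the explosion/vanishing problem the paper targets.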

Matrix optimization on universal unitary photonic devices

S Pai, B Bartlett, O Solgaard, DAB Miller - Physical review applied, 2019 - APS
Universal unitary photonic devices can apply arbitrary unitary transformations to a vector of
input modes and provide a promising hardware platform for fast and energy-efficient …

Evolution strategies for continuous optimization: A survey of the state-of-the-art

Z Li, X Lin, Q Zhang, H Liu - Swarm and Evolutionary Computation, 2020 - Elsevier
Evolution strategies are a class of evolutionary algorithms for black-box optimization and
achieve state-of-the-art performance on many benchmarks and real-world applications …
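For concreteness, the simplest member of the family this survey covers is the (1+1) evolution strategy: Gaussian mutation plus greedy selection, with step-size adaptation in the spirit of the classic 1/5th success rule. A textbook sketch (not an algorithm from the survey itself), minimizing the sphere function as a black box:

```python
import numpy as np

def one_plus_one_es(f, x0, sigma=1.0, iters=2000, seed=0):
    """(1+1)-ES: keep the mutant only if it does not worsen the objective."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(x.size)  # Gaussian mutation
        fy = f(y)
        if fy <= fx:                 # selection: accept non-worsening offspring
            x, fx = y, fy
            sigma *= 1.22            # success: widen the search
        else:
            sigma *= 0.95            # failure: shrink the step
    return x, fx

sphere = lambda z: float(np.sum(np.square(z)))
x_best, f_best = one_plus_one_es(sphere, np.full(5, 3.0))
```

Only function evaluations are used, never gradients, which is what makes ES applicable to the black-box settings the survey describes.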

Learning quantum data with the quantum earth mover's distance

BT Kiani, G De Palma, M Marvian… - Quantum Science and …, 2022 - iopscience.iop.org
Quantifying how far the output of a learning algorithm is from its target is an essential task in
machine learning. However, in quantum settings, the loss landscapes of commonly used …

Kaleidoscope: An efficient, learnable representation for all structured linear maps

T Dao, NS Sohoni, A Gu, M Eichhorn, A Blonder… - arXiv preprint arXiv …, 2020 - arxiv.org
Modern neural network architectures use structured linear transformations, such as low-rank
matrices, sparse matrices, permutations, and the Fourier transform, to improve inference …

Exploring shallow-depth boson sampling: Toward a scalable quantum advantage

B Go, C Oh, L Jiang, H Jeong - Physical Review A, 2024 - APS
Boson sampling is a sampling task proven to be hard to simulate efficiently using classical
computers under plausible assumptions, which makes it an appealing candidate for …

Efficient identification of butterfly sparse matrix factorizations

L Zheng, E Riccietti, R Gribonval - SIAM Journal on Mathematics of Data …, 2023 - SIAM
Fast transforms correspond to factorizations of the form Z = X^(1) ⋯ X^(J), where each factor is sparse and
possibly structured. This paper investigates essential uniqueness of such factorizations, i.e. …
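As a minimal illustration of the butterfly factorizations these entries study, the 4-point DFT matrix splits into a permutation and two sparse butterfly factors. This is the textbook radix-2 Cooley–Tukey factorization, not any specific paper's construction:

```python
import numpy as np

N = 4
omega = np.exp(-2j * np.pi / N)          # primitive 4th root of unity

P = np.eye(N)[[0, 2, 1, 3]]              # even-odd (bit-reversal) permutation
F2 = np.array([[1, 1], [1, -1]], dtype=complex)
block = np.kron(np.eye(2), F2)           # two independent 2-point DFTs
D = np.diag([1, omega])                  # twiddle factors
B = np.block([[np.eye(2), D],
              [np.eye(2), -D]])          # sparse butterfly factor

F4 = B @ block @ P                       # dense 4x4 DFT from sparse factors
dft = np.fft.fft(np.eye(N))              # reference DFT matrix
assert np.allclose(F4, dft)
```

Each factor has only O(N) nonzeros, so applying them in sequence costs O(N log N) for a transform whose dense matrix-vector product would cost O(N^2); recovering such sparse factors from the dense matrix is the identification problem the Zheng et al. entry addresses.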

Butterfly transform: An efficient FFT-based neural architecture design

KA Vahid, A Prabhu, A Farhadi… - 2020 IEEE/CVF …, 2020 - ieeexplore.ieee.org
In this paper, we show that extending the butterfly operations from the FFT algorithm to a
general Butterfly Transform (BFT) can be beneficial in building an efficient block structure for …
