A review of binarized neural networks

T Simons, DJ Lee - Electronics, 2019 - mdpi.com
In this work, we review Binarized Neural Networks (BNNs). BNNs are deep neural networks
that use binary values for activations and weights, instead of full-precision values. With …
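
A minimal NumPy sketch of the sign-based binarization scheme such networks commonly use, together with the usual straight-through gradient estimator; this illustrates the generic technique, not code from the review, and all function names are ours.

import numpy as np

def binarize(x):
    # Deterministic binarization: map full-precision values to {-1, +1}.
    return np.where(x >= 0, 1.0, -1.0)

def binary_dense_forward(x, w_real):
    # One binarized layer: both weights and activations are constrained
    # to {-1, +1}; the full-precision copy w_real is kept only so that
    # gradient updates can accumulate.
    return binarize(x @ binarize(w_real))

def straight_through_grad(grad_out, w_real, clip=1.0):
    # Straight-through estimator: pass the gradient through sign()
    # unchanged, zeroing it where |w_real| exceeds the clip threshold.
    return grad_out * (np.abs(w_real) <= clip)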

Memristors for energy‐efficient new computing paradigms

DS Jeong, KM Kim, S Kim, BJ Choi… - Advanced Electronic …, 2016 - Wiley Online Library
In this Review, memristors are examined from the frameworks of both von Neumann and
neuromorphic computing architectures. For the former, a new logic computational process …
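
On the neuromorphic side, the appeal of memristors is that a crossbar of programmable conductances performs an analog vector-matrix multiply in a single step, via Ohm's and Kirchhoff's laws. A toy NumPy illustration with made-up values:

import numpy as np

# Conductances G (siemens) store the weight matrix; read voltages V
# applied to the rows produce column currents I = G^T V.
G = np.array([[1e-6, 5e-6],
              [2e-6, 1e-6],
              [4e-6, 3e-6]])    # 3 inputs x 2 outputs, in siemens
V = np.array([0.2, 0.1, 0.3])   # read voltages, in volts
I = G.T @ V                     # output currents, in amperes
print(I)                        # [1.6e-06 2.0e-06]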

Optimal errors and phase transitions in high-dimensional generalized linear models

J Barbier, F Krzakala, N Macris… - Proceedings of the …, 2019 - National Acad Sciences
Generalized linear models (GLMs) are used in high-dimensional machine learning,
statistics, communications, and signal processing. In this paper we analyze GLMs when the …
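
For context, the high-dimensional GLM analyzed in this line of work is usually written in the teacher-student form below; the notation is the conventional one and may differ from the paper's exact symbols:

y_\mu = \varphi\!\left( \frac{1}{\sqrt{n}} \sum_{i=1}^{n} \Phi_{\mu i}\, x_i^{*},\; a_\mu \right), \qquad \mu = 1, \dots, m,

where x^* is the hidden signal to be recovered, \Phi the known data matrix, \varphi the (possibly stochastic) output channel, and a_\mu the channel noise.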

Entropy and mutual information in models of deep neural networks

M Gabrié, A Manoel, C Luneau… - Advances in neural …, 2018 - proceedings.neurips.cc
We examine a class of stochastic deep learning models with a tractable method to compute
information-theoretic quantities. Our contributions are three-fold: (i) We show how entropies …
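
The quantities in question are the layer-wise entropies and mutual informations of a stochastic network; with input X and layer-\ell activations T_\ell they are defined as usual (standard definitions, not the paper's replica formula):

H(T_\ell) = -\,\mathbb{E}\big[\log p(T_\ell)\big], \qquad I(X; T_\ell) = H(T_\ell) - H(T_\ell \mid X).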

Expectation backpropagation: Parameter-free training of multilayer neural networks with continuous or discrete weights

D Soudry, I Hubara, R Meir - Advances in neural information …, 2014 - proceedings.neurips.cc
Multilayer Neural Networks (MNNs) are commonly trained using gradient descent-
based methods, such as BackPropagation (BP). Inference in probabilistic graphical models …
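
A hedged sketch of the idea behind expectation-style training of discrete weights: each binary weight w_i in {-1, +1} gets a real-valued mean-field parameter h_i, so its posterior mean is tanh(h_i); training updates h (the paper derives parameter-free message-passing updates, not reproduced here), and inference reads out either the mean weights or their sign. Variable names are ours.

import numpy as np

h = np.random.default_rng(0).normal(size=8)  # mean-field parameter per weight
w_mean = np.tanh(h)   # posterior mean of each binary weight
w_map = np.sign(h)    # discrete MAP readout in {-1, +1}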

Unreasonable effectiveness of learning neural networks: From accessible states and robust ensembles to basic algorithmic schemes

C Baldassi, C Borgs, JT Chayes… - Proceedings of the …, 2016 - National Acad Sciences
In artificial neural networks, learning from data is a computationally demanding task in which
a large number of connection weights are iteratively tuned through stochastic-gradient …
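
The stochastic-gradient tuning referred to here is the standard update, with learning rate \eta and a randomly drawn example (x_{\mu_t}, y_{\mu_t}) at each step:

w_{t+1} = w_t - \eta \, \nabla_{w} \mathcal{L}(w_t;\, x_{\mu_t}, y_{\mu_t}).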

BS4NN: Binarized spiking neural networks with temporal coding and learning

SR Kheradpisheh, M Mirsadeghi… - Neural Processing …, 2022 - Springer
We recently proposed the S4NN algorithm, essentially an adaptation of backpropagation to
multilayer spiking neural networks that use simple non-leaky integrate-and-fire neurons and …
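
A minimal sketch of the non-leaky integrate-and-fire neuron with time-to-first-spike coding that S4NN-style networks build on; the simulation length and the function and parameter names are ours, not the paper's:

import numpy as np

def nonleaky_if_first_spike(input_spike_times, w, threshold=1.0, T=256):
    # input_spike_times: presynaptic spike times (np.inf = no spike);
    # w: synaptic weights, same shape. The membrane potential integrates
    # each input's weight from its spike time onward and never decays;
    # the neuron emits a single spike when the threshold is crossed.
    v = 0.0
    for t in range(T):
        v += w[input_spike_times == t].sum()  # integrate new arrivals
        if v >= threshold:
            return t            # time of the first (and only) spike
    return np.inf               # the neuron stays silent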

Subdominant dense clusters allow for simple learning and high computational performance in neural networks with discrete synapses

C Baldassi, A Ingrosso, C Lucibello, L Saglietti… - Physical review …, 2015 - APS
We show that discrete synaptic weights can be efficiently used for learning in large-scale
neural systems, and lead to unanticipated computational performance. We focus on the …
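
The units in question are perceptron-like, with synapses restricted to discrete values; in the binary case the classification rule is simply (standard form, not specific to this paper):

\hat{y}(x) = \mathrm{sign}\!\left( \sum_{i=1}^{n} w_i x_i \right), \qquad w_i \in \{-1, +1\}.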

Neuromorphic computing based on emerging memory technologies

B Rajendran, F Alibart - … on Emerging and Selected Topics in …, 2016 - ieeexplore.ieee.org
In this paper, we review some of the novel emerging memory technologies and how they
can enable energy-efficient implementation of large neuromorphic computing systems. We …

The committee machine: Computational to statistical gaps in learning a two-layers neural network

B Aubin, A Maillard, F Krzakala… - Advances in …, 2018 - proceedings.neurips.cc
Heuristic tools from statistical physics have been used in the past to compute the optimal
learning and generalization errors in the teacher-student scenario in multi-layer neural …
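
The two-layer network in question is the committee machine; with K hidden units and sign activations it outputs the majority vote of its hidden units (a standard form; the paper also treats more general second layers):

\hat{y}(x) = \mathrm{sign}\!\left( \sum_{k=1}^{K} \mathrm{sign}\!\left( \frac{1}{\sqrt{n}} \sum_{i=1}^{n} W_{ki}\, x_i \right) \right).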