Recent advances and future prospects for memristive materials, devices, and systems

MK Song, JH Kang, X Zhang, W Ji, A Ascoli… - ACS …, 2023 - ACS Publications
Memristive technology has been rapidly emerging as a potential alternative to traditional
CMOS technology, which is facing fundamental limitations in its development. Since oxide …

Compute in‐memory with non‐volatile elements for neural networks: A review from a co‐design perspective

W Haensch, A Raghunathan, K Roy… - Advanced …, 2023 - Wiley Online Library
Deep learning has become ubiquitous, touching daily lives across the globe. Today,
traditional computer architectures are stressed to their limits in efficiently executing the …

Electrochemical ionic synapses: progress and perspectives

M Huang, M Schwacke, M Onen… - Advanced …, 2023 - Wiley Online Library
Artificial neural networks based on crossbar arrays of analog programmable resistors can
address the high energy challenge of conventional hardware in artificial intelligence …

A review on device requirements of resistive random access memory (RRAM)-based neuromorphic computing

JH Yoon, YW Song, W Ham, JM Park, JY Kwon - APL Materials, 2023 - pubs.aip.org
With the arrival of the era of big data, the conventional von Neumann architecture is now
insufficient owing to its high latency and energy consumption that originate from its …

Device‐Algorithm Co‐Optimization for an On‐Chip Trainable Capacitor‐Based Synaptic Device with IGZO TFT and Retention‐Centric Tiki‐Taka Algorithm

J Won, J Kang, S Hong, N Han, M Kang… - Advanced …, 2023 - Wiley Online Library
Analog in‐memory computing synaptic devices are widely studied for efficient
implementation of deep learning. However, synaptic devices based on resistive memory …

A comprehensive review of advanced trends: from artificial synapses to neuromorphic systems with consideration of non-ideal effects

K Kim, MS Song, H Hwang, S Hwang… - Frontiers in Neuroscience, 2024 - frontiersin.org
A neuromorphic system is composed of hardware-based artificial neurons and synaptic
devices, designed to improve the efficiency of neural computations inspired by energy …

Enabling training of neural networks on noisy hardware

T Gokmen - Frontiers in Artificial Intelligence, 2021 - frontiersin.org
Deep neural networks (DNNs) are typically trained using the conventional stochastic
gradient descent (SGD) algorithm. However, SGD performs poorly when applied to train …

Retention-aware zero-shifting technique for Tiki-Taka algorithm-based analog deep learning accelerator

K Noh, H Kwak, J Son, S Kim, M Um, M Kang, D Kim… - Science …, 2024 - science.org
We present the fabrication of 4K-scale electrochemical random-access memory (ECRAM)
cross-point arrays for analog neural network training accelerator and an electrical …

Fast and robust analog in-memory deep neural network training

MJ Rasch, F Carta, O Fagbohungbe… - Nature …, 2024 - nature.com
Analog in-memory computing is a promising future technology for efficiently accelerating
deep learning networks. While using in-memory computing to accelerate the inference …

Neural network learning using non-ideal resistive memory devices

Y Kim, T Gokmen, H Miyazoe, P Solomon… - Frontiers in …, 2022 - frontiersin.org
We demonstrate a modified stochastic gradient (Tiki-Taka v2 or TTv2) algorithm for deep
learning network training in a cross-bar array architecture based on ReRAM cells. There …