Compute-in-memory chips for deep learning: Recent trends and prospects
Compute-in-memory (CIM) is a new computing paradigm that addresses the memory-wall
problem in hardware accelerator design for deep learning. The input vector and weight …
A 7-nm compute-in-memory SRAM macro supporting multi-bit input, weight and output and achieving 351 TOPS/W and 372.4 GOPS
In this work, we present a compute-in-memory (CIM) macro built around a standard two-port
compiler macro using a foundry 8T bit-cell in 7-nm FinFET technology. The proposed design …
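Multi-bit-input CIM macros like the one above typically stream activation bits serially: each cycle the array computes a 1-bit-input by multi-bit-weight dot product, and digital periphery shift-and-adds the partial sums by bit significance. A minimal sketch of that scheme (the function name, 4-bit input width, and use of a plain dot product for the in-array column sum are illustrative assumptions, not the paper's circuit):

```python
import numpy as np

def bit_serial_mac(inputs, weights, n_bits=4):
    """Compute dot(inputs, weights) for unsigned n_bits-wide inputs,
    processing one input bit-plane per 'cycle', as a CIM macro might."""
    acc = 0
    for b in range(n_bits):
        bit_plane = (inputs >> b) & 1              # 1-bit input vector this cycle
        partial = int(np.dot(bit_plane, weights))  # column-wise MAC in the array
        acc += partial << b                        # shift-and-add by significance
    return acc

x = np.array([3, 5, 9, 12])   # 4-bit activations
w = np.array([1, -2, 4, 3])   # signed weights stored in the bit-cells
assert bit_serial_mac(x, w) == int(np.dot(x, w))  # matches the full-precision MAC
```

The shift-and-add makes the digital accumulator, not the analog array, carry the input-bit significance, which is why such macros can scale input precision without widening the bit-cells.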
DNN+NeuroSim V2.0: An end-to-end benchmarking framework for compute-in-memory accelerators for on-chip training
DNN+ NeuroSim is an integrated framework to benchmark compute-in-memory (CIM)
accelerators for deep neural networks, with hierarchical design options from device-level, to …
A review on SRAM-based computing in-memory: Circuits, functions, and applications
Z Lin, Z Tong, J Zhang, F Wang, T Xu… - Journal of …, 2022 - iopscience.iop.org
Artificial intelligence (AI) processes data-centric applications with minimal effort. However, it
poses new challenges to system design in terms of computational speed and energy …
Two-way transpose multibit 6T SRAM computing-in-memory macro for inference-training AI edge chips
JW Su, X Si, YC Chou, TW Chang… - IEEE Journal of Solid …, 2021 - ieeexplore.ieee.org
Computing-in-memory (CIM) based on SRAM is a promising approach to achieving energy-
efficient multiply-and-accumulate (MAC) operations in artificial intelligence (AI) edge …
MC-CIM: Compute-in-memory with Monte-Carlo dropouts for Bayesian edge intelligence
We propose MC-CIM, a compute-in-memory (CIM) framework for robust, yet low power,
Bayesian edge intelligence. Deep neural networks (DNN) with deterministic weights cannot …
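The Monte-Carlo dropout idea behind MC-CIM is to run many stochastic forward passes, randomly dropping activations each time, and read predictive uncertainty from the spread of the outputs. A minimal sketch (the function name and the single-layer model are illustrative assumptions; a CIM array might realize the dropout mask by randomly gating word-lines):

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_forward(x, w, p=0.5):
    # One stochastic forward pass: drop each activation with probability p,
    # with inverted-dropout scaling so the expected output is unchanged.
    mask = rng.random(x.shape) > p
    return (x * mask) @ w / (1.0 - p)

x = rng.standard_normal(8)
w = rng.standard_normal((8, 3))
samples = np.stack([mc_dropout_forward(x, w) for _ in range(200)])
mean = samples.mean(axis=0)  # predictive mean
std = samples.std(axis=0)    # per-output uncertainty estimate
```

The per-output standard deviation is what distinguishes this from a deterministic-weight DNN: inputs the model is unsure about produce visibly wider spreads across the passes.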
A 28 nm 16 kb bit-scalable charge-domain transpose 6T SRAM in-memory computing macro
This article presents a compact, robust, and transposable SRAM in-memory computing
(IMC) macro to support feed forward (FF) and back propagation (BP) computation within a …
XOR-CIM: Compute-in-memory SRAM architecture with embedded XOR encryption
Compute-in-memory (CIM) is a promising approach that exploits the analog computation
inside the memory array to speed up the vector-matrix multiplication (VMM) for deep neural …
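The XOR-encryption idea in XOR-CIM is that weights sit in the array XOR-ed with a key, so a read-out (or an in-memory VMM) without the key yields only scrambled values, while authorized compute decrypts on the fly. A minimal sketch under assumed details (the 4-bit key, function names, and on-the-fly decryption point are illustrative, not the paper's circuit):

```python
import numpy as np

KEY = 0b1011  # hypothetical 4-bit key shared with authorized users

def xor_encrypt(weights, key=KEY):
    # Weights are stored in the SRAM array XOR-ed with the key.
    return weights ^ key

def cim_vmm(x, enc_weights, key=KEY):
    # Authorized compute: undo the XOR, then do the vector-matrix multiply.
    return x @ (enc_weights ^ key)

w = np.array([[1, 2], [3, 4], [5, 6]], dtype=np.uint8)  # 4-bit weights
x = np.array([1, 0, 2], dtype=np.int64)
enc = xor_encrypt(w)
assert np.array_equal(cim_vmm(x, enc), x @ w)   # correct result with the key
assert not np.array_equal(x @ enc, x @ w)       # scrambled result without it
```

Because XOR is its own inverse and bitwise-local, the decryption adds only a per-bit gate on the read path rather than a separate decryption pass over the whole array.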
AILC: Accelerate on-chip incremental learning with compute-in-memory technology
As AI applications become pervasive on edge devices, incremental learning of new tasks is demanded of deep neural network (DNN) models. In this article, we propose AILC, a …
Secure XOR-CIM engine: Compute-in-memory SRAM architecture with embedded XOR encryption
Compute-in-memory (CIM), where information can be processed and stored at the same
locations, is emerging as a promising paradigm to address the memory wall bottleneck in …