Recent advances and future prospects for memristive materials, devices, and systems
Memristive technology has been rapidly emerging as a potential alternative to traditional
CMOS technology, which is facing fundamental limitations in its development. Since oxide …
A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …
A survey on model compression for large language models
Large Language Models (LLMs) have successfully transformed natural language processing
tasks. Yet, their large size and high computational needs pose challenges for …
Thousands of conductance levels in memristors integrated on CMOS
Neural networks based on memristive devices have the ability to improve throughput and
energy efficiency for machine learning and artificial intelligence, especially in edge …
Beyond transmitting bits: Context, semantics, and task-oriented communications
Communication systems to date primarily aim at reliably communicating bit sequences.
Such an approach provides efficient engineering designs that are agnostic to the meanings …
Feature dimensionality reduction: a review
W Jia, M Sun, J Lian, S Hou. Complex & Intelligent Systems, 2022 (Springer)
As basic research, dimensionality reduction has received increasing attention because the "curse of
dimensionality" increases the cost of data storage and computation; it also …
Sheared LLaMA: Accelerating language model pre-training via structured pruning
The popularity of LLaMA (Touvron et al., 2023a; b) and other recently emerged moderate-
sized large language models (LLMs) highlights the potential of building smaller yet powerful …
Digital twin enhanced federated reinforcement learning with lightweight knowledge distillation in mobile networks
High-speed mobile networks offer great potential for many future intelligent applications,
such as autonomous vehicles in smart transportation systems. Such networks provide the …
R-drop: Regularized dropout for neural networks
Dropout is a powerful and widely used technique to regularize the training of deep neural
networks. Though effective and performing well, the randomness introduced by dropout …
EdgeViTs: Competing light-weight CNNs on mobile devices with vision transformers
Self-attention based models such as vision transformers (ViTs) have emerged as a highly
competitive architectural alternative to convolutional neural networks (CNNs) in computer …