Twelve scientific challenges for 6G: Rethinking the foundations of communications theory
Research on the sixth generation of wireless networks needs to tackle new challenges in
order to meet the requirements of emerging applications in terms of high data rate, low …
A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update
Objective. Most current electroencephalography (EEG)-based brain–computer interfaces
(BCIs) are based on machine learning algorithms. There is a large diversity of classifier …
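The review surveys classifier families rather than prescribing one pipeline; as a concrete illustration, here is a minimal Python sketch of the classical band-power feature plus linear discriminant analysis baseline that many surveyed BCIs use. The synthetic data, feature choice, and labels are assumptions for illustration only, not the paper's benchmark.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG: 200 trials, 8 channels, 2 s at 250 Hz.
n_trials, n_channels, n_samples = 200, 8, 500
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, size=n_trials)   # two hypothetical mental-task classes

# Toy class effect: amplify a few channels for class 1 so there is something to learn.
eeg[labels == 1, :3, :] *= 1.5

# Simple band-power-style features: log variance of each channel per trial.
features = np.log(eeg.var(axis=2))           # shape (n_trials, n_channels)

# Linear discriminant analysis, a classical BCI baseline classifier.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, labels, cv=5)
print(f"5-fold accuracy: {scores.mean():.2f}")
```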
The ITensor software library for tensor network calculations
ITensor is a system for programming tensor network calculations with an interface modeled
on tensor diagram notation, which allows users to focus on the connectivity of a tensor …
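ITensor's interface is built around named indices, so contracting two tensors means multiplying them and letting shared indices sum automatically. ITensor itself is a C++/Julia library; the numpy sketch below only illustrates that underlying idea, with einsum labels playing the role of named indices, and is not the ITensor API.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three tensors sharing indices, as in a small tensor network:
#   A[i, j], B[j, k, l], C[l, m]  -- j and l are the "bond" indices.
A = rng.standard_normal((4, 5))
B = rng.standard_normal((5, 6, 3))
C = rng.standard_normal((3, 2))

# In diagram notation one simply connects the shared legs; here the einsum
# subscripts name the indices, and repeated labels are summed over.
result = np.einsum('ij,jkl,lm->ikm', A, B, C)
print(result.shape)  # (4, 6, 2)
```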
Learning a low tensor-train rank representation for hyperspectral image super-resolution
Hyperspectral images (HSIs) offer high spectral resolution but only low spatial
resolution. In contrast, multispectral images (MSIs) with much lower spectral resolution …
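As a concrete illustration of what a low tensor-train (TT) rank representation is (independent of the paper's super-resolution model), below is a minimal TT-SVD sketch in numpy: the tensor is unfolded and factored by successive truncated SVDs into a chain of small cores.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Decompose a dense tensor into tensor-train cores via successive truncated SVDs."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k, n in enumerate(dims[:-1]):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        new_rank = min(max_rank, len(S))
        cores.append(U[:, :new_rank].reshape(rank, n, new_rank))
        # Carry the remainder forward and unfold it for the next mode.
        mat = (S[:new_rank, None] * Vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

X = np.random.default_rng(2).standard_normal((8, 9, 10, 11))
cores = tt_decompose(X, max_rank=5)
print([c.shape for c in cores])  # [(1, 8, 5), (5, 9, 5), (5, 10, 5), (5, 11, 1)]
```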
Extensor: An accelerator for sparse tensor algebra
K Hegde, H Asghari-Moghaddam, M Pellauer… - Proceedings of the …, 2019 - dl.acm.org
Generalized tensor algebra is a prime candidate for acceleration via customized ASICs.
Modern tensors feature a wide range of data sparsity, with the density of non-zero elements …
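ExTensor's central observation is that in sparse tensor algebra only coordinates present in both operands produce useful work, so the hardware intersects coordinate lists to skip the rest. The Python sketch below is only a software caricature of that intersection idea for a sparse dot product, not a model of the accelerator.

```python
# Sparse vectors stored as sorted (coordinate, value) lists, roughly a compressed format.
a = [(0, 2.0), (3, -1.0), (7, 4.0), (12, 0.5)]
b = [(3, 3.0), (5, 1.0), (12, 2.0)]

def sparse_dot(a, b):
    """Dot product that only multiplies where both operands are nonzero:
    a two-pointer intersection of the sorted coordinate lists."""
    i = j = 0
    total = 0.0
    while i < len(a) and j < len(b):
        ca, va = a[i]
        cb, vb = b[j]
        if ca == cb:
            total += va * vb   # effectual multiply
            i += 1
            j += 1
        elif ca < cb:
            i += 1             # coordinate missing from b, skip it
        else:
            j += 1             # coordinate missing from a, skip it
    return total

print(sparse_dot(a, b))  # (-1.0 * 3.0) + (0.5 * 2.0) = -2.0
```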
Stable low-rank tensor decomposition for compression of convolutional neural network
Most state-of-the-art deep neural networks are overparameterized and exhibit a high
computational cost. A straightforward approach to this problem is to replace convolutional …
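To make the "straightforward approach" concrete: a convolutional kernel can be unfolded into a matrix and replaced by a truncated low-rank factorization, turning one expensive layer into two cheaper ones. The numpy sketch below uses a plain truncated SVD for illustration; the paper itself concerns making such tensor decompositions stable, which this sketch does not address.

```python
import numpy as np

rng = np.random.default_rng(3)

# A conv kernel: 256 output channels, 128 input channels, 3x3 spatial support.
W = rng.standard_normal((256, 128, 3, 3))

# Unfold to (out_channels, in_channels * k * k) and truncate its SVD.
rank = 32
M = W.reshape(256, -1)                       # (256, 1152)
U, S, Vt = np.linalg.svd(M, full_matrices=False)
A = U[:, :rank] * S[:rank]                   # (256, rank)       -> a 1x1 conv
B = Vt[:rank].reshape(rank, 128, 3, 3)       # (rank, 128, 3, 3) -> a thinner 3x3 conv

original = W.size
compressed = A.size + B.size
print(f"params: {original} -> {compressed} ({compressed / original:.1%})")
print("relative error:", np.linalg.norm(M - A @ Vt[:rank]) / np.linalg.norm(M))
```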
Hyper-optimized tensor network contraction
J Gray, S Kourtis - Quantum, 2021 - quantum-journal.org
Tensor networks represent the state-of-the-art in computational methods across many
disciplines, including the classical simulation of quantum many-body systems and quantum …
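The cost of contracting a tensor network depends enormously on the order in which pairwise contractions are performed, which is what the paper's hyper-optimization targets; dedicated tools such as opt_einsum and cotengra implement this kind of search. The sketch below only shows the underlying point with numpy's built-in path optimizer.

```python
import numpy as np

rng = np.random.default_rng(4)

# A small chain of tensors; contracting left-to-right vs. in an optimized order
# can differ by orders of magnitude in FLOPs for larger networks.
A = rng.standard_normal((30, 40))
B = rng.standard_normal((40, 50))
C = rng.standard_normal((50, 2))

path, info = np.einsum_path('ij,jk,kl->il', A, B, C, optimize='optimal')
print(info)           # reports the chosen pairwise order and its FLOP count
out = np.einsum('ij,jk,kl->il', A, B, C, optimize=path)
print(out.shape)      # (30, 2)
```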
Unsupervised generative modeling using matrix product states
Generative modeling, which learns a joint probability distribution from data and generates
samples from it, is an important task in machine learning and artificial intelligence …
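In the MPS "Born machine" framing of the paper, a bit string's probability is the squared amplitude obtained by multiplying one small matrix per site and normalizing. Below is a minimal numpy sketch of that evaluation, with random untrained cores, purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)

n_sites, bond_dim = 6, 4
# One MPS core per site: shape (left_bond, physical=2, right_bond); boundary bonds are 1.
cores = [rng.standard_normal((1 if i == 0 else bond_dim, 2,
                              1 if i == n_sites - 1 else bond_dim))
         for i in range(n_sites)]

def amplitude(bits):
    """Contract the MPS along the chain, picking the physical index given by each bit."""
    vec = cores[0][:, bits[0], :]                # (1, bond)
    for core, b in zip(cores[1:], bits[1:]):
        vec = vec @ core[:, b, :]
    return vec[0, 0]

# Born rule: p(x) = |psi(x)|^2 / Z, with Z summed over all 2^n configurations.
configs = [tuple(map(int, np.binary_repr(k, n_sites))) for k in range(2 ** n_sites)]
Z = sum(amplitude(c) ** 2 for c in configs)
p = {c: amplitude(c) ** 2 / Z for c in configs}
print(sum(p.values()))          # 1.0 (up to floating point)
print(p[configs[0]])            # probability of the all-zeros string
```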
Tensor networks for dimensionality reduction and large-scale optimization: Part 2 applications and future perspectives
Part 2 of this monograph builds on the introduction to tensor networks and their operations
presented in Part 1. It focuses on tensor network models for super-compressed higher-order …
Low-rank compression of neural nets: Learning the rank of each layer
Y Idelbayev… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Neural net compression can be achieved by approximating each layer's weight matrix by a
low-rank matrix. The real difficulty in doing this is not in training the resulting neural net …
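The snippet above notes that the hard part is choosing each layer's rank rather than training the factored net. The paper learns the ranks jointly with the weights; the numpy sketch below substitutes a much simpler stand-in, picking each layer's rank from a singular-value energy threshold, just to make the per-layer trade-off concrete.

```python
import numpy as np

rng = np.random.default_rng(6)

def rank_by_energy(W, keep=0.90):
    """Smallest rank whose singular values retain `keep` of the squared Frobenius norm."""
    s = np.linalg.svd(W, compute_uv=False)
    energy = np.cumsum(s ** 2) / np.sum(s ** 2)
    return int(np.searchsorted(energy, keep) + 1)

def low_rank_plus_noise(m, n, r, noise=0.05):
    """Synthetic stand-in for a trained weight matrix: low-rank signal plus small noise."""
    return rng.standard_normal((m, r)) @ rng.standard_normal((r, n)) \
        + noise * rng.standard_normal((m, n))

# Hypothetical per-layer weight matrices of a small net.
layers = {"fc1": low_rank_plus_noise(512, 784, 20),
          "fc2": low_rank_plus_noise(256, 512, 15),
          "fc3": low_rank_plus_noise(10, 256, 5)}

for name, W in layers.items():
    r = rank_by_energy(W)
    m, n = W.shape
    ratio = r * (m + n) / (m * n)        # params of the two factors vs. the full matrix
    print(f"{name}: rank {r}/{min(m, n)}, factored size {ratio:.0%} of original")
```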