Twelve scientific challenges for 6G: Rethinking the foundations of communications theory

M Chafii, L Bariah, S Muhaidat… - … Surveys & Tutorials, 2023 - ieeexplore.ieee.org
The research in the sixth generation of wireless networks needs to tackle new challenges in
order to meet the requirements of emerging applications in terms of high data rate, low …

A review of classification algorithms for EEG-based brain–computer interfaces: a 10 year update

F Lotte, L Bougrain, A Cichocki, M Clerc… - Journal of neural …, 2018 - iopscience.iop.org
Objective. Most current electroencephalography (EEG)-based brain–computer interfaces
(BCIs) are based on machine learning algorithms. There is a large diversity of classifier …

The ITensor software library for tensor network calculations

M Fishman, S White, EM Stoudenmire - SciPost Physics Codebases, 2022 - scipost.org
ITensor is a system for programming tensor network calculations with an interface modeled
on tensor diagram notation, which allows users to focus on the connectivity of a tensor …

Learning a low tensor-train rank representation for hyperspectral image super-resolution

R Dian, S Li, L Fang - … on neural networks and learning systems, 2019 - ieeexplore.ieee.org
Hyperspectral images (HSIs) with high spectral resolution typically have only low spatial
resolution. In contrast, multispectral images (MSIs) with much lower spectral resolution …

ExTensor: An accelerator for sparse tensor algebra

K Hegde, H Asghari-Moghaddam, M Pellauer… - Proceedings of the …, 2019 - dl.acm.org
Generalized tensor algebra is a prime candidate for acceleration via customized ASICs.
Modern tensors feature a wide range of data sparsity, with the density of non-zero elements …

Stable low-rank tensor decomposition for compression of convolutional neural network

AH Phan, K Sobolev, K Sozykin, D Ermilov… - Computer Vision–ECCV …, 2020 - Springer
Most state-of-the-art deep neural networks are overparameterized and exhibit a high
computational cost. A straightforward approach to this problem is to replace convolutional …

Hyper-optimized tensor network contraction

J Gray, S Kourtis - Quantum, 2021 - quantum-journal.org
Tensor networks represent the state-of-the-art in computational methods across many
disciplines, including the classical simulation of quantum many-body systems and quantum …
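As a minimal illustration of why contraction order matters for tensor networks (using numpy's built-in path search, not the hyper-optimized algorithms this paper develops):

```python
import numpy as np

# Three-tensor chain A - B - C; which pair is contracted first changes the
# FLOP count, which is what contraction-path optimizers search over.
A = np.random.rand(8, 32)
B = np.random.rand(32, 32)
C = np.random.rand(32, 8)

# Ask numpy to search for an optimal pairwise contraction order, then use it.
path, info = np.einsum_path('ij,jk,kl->il', A, B, C, optimize='optimal')
result = np.einsum('ij,jk,kl->il', A, B, C, optimize=path)
```

For networks this small the savings are negligible; the paper's point is that for large, irregularly connected networks the choice of path can change the cost by many orders of magnitude.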

Unsupervised generative modeling using matrix product states

ZY Han, J Wang, H Fan, L Wang, P Zhang - Physical Review X, 2018 - APS
Generative modeling, which learns a joint probability distribution from data and generates
samples according to it, is an important task in machine learning and artificial intelligence …

Tensor networks for dimensionality reduction and large-scale optimization: Part 2 applications and future perspectives

A Cichocki, AH Phan, Q Zhao, N Lee… - … and Trends® in …, 2017 - nowpublishers.com
Part 2 of this monograph builds on the introduction to tensor networks and their operations
presented in Part 1. It focuses on tensor network models for super-compressed higher-order …
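The tensor-train (TT) format that underlies several of these entries can be sketched with the standard TT-SVD procedure: sequential truncated SVDs split a d-way tensor into a chain of 3-way cores. A minimal numpy version (an illustration of the format, not the monograph's algorithms):

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Split a d-way tensor into tensor-train (TT) cores via sequential SVDs."""
    shape = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(shape[0], -1)
    for k in range(len(shape) - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(S))
        cores.append(U[:, :r].reshape(rank, shape[k], r))  # core k: (r_{k-1}, n_k, r_k)
        mat = (S[:r, None] * Vt[:r]).reshape(r * shape[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, shape[-1], 1))          # last core absorbs the remainder
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough the decomposition is exact; truncating the ranks gives the "super-compressed" representations the snippet refers to.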

Low-rank compression of neural nets: Learning the rank of each layer

Y Idelbayev… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Neural net compression can be achieved by approximating each layer's weight matrix by a
low-rank matrix. The real difficulty in doing this is not in training the resulting neural net …
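The basic per-layer step behind this line of work is a truncated SVD of the weight matrix; a minimal sketch (the paper's actual contribution, learning each layer's rank during training, is not shown):

```python
import numpy as np

def compress_layer(W, rank):
    """Approximate an m x n weight matrix by a rank-r product A @ B via
    truncated SVD, cutting the parameter count from m*n to r*(m + n)."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :rank] * S[:rank]   # m x r, singular values folded into the left factor
    B = Vt[:rank]                # r x n
    return A, B
```

The compressed layer then computes `x @ A @ B` as two cheaper matrix products; by the Eckart-Young theorem, the truncated SVD is the best rank-r approximation of `W` in the Frobenius norm.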