On universal features for high-dimensional learning and inference
SL Huang, A Makur, GW Wornell, L Zheng - arXiv preprint arXiv …, 2019 - arxiv.org
We consider the problem of identifying universal low-dimensional features from high-
dimensional data for inference tasks in settings involving learning. For such problems, we …
An information theoretic interpretation to deep neural networks
With the unprecedented performance achieved by deep learning, it is commonly believed
that deep neural networks (DNNs) attempt to extract informative features for learning tasks …
Comparison of Contraction Coefficients for f-Divergences
A Makur, L Zheng - Problems of Information Transmission, 2020 - Springer
Contraction coefficients are distribution dependent constants that are used to sharpen
standard data processing inequalities for f-divergences (or relative f-entropies) and produce …
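For reference, the inequality being sharpened can be stated as follows (a standard formulation in the usual notation; the paper's exact assumptions and regularity conditions may differ). For a fixed input distribution $P_X$ and channel $P_{Y|X}$,
$$ \eta_f(P_X, P_{Y|X}) \;=\; \sup_{Q_X \,:\, 0 < D_f(Q_X \| P_X) < \infty} \frac{D_f(Q_Y \| P_Y)}{D_f(Q_X \| P_X)}, $$
where $Q_Y$ and $P_Y$ are the output distributions induced by $Q_X$ and $P_X$. The resulting strong data processing inequality $D_f(Q_Y \| P_Y) \le \eta_f(P_X, P_{Y|X}) \, D_f(Q_X \| P_X)$ tightens the ordinary bound $D_f(Q_Y \| P_Y) \le D_f(Q_X \| P_X)$ whenever $\eta_f < 1$.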
[PDF] Quantum algorithms for data analysis
A Luongo - 2020 - quantumalgorithms.org
Alessandro Luongo, 2024-12-08. Contents: Preface; Abstract …
Quantum algorithms for SVD-based data representation and analysis
This paper narrows the gap between previous literature on quantum linear algebra and
practical data analysis on a quantum computer, formalizing quantum procedures that speed …
On estimation of modal decompositions
A Makur, GW Wornell, L Zheng - 2020 IEEE International …, 2020 - ieeexplore.ieee.org
A modal decomposition is a useful tool that deconstructs the statistical dependence between
two random variables by decomposing their joint distribution into orthogonal modes …
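To make the notion concrete: for a known discrete joint distribution, the modes can be read off from an SVD of the canonical dependence matrix B(x, y) = P(x, y) / sqrt(P(x) P(y)). The toy joint pmf and variable names below are illustrative only, and this sketch is not the estimation procedure studied in the paper, which concerns estimating such decompositions from data.

import numpy as np

# Illustrative toy joint pmf over X in {0,1,2} and Y in {0,1,2} (hypothetical values).
P_xy = np.array([[0.20, 0.05, 0.05],
                 [0.05, 0.30, 0.05],
                 [0.05, 0.05, 0.20]])
P_x = P_xy.sum(axis=1)
P_y = P_xy.sum(axis=0)

# Canonical dependence matrix B[x, y] = P(x, y) / sqrt(P(x) * P(y)).
B = P_xy / np.sqrt(np.outer(P_x, P_y))

# Its SVD yields the orthogonal modes: the top singular value is 1 (the trivial
# constant mode); the remaining singular values are the maximal correlations of
# the successive modes, with mode functions obtained by rescaling the singular vectors.
U, s, Vt = np.linalg.svd(B)
f = U[:, 1:] / np.sqrt(P_x)[:, None]     # mode functions of X
g = Vt[1:, :].T / np.sqrt(P_y)[:, None]  # mode functions of Y
print("singular values:", s)

In these terms, the leading nontrivial mode functions are the maximally correlated embeddings that the correspondence-analysis entry below also refers to.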
Generalizing correspondence analysis for applications in machine learning
Correspondence analysis (CA) is a multivariate statistical tool used to visualize and interpret
data dependencies by finding maximally correlated embeddings of pairs of random …
Operator SVD with Neural Networks via Nested Low-Rank Approximation
Computing eigenvalue decomposition (EVD) of a given linear operator, or finding its leading
eigenvalues and eigenfunctions, is a fundamental task in many machine learning and …
Information-Theoretic Tools for Machine Learning Beyond Accuracy
H Hsu - 2023 - search.proquest.com
For the past decades, information theory and machine learning have propelled each other
forward. Information theory has provided mathematical tools to tackle emerging challenges …
[PDF] Improving the robustness of deep neural networks to adversarial perturbations
J Peck - 2023 - backoffice.biblio.ugent.be
Over the past decade, artificial neural networks have ushered in a revolution in science and
society. Nowadays, neural networks are applied to various problems such as speech …