Information theoretic methods for variable selection—A review

J Mielniczuk - Entropy, 2022 - mdpi.com
We review the principal information theoretic tools and their use for feature selection, with
the main emphasis on classification problems with discrete features. Since it is known that …
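
The review's central tool is the mutual information between a discrete feature and the class label. As an illustration only (not code from the paper; the function names are hypothetical), a plug-in MI estimator and a simple MI-based feature ranking might look like this in Python:

```python
import numpy as np

def plugin_mi(x, y):
    """Empirical (plug-in) mutual information, in nats, between two discrete arrays."""
    x_vals, x_idx = np.unique(np.asarray(x), return_inverse=True)
    y_vals, y_idx = np.unique(np.asarray(y), return_inverse=True)
    joint = np.zeros((len(x_vals), len(y_vals)))
    np.add.at(joint, (x_idx, y_idx), 1.0)     # contingency counts
    joint /= joint.sum()                      # joint distribution p(x, y)
    px = joint.sum(axis=1, keepdims=True)     # marginal p(x)
    py = joint.sum(axis=0, keepdims=True)     # marginal p(y)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def rank_features_by_mi(X, y):
    """Rank columns of a discrete design matrix X by estimated I(X_j; Y)."""
    scores = np.array([plugin_mi(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1], scores
```

A filter-style selector would keep the top-k columns of the returned ranking; criteria that also penalize redundancy between features (e.g., via conditional mutual information) are a standard refinement that this sketch omits.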

Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group

J Erdmenger, K Grosvenor, R Jefferson - SciPost Physics, 2022 - scipost.org
We investigate the analogy between the renormalization group (RG) and deep neural
networks, wherein subsequent layers of neurons are analogous to successive steps along …

Controllable universal fair representation learning

Y Cui, M Chen, K Zheng, L Chen, X Zhou - Proceedings of the ACM Web …, 2023 - dl.acm.org
Learning fair and transferable representations of users that can be used for a wide spectrum
of downstream tasks (specifically, machine learning models) has great potential in fairness …

Capacity of continuous channels with memory via directed information neural estimator

Z Aharoni, D Tsur, Z Goldfeld… - 2020 IEEE International …, 2020 - ieeexplore.ieee.org
Calculating the capacity (with or without feedback) of channels with memory and continuous
alphabets is a challenging task. It requires optimizing the directed information (DI) rate over …
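
For context (a standard definition, not a result of this paper), the directed information over a block of n channel uses, introduced by Massey, is

I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),

and the DI rate is the limit \lim_{n \to \infty} \frac{1}{n} I(X^n \to Y^n) when it exists; under suitable conditions, the feedback capacity is obtained by maximizing this rate over (causally conditioned) input distributions.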

Disentanglement and generalization under correlation shifts

CM Funke, P Vicol, KC Wang… - Conference on …, 2022 - proceedings.mlr.press
Correlations between factors of variation are prevalent in real-world data. Exploiting such
correlations may increase predictive performance on noisy data; however, often correlations …

An Information Theoretic Approach to Interaction-Grounded Learning

X Hu, F Farnia, H Leung - arXiv preprint arXiv:2401.05015, 2024 - arxiv.org
Reinforcement learning (RL) problems where the learner attempts to infer an unobserved
reward from some feedback variables have been studied in several recent papers. The …

Neural estimator of information for time-series data with dependency

S Molavipour, H Ghourchian, G Bassi, M Skoglund - Entropy, 2021 - mdpi.com
Novel approaches to estimating information measures using neural networks have been widely celebrated in recent years in both the information theory and machine learning communities …

A data-driven analysis of secret key rate for physical layer secret key generation from wireless channels

T Matsumine, H Ochiai, J Shikata - IEEE Communications …, 2023 - ieeexplore.ieee.org
This letter considers a data-driven approach to the secret key rate analysis of physical layer
secret key generation based on the mutual information neural estimator (MINE), which achieves …
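
As background (the standard MINE objective rather than a contribution of this letter), MINE estimates mutual information by maximizing the Donsker-Varadhan lower bound over a neural critic T_\theta:

I(X;Y) \ge \sup_{\theta} \; \mathbb{E}_{P_{XY}}[T_\theta(X,Y)] - \log \mathbb{E}_{P_X \otimes P_Y}\!\left[e^{T_\theta(X,Y)}\right],

where the first expectation is approximated with paired samples and the second with samples whose pairing has been shuffled; in the key-generation setting, the two arguments would typically be the correlated channel observations at the legitimate terminals.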

Neural estimators for conditional mutual information using nearest neighbors sampling

S Molavipour, G Bassi… - IEEE transactions on signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set
of samples is a longstanding problem. A recent line of work in this area has leveraged the …
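
For reference (standard identities, not results from this paper), the conditional mutual information being estimated can be written as

I(X;Y \mid Z) = \mathbb{E}_{Z}\left[ D_{\mathrm{KL}}\big( P_{XY\mid Z} \,\|\, P_{X\mid Z} \otimes P_{Y\mid Z} \big) \right] = I(X; Y, Z) - I(X; Z),

which is why neural MI estimators can be repurposed for CMI once samples approximating the conditional product distribution are available; nearest-neighbor sampling is one way to generate such samples.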

Contradiction neutralization for interpreting multi-layered neural networks

R Kamimura - Applied Intelligence, 2023 - Springer
The present paper aims to propose a new method for neutralizing contradictions in neural
networks. Neural networks exhibit numerous contradictions in the form of contrasts …