Information theoretic methods for variable selection—A review
J Mielniczuk - Entropy, 2022 - mdpi.com
We review the principal information theoretic tools and their use for feature selection, with
the main emphasis on classification problems with discrete features. Since it is known that …
Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
We investigate the analogy between the renormalization group (RG) and deep neural
networks, wherein subsequent layers of neurons are analogous to successive steps along …
Controllable universal fair representation learning
Learning fair and transferable representations of users that can be used for a wide spectrum
of downstream tasks (specifically, machine learning models) has great potential in fairness …
Capacity of continuous channels with memory via directed information neural estimator
Calculating the capacity (with or without feedback) of channels with memory and continuous
alphabets is a challenging task. It requires optimizing the directed information (DI) rate over …
Disentanglement and generalization under correlation shifts
Correlations between factors of variation are prevalent in real-world data. Exploiting such
correlations may increase predictive performance on noisy data; however, often correlations …
An Information Theoretic Approach to Interaction-Grounded Learning
Reinforcement learning (RL) problems where the learner attempts to infer an unobserved
reward from some feedback variables have been studied in several recent papers. The …
Neural estimator of information for time-series data with dependency
Novel approaches to estimate information measures using neural networks are well-
celebrated in recent years both in the information theory and machine learning communities …
A data-driven analysis of secret key rate for physical layer secret key generation from wireless channels
T Matsumine, H Ochiai, J Shikata - IEEE Communications …, 2023 - ieeexplore.ieee.org
This letter considers a data-driven approach to a secret key rate analysis of physical layer
secret key generation based on mutual information neural estimator (MINE), which achieves …
Neural estimators for conditional mutual information using nearest neighbors sampling
S Molavipour, G Bassi… - IEEE transactions on signal …, 2021 - ieeexplore.ieee.org
The estimation of mutual information (MI) or conditional mutual information (CMI) from a set
of samples is a longstanding problem. A recent line of work in this area has leveraged the …
Contradiction neutralization for interpreting multi-layered neural networks
R Kamimura - Applied Intelligence, 2023 - Springer
The present paper aims to propose a new method for neutralizing contradictions in neural
networks. Neural networks exhibit numerous contradictions in the form of contrasts …