A review of Shannon and differential entropy rate estimation
A Feutrill, M Roughan - Entropy, 2021 - mdpi.com
In this paper, we present a review of Shannon and differential entropy rate estimation
techniques. Entropy rate, which measures the average information gain from a stochastic …
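To make the object concrete (a minimal sketch, not any estimator surveyed in the paper): the Shannon entropy rate of a stationary process is the limit of the conditional block entropies H(X_k | X_1, …, X_{k−1}), and a naive plug-in estimate for a binary Markov chain can be computed as below. The transition matrix and sample size are arbitrary choices for illustration.

```python
import numpy as np

def block_entropy(x, k):
    """Empirical Shannon entropy (in bits) of the overlapping length-k blocks of x."""
    blocks = np.array([x[i:i + k] for i in range(len(x) - k + 1)])
    _, counts = np.unique(blocks, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def plugin_entropy_rate(x, k):
    """Plug-in estimate of H(X_k | X_1, ..., X_{k-1}) = H_k - H_{k-1}."""
    return block_entropy(x, k) - block_entropy(x, k - 1)

# Simulate a two-state Markov chain whose true entropy rate is known in closed form.
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])                       # transition matrix
x = [0]
for _ in range(100_000):
    x.append(rng.choice(2, p=P[x[-1]]))
x = np.array(x)

pi = np.array([0.8, 0.2])                        # stationary distribution of P
h_row = lambda row: -np.sum(row * np.log2(row))
print("plug-in estimate:", plugin_entropy_rate(x, k=4))
print("true entropy rate:", pi @ [h_row(P[0]), h_row(P[1])])
```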
Strong data-processing inequalities for channels and Bayesian networks
Y Polyanskiy, Y Wu - Convexity and Concentration, 2017 - Springer
The data-processing inequality, that is, I(U; Y) ≤ I(U; X) for a Markov chain U → X → Y, has
been the method of choice for proving impossibility (converse) results in information theory …
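For readers new to the setting, here is a small numerical check of the ordinary (non-strong) data-processing inequality on a randomly generated chain U → X → Y; the strong versions studied in the paper sharpen this with channel-dependent contraction coefficients. The alphabet sizes and Dirichlet sampling below are arbitrary.

```python
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint pmf given as a 2-D array."""
    px = p_xy.sum(axis=1, keepdims=True)
    py = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return np.sum(p_xy[mask] * np.log2(p_xy[mask] / (px @ py)[mask]))

rng = np.random.default_rng(1)
p_u = rng.dirichlet(np.ones(3))              # P(U)
W1 = rng.dirichlet(np.ones(4), size=3)       # P(X|U), rows sum to 1
W2 = rng.dirichlet(np.ones(5), size=4)       # P(Y|X), rows sum to 1

p_ux = p_u[:, None] * W1                     # joint P(U, X)
p_uy = p_ux @ W2                             # joint P(U, Y) via the Markov chain

print("I(U;X) =", mutual_information(p_ux))
print("I(U;Y) =", mutual_information(p_uy))  # never exceeds I(U;X)
```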
Coding Schemes Based on Reed-Muller Codes for (d,∞)-RLL Input-Constrained Channels
VA Rameshwar, N Kashyap - IEEE Transactions on Information …, 2023 - ieeexplore.ieee.org
The paper considers coding schemes derived from Reed-Muller (RM) codes, for
transmission over input-constrained memoryless channels. Our focus is on the (d, ∞)-runlength …
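For context, a binary sequence satisfies the (d, ∞)-RLL constraint when any two successive 1s are separated by at least d 0s. A small checker of the constraint (illustrative only; not the paper's Reed-Muller-based coding scheme):

```python
def satisfies_d_rll(bits, d):
    """True if every pair of successive 1s is separated by at least d zeros."""
    last_one = None
    for i, b in enumerate(bits):
        if b == 1:
            if last_one is not None and i - last_one - 1 < d:
                return False
            last_one = i
    return True

print(satisfies_d_rll([1, 0, 0, 1, 0, 1], d=2))     # False: only one 0 between the last two 1s
print(satisfies_d_rll([1, 0, 0, 1, 0, 0, 1], d=2))  # True
```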
On the entropy of a noisy function
A Samorodnitsky - IEEE Transactions on Information Theory, 2016 - ieeexplore.ieee.org
Let 0 < ϵ < 1/2 be a noise parameter, and let T_ϵ be the noise operator acting on functions
on the Boolean cube {0, 1}^n. Let f be a nonnegative function on {0, 1}^n. We upper bound …
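The operator in question is defined by (T_ϵ f)(x) = E[f(Y)], where Y is obtained from x by flipping each coordinate independently with probability ϵ. A brute-force implementation for small n, sketching the definition rather than anything specific to the paper:

```python
import itertools

def noise_operator(f, n, eps):
    """Return T_eps f on {0,1}^n: (T_eps f)(x) = E f(Y), where Y flips each
    coordinate of x independently with probability eps."""
    cube = list(itertools.product((0, 1), repeat=n))
    def Tf(x):
        total = 0.0
        for y in cube:
            flips = sum(xi != yi for xi, yi in zip(x, y))
            total += eps**flips * (1 - eps)**(n - flips) * f(y)
        return total
    return Tf

# Example: the indicator of the all-ones point gets smeared over the cube.
n, eps = 3, 0.1
f = lambda y: 1.0 if all(y) else 0.0
Tf = noise_operator(f, n, eps)
print(Tf((1, 1, 1)), Tf((0, 0, 0)))   # (1-eps)^3 and eps^3
```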
Optimal list decoding from noisy entropy inequality
J Hązła - 2023 IEEE International Symposium on Information …, 2023 - ieeexplore.ieee.org
A noisy entropy inequality for boolean functions by Samorodnitsky is applied to binary
codes. It is shown that a binary code that achieves capacity on the binary erasure channel …
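As background for the list-decoding setting (an illustration, not the paper's argument): over the binary erasure channel, the natural list decoder outputs exactly the codewords that agree with the received word on its unerased positions.

```python
import itertools

def bec_list_decode(codebook, received):
    """All codewords agreeing with `received` on its unerased positions.
    Erasures are marked with None."""
    return [c for c in codebook
            if all(r is None or r == ci for ci, r in zip(c, received))]

# Toy length-4 code: the even-weight (single parity-check) code.
codebook = [c for c in itertools.product((0, 1), repeat=4) if sum(c) % 2 == 0]
print(bec_list_decode(codebook, (1, None, 0, None)))   # a list of size 2
```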
A generalized framework for Kullback–Leibler Markov aggregation
We propose an information-theoretic Markov aggregation framework that is motivated by two
objectives: 1) The Markov chain observed through the aggregation mapping should be …
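To fix ideas (a minimal sketch under generic assumptions, not the proposed framework): aggregating a chain through a mapping φ and recording the one-step transition statistics of the observed process at stationarity looks as follows. The observed process is in general not Markov, which is the kind of mismatch that KL-divergence-based aggregation objectives quantify.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a transition matrix P."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    return pi / pi.sum()

def aggregate(P, phi, m):
    """One-step transition matrix of phi(X_t) at stationarity, for an
    aggregation map phi: {0,..,n-1} -> {0,..,m-1}."""
    pi = stationary(P)
    Q = np.zeros((m, m))
    for i, pi_i in enumerate(pi):
        for j, p_ij in enumerate(P[i]):
            Q[phi[i], phi[j]] += pi_i * p_ij
    return Q / Q.sum(axis=1, keepdims=True)

P = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.2, 0.6]])
phi = [0, 0, 1]                  # lump states 0 and 1 together
print(aggregate(P, phi, m=2))
```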
Computable lower bounds for capacities of input-driven finite-state channels
VA Rameshwar, N Kashyap - 2020 IEEE International …, 2020 - ieeexplore.ieee.org
This paper studies the capacities of input-driven finite-state channels, i.e., channels whose
current state is a time-invariant deterministic function of the previous state and the current …
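A toy instance of such a channel (entirely hypothetical, for illustration): the state evolves by a deterministic, time-invariant function of the previous state and the current input, and the output distribution depends on both.

```python
import numpy as np

def simulate(inputs, f, out_dist, s0, rng):
    """Input-driven finite-state channel: the state evolves deterministically
    as s_t = f(s_{t-1}, x_t); the output is drawn from out_dist(s_{t-1}, x_t)."""
    s, ys = s0, []
    for x in inputs:
        ys.append(out_dist(s, x, rng))
        s = f(s, x)                      # deterministic, time-invariant update
    return ys

# Hypothetical example: the state is the previous input bit, and the crossover
# probability is higher when the current input differs from the previous one.
f = lambda s, x: x
out_dist = lambda s, x, rng: x ^ int(rng.random() < (0.05 if x == s else 0.2))
rng = np.random.default_rng(2)
print(simulate([0, 1, 1, 0, 1], f, out_dist, s0=0, rng=rng))
```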
Lower Bounds on Mutual Information for Linear Codes Transmitted over Binary Input Channels, and for Information Combining
U Erez, O Ordentlich, S Shamai - arXiv preprint arXiv:2401.14710, 2024 - arxiv.org
It has been known for a long time that the mutual information between the input sequence
and output of a binary symmetric channel (BSC) is upper bounded by the mutual information …
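As a numeric anchor for the quantities being compared (not the paper's bounds): the single-use mutual information between a Bernoulli(q) input and the output of a BSC with crossover probability p, alongside the capacity 1 − h(p).

```python
import numpy as np

h = lambda p: 0.0 if p in (0.0, 1.0) else -p*np.log2(p) - (1-p)*np.log2(1-p)

def bsc_mutual_information(q, p):
    """I(X;Y) for X ~ Bernoulli(q) through a BSC with crossover probability p."""
    y1 = q * (1 - p) + (1 - q) * p        # P(Y = 1)
    return h(y1) - h(p)                   # H(Y) - H(Y|X)

p = 0.11
print("I(X;Y) at q = 0.3:", bsc_mutual_information(0.3, p))
print("capacity 1 - h(p):", 1 - h(p))     # attained at q = 1/2
```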
Entropy rate bounds of integer-valued processes via second-order statistics
R Tamir - IEEE Transactions on Information Theory, 2022 - ieeexplore.ieee.org
This work contains two single-letter upper bounds on the entropy rate of an integer-valued
stationary stochastic process, which only depend on second-order statistics, and are …
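The flavour of such bounds (not the paper's own results, which use the process's second-order statistics more finely) is the classical single-letter fact that an integer-valued random variable with variance σ² satisfies H(X) ≤ ½ log₂(2πe(σ² + 1/12)), obtained by dithering with Uniform(−½, ½) and applying the Gaussian maximum-entropy bound. A quick numerical check on a discretized Gaussian:

```python
import numpy as np

def entropy_bits(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Discretized (integer-sampled) Gaussian as a test integer-valued distribution.
sigma = 2.5
k = np.arange(-60, 61)
p = np.exp(-k**2 / (2 * sigma**2))
p /= p.sum()

var = np.sum(p * k**2) - np.sum(p * k)**2
H = entropy_bits(p)
bound = 0.5 * np.log2(2 * np.pi * np.e * (var + 1/12))
print(f"H = {H:.4f} bits  <=  bound = {bound:.4f} bits")
```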