Information measures, inequalities and performance bounds for parameter estimation in impulsive noise environments

J Fahs, I Abou-Faycal - IEEE Transactions on Information …, 2017 - ieeexplore.ieee.org
Recent studies found that many channels are affected by additive noise that is impulsive in nature and is best modeled by heavy-tailed symmetric alpha-stable distributions. Dealing with impulsive noise environments comes with added complexity with respect to the standard Gaussian environment: the alpha-stable probability density functions do not possess closed-form expressions except in a few special cases. Furthermore, they have an infinite second moment, and the "nice" Hilbert space structure of the space of random variables with finite second moment is lost along with its tools and methodologies. This is indeed the case in estimation theory, where classical tools for quantifying the performance of an estimator are tightly tied to the assumption of finite-variance variables. In alpha-stable environments, expressions such as the mean square error and the Cramér-Rao bound are hence problematic. In this paper, we tackle the parameter-estimation problem in impulsive noise environments and develop novel tools tailored to alpha-stable and heavy-tailed noise, tools that coincide with the standard ones adopted in the Gaussian setup, namely a generalized "power" measure and a generalized Fisher information. We generalize known information inequalities commonly used in the Gaussian context: the de Bruijn identity, the Fisher information inequality, the isoperimetric inequality for entropies, and the Cramér-Rao bound. Additionally, we derive upper bounds on the differential entropy of independent sums having a stable component. Along the way, the new power measure is used to shed some light on the capacity of the additive alpha-stable noise channel in a setup that generalizes the linear, average-power-constrained additive white Gaussian noise channel. Our theoretical findings are paralleled with numerical evaluations of the various quantities and bounds using MATLAB packages developed for this purpose.
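Two properties the abstract leans on can be seen numerically: symmetric alpha-stable laws admit closed-form densities only in special cases (alpha = 2 is Gaussian, alpha = 1 is Cauchy), yet they are easy to sample from, and their Pareto-like tails make the second moment infinite. The sketch below (not from the paper; the function name and the choice of the Chambers-Mallows-Stuck sampler are illustrative) draws symmetric alpha-stable variates and checks the symmetry and heavy-tail behavior empirically.

```python
# Illustrative sketch, not the paper's code: draw standard symmetric
# alpha-stable variates via the Chambers-Mallows-Stuck (CMS) method.
import math
import random

def sas_sample(alpha, rng):
    """One draw from a standard symmetric alpha-stable law, alpha in (0, 2]."""
    u = rng.uniform(-math.pi / 2, math.pi / 2)   # U ~ Uniform(-pi/2, pi/2)
    w = rng.expovariate(1.0)                     # W ~ Exp(1), independent of U
    if abs(alpha - 1.0) < 1e-12:
        return math.tan(u)                       # alpha = 1: standard Cauchy
    # General symmetric CMS formula (beta = 0, alpha != 1)
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)) * \
           (math.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha)

rng = random.Random(0)
alpha = 1.5
xs = [sas_sample(alpha, rng) for _ in range(100_000)]

# Symmetry about 0: the sample median should sit near zero.
median = sorted(xs)[len(xs) // 2]

# Heavy tail P(|X| > x) ~ c * x^(-alpha): large excursions are far more
# frequent than under a Gaussian, and the empirical variance diverges as
# the sample grows -- the reason MSE and Cramér-Rao analyses break down.
tail_frac = sum(abs(x) > 10 for x in xs) / len(xs)
print(median, tail_frac)
```

For alpha = 2 the same formula reduces to 2 sin(U) sqrt(W), a zero-mean Gaussian variate, consistent with the Gaussian special case; it is the intermediate alpha values, with no closed-form density and infinite variance, that motivate the generalized power and Fisher-information measures the paper develops.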