Similar papers
November 28, 2016
In this paper, we propose generalizations of the de Bruijn identity based on extensions of the Shannon entropy, Fisher information, and their associated divergences or relative measures. The foundation of these generalizations is the class of $\phi$-entropies and divergences of Csisz\'ar (or Salicr\'u) considered in a multidimensional context, including the monodimensional case, and for several types of noisy channels characterized by a more general probabilit...
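For reference, the classical one-dimensional de Bruijn identity that such results generalize links differential entropy and Fisher information along a Gaussian perturbation; under the usual regularity assumptions on the input density it reads
$$ \frac{d}{dt}\, h\!\left(X + \sqrt{t}\, Z\right) = \frac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right), \qquad Z \sim \mathcal{N}(0,1) \ \text{independent of } X, $$
where $h$ denotes differential entropy and $J$ Fisher information.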
April 13, 2017
Consider random linear estimation with Gaussian measurement matrices and noise. One can compute infinitesimal variations of the mutual information under infinitesimal variations of the signal-to-noise ratio or of the measurement rate. We discuss how each variation is related to the minimum mean-square error and deduce that the two variations are directly connected through a very simple identity. The main technical ingredient is a new interpolation method called "sub-extensive...
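As a reference point (not taken from the excerpt above), the variation with respect to the signal-to-noise ratio in the scalar Gaussian case is the I-MMSE relation of Guo, Shamai and Verd\'u:
$$ \frac{d}{d\,\mathrm{snr}}\, I\!\left(X;\, \sqrt{\mathrm{snr}}\, X + Z\right) = \frac{1}{2}\, \mathrm{mmse}(\mathrm{snr}), \qquad Z \sim \mathcal{N}(0,1). $$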
January 20, 2017
We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code division multiple access. A number of works have considered the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli type...
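For orientation, the measurement model typically studied in this line of work (stated here with generic notation that may differ from the paper's) is
$$ \mathbf{y} = \mathbf{\Phi}\, \mathbf{s} + \mathbf{z}, \qquad \mathbf{\Phi} \in \mathbb{R}^{m \times n} \ \text{with i.i.d. Gaussian entries}, \quad \mathbf{z} \sim \mathcal{N}(0, \sigma^2 \mathbf{I}_m), $$
with measurement rate $\alpha = m/n$ and the quantity of interest being the per-component mutual information $\tfrac{1}{n} I(\mathbf{s}; \mathbf{y})$ as $n \to \infty$.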
June 5, 2020
This paper proposes a new family of lower and upper bounds on the minimum mean squared error (MMSE). The key idea is to minimize/maximize the MMSE subject to the constraint that the joint distribution of the input-output statistics lies in a Kullback-Leibler divergence ball centered at some Gaussian reference distribution. Both bounds are tight and are attained by Gaussian distributions whose mean is identical to that of the reference distribution and whose covariance matrix ...
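As an illustration of the construction (not taken from the paper): when the radius of the KL ball shrinks to zero, the feasible set reduces to the Gaussian reference itself, so both bounds collapse to the Gaussian MMSE; for a jointly Gaussian reference with correlation coefficient $\rho$ this value is
$$ \mathrm{mmse}(X \mid Y) = \sigma_X^2\left(1 - \rho^2\right). $$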
December 20, 2015
This paper quantifies the intuitive observation that adding noise reduces available information by means of non-linear strong data processing inequalities. Consider the random variables $W\to X\to Y$ forming a Markov chain, where $Y=X+Z$ with $X$ and $Z$ real-valued, independent and $X$ bounded in $L_p$-norm. It is shown that $I(W;Y) \le F_I(I(W;X))$ with $F_I(t)<t$ whenever $t>0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A ...
September 2, 2021
We investigate the problem of representing information measures in terms of the moments of the underlying random variables. First, we derive polynomial approximations of the conditional expectation operator. We then apply these approximations to bound the best mean-square error achieved by a polynomial estimator -- referred to here as the PMMSE. In Gaussian channels, the PMMSE coincides with the minimum mean-square error (MMSE) if and only if the input is either Gaussian or c...
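A minimal Monte Carlo sketch of the polynomial-estimator idea, assuming a scalar Gaussian channel with binary input and using a least-squares polynomial fit as the fixed-degree estimator (illustrative only; the paper defines the PMMSE through moments rather than samples):

import numpy as np

rng = np.random.default_rng(0)
n_samples, degree, snr = 100_000, 3, 1.0

# Scalar Gaussian channel Y = sqrt(snr) * X + N with a non-Gaussian (binary) input.
x = rng.choice([-1.0, 1.0], size=n_samples)
y = np.sqrt(snr) * x + rng.standard_normal(n_samples)

# Best degree-n polynomial estimator of X from Y, fitted by least squares;
# its mean-square error is a sample estimate of the PMMSE at that degree.
coeffs = np.polyfit(y, x, deg=degree)
pmmse_hat = np.mean((x - np.polyval(coeffs, y)) ** 2)

# For a binary input the conditional mean is tanh(sqrt(snr) * y),
# which gives a sample estimate of the true MMSE for comparison.
mmse_hat = np.mean((x - np.tanh(np.sqrt(snr) * y)) ** 2)

# The PMMSE is at least the MMSE; here the gap is strict because tanh is not a polynomial.
print(pmmse_hat, mmse_hat)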
September 3, 2016
Recent studies have found that many channels are affected by additive noise that is impulsive in nature and is best explained by heavy-tailed symmetric alpha-stable distributions. Dealing with impulsive noise environments comes with added complexity relative to the standard Gaussian environment: alpha-stable probability density functions have an infinite second moment, and the "nice" Hilbert space structure of the space of random variables having a finite second moment i...
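A short sketch of drawing symmetric alpha-stable noise, assuming SciPy's levy_stable distribution (parameter names as in scipy.stats; treat this as illustrative):

import numpy as np
from scipy.stats import levy_stable

alpha = 1.5   # characteristic exponent, 0 < alpha <= 2 (alpha = 2 recovers the Gaussian)
beta = 0.0    # beta = 0 gives a symmetric distribution
noise = levy_stable.rvs(alpha, beta, loc=0.0, scale=1.0, size=100_000, random_state=0)

# For alpha < 2 the second moment is infinite, so the empirical variance does not
# stabilize as the sample grows, while fractional moments E|X|^p with p < alpha stay finite.
print(np.var(noise), np.mean(np.abs(noise) ** 1.0))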
February 7, 2017
One of the most fundamental questions one can ask about a pair of random variables X and Y is the value of their mutual information. Unfortunately, this task is often stymied by the extremely large dimension of the variables. We might hope to replace each variable by a lower-dimensional representation that preserves the relationship with the other variable. The theoretically ideal implementation is the use of minimal sufficient statistics, where it is well-known that either X...
January 4, 2021
The Kalman filter is the best linear unbiased state estimator. It can also be understood from the point of view of Bayesian estimation. This note, however, gives a detailed derivation of the Kalman filter from the mutual information perspective for the first time. We then extend this result to the R\'enyi mutual information. Finally, we conclude that the measurement update of the Kalman filter is the key step in minimizing the uncertainty of the state of the dynamical system.
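A minimal sketch of the measurement update whose uncertainty-reducing role the note emphasizes (standard Kalman equations, with generic symbols H, R, P not tied to the note's notation):

import numpy as np

def measurement_update(x_prior, P_prior, z, H, R):
    """One Kalman measurement update: returns the posterior mean and covariance."""
    S = H @ P_prior @ H.T + R                  # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)   # corrected state estimate
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post

# Example: a 2D state observed through its first component.
x0, P0 = np.zeros(2), np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.5]])
x1, P1 = measurement_update(x0, P0, np.array([0.3]), H, R)

# The posterior covariance never exceeds the prior one (in the Loewner order),
# which is the sense in which the update reduces the state uncertainty.
print(np.trace(P1) <= np.trace(P0))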
July 6, 2016
The problem of estimating an arbitrary random vector from its observation corrupted by additive white Gaussian noise, where the cost function is taken to be the Minimum Mean $p$-th Error (MMPE), is considered. The classical Minimum Mean Square Error (MMSE) is a special case of the MMPE. Several bounds, properties and applications of the MMPE are derived and discussed. The optimal MMPE estimator is found for Gaussian and binary input distributions. Properties of the MMPE as a ...
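For orientation (stated with a generic normalization; the paper's exact definition may differ), the MMPE cost replaces the squared error with a $p$-th power,
$$ \mathrm{mmpe}_p(X \mid Y) = \inf_{f} \mathbb{E}\!\left[ \| X - f(Y) \|^p \right], $$
so that $p = 2$ recovers the MMSE, while for general $p$ the optimal estimator is no longer the conditional mean.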