ID: math/0409548

On mutual information, likelihood-ratios and estimation error for the additive Gaussian channel

September 28, 2004


Similar papers 4

Generalization of the de Bruijn's identity to general $\phi$-entropies and $\phi$-Fisher informations

November 28, 2016

85% Match
Irene Valero Toranzo, Steeve Zozor, Jean-Marc Brossier
Information Theory

In this paper, we propose generalizations of the de Bruijn's identities based on extensions of the Shannon entropy, Fisher information and their associated divergences or relative measures. The foundation of these generalizations is the $\phi$-entropies and divergences of the Csisz\'ar's class (or Salicr\'u's class) considered within a multidimensional context, including the monodimensional case, and for several types of noisy channels characterized by a more general probabilit...
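For orientation (standard background, not quoted from the abstract above), the classical scalar de Bruijn identity that such works generalize reads

$$\frac{d}{dt}\, h\!\left(X+\sqrt{t}\,Z\right) \;=\; \frac{1}{2}\, J\!\left(X+\sqrt{t}\,Z\right), \qquad Z\sim\mathcal{N}(0,1)\ \text{independent of } X,$$

where $h$ is the differential entropy and $J$ the Fisher information of the noisy output; the generalizations above replace $h$ and $J$ by $\phi$-entropies and $\phi$-Fisher informations.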


I-MMSE relations in random linear estimation and a sub-extensive interpolation method

April 13, 2017

85% Match
Jean Barbier, Nicolas Macris
Information Theory
Disordered Systems and Neural Networks

Consider random linear estimation with Gaussian measurement matrices and noise. One can compute infinitesimal variations of the mutual information under infinitesimal variations of the signal-to-noise ratio or of the measurement rate. We discuss how each variation is related to the minimum mean-square error and deduce that the two variations are directly connected through a very simple identity. The main technical ingredient is a new interpolation method called "sub-extensive...
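For reference, the scalar I-MMSE identity underlying these infinitesimal variations (the standard Guo-Shamai-Verd\'u form, not quoted from the paper) is

$$\frac{d}{d\,\mathsf{snr}}\, I\!\left(X;\sqrt{\mathsf{snr}}\,X+Z\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathsf{snr}), \qquad \mathrm{mmse}(\mathsf{snr}) \;=\; \mathbb{E}\!\left[\left(X-\mathbb{E}[X\mid Y]\right)^{2}\right],$$

with $Z\sim\mathcal{N}(0,1)$ independent of $X$; the paper relates variations of this kind, taken in the signal-to-noise ratio and in the measurement rate, through a single identity.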


Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation

January 20, 2017

85% Match
Jean Barbier, Nicolas Macris, ... , Florent Krzakala
Information Theory
Disordered Systems and Neural Networks
Mathematical Physics

We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code division multiple access. There have been a number of works considering the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli type...
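A minimal sketch of the measurement model in question (standard random linear estimation notation, not copied from the abstract): a signal $\mathbf{x}\in\mathbb{R}^{n}$ is observed through

$$\mathbf{y} \;=\; \mathbf{\Phi}\,\mathbf{x} + \mathbf{z},$$

where $\mathbf{\Phi}\in\mathbb{R}^{m\times n}$ has i.i.d. Gaussian entries and $\mathbf{z}$ is Gaussian noise; the quantities of interest are the mutual information per component $\tfrac{1}{n}I(\mathbf{X};\mathbf{Y})$ and the associated MMSE.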


MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution

June 5, 2020

85% Match
Michael Fauß, Alex Dytso, H. Vincent Poor
Information Theory
Other Statistics

This paper proposes a new family of lower and upper bounds on the minimum mean squared error (MMSE). The key idea is to minimize/maximize the MMSE subject to the constraint that the joint distribution of the input-output statistics lies in a Kullback-Leibler divergence ball centered at some Gaussian reference distribution. Both bounds are tight and are attained by Gaussian distributions whose mean is identical to that of the reference distribution and whose covariance matrix ...
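As a point of comparison (a standard fact, not taken from the paper), when the reference itself is zero-mean jointly Gaussian with $Y=X+N$, $X\sim\mathcal{N}(0,\sigma_X^{2})$ and $N\sim\mathcal{N}(0,\sigma_N^{2})$ independent, the MMSE is attained by a linear estimator:

$$\mathbb{E}[X\mid Y] \;=\; \frac{\sigma_X^{2}}{\sigma_X^{2}+\sigma_N^{2}}\,Y, \qquad \mathrm{mmse} \;=\; \frac{\sigma_X^{2}\,\sigma_N^{2}}{\sigma_X^{2}+\sigma_N^{2}};$$

the bounds above perturb this Gaussian baseline within a Kullback-Leibler divergence ball around the reference distribution.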


Strong Data Processing Inequalities for Input Constrained Additive Noise Channels

December 20, 2015

85% Match
Flavio P. Calmon, Yury Polyanskiy, Yihong Wu
Information Theory

This paper quantifies the intuitive observation that adding noise reduces available information by means of non-linear strong data processing inequalities. Consider the random variables $W\to X\to Y$ forming a Markov chain, where $Y=X+Z$ with $X$ and $Z$ real-valued, independent and $X$ bounded in $L_p$-norm. It is shown that $I(W;Y) \le F_I(I(W;X))$ with $F_I(t)<t$ whenever $t>0$, if and only if $Z$ has a density whose support is not disjoint from any translate of itself. A ...
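For context (standard background, not from the abstract), the ordinary data processing inequality for the chain $W\to X\to Y$ only gives

$$I(W;Y)\;\le\; I(W;X),$$

which corresponds to the trivial choice $F_I(t)=t$; the strong version above sharpens this to a strict contraction $F_I(t)<t$ for every $t>0$ under the stated condition on the support of the noise density.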


Measuring Information from Moments

September 2, 2021

85% Match
Wael Alghamdi, Flavio P. Calmon
Information Theory
Probability

We investigate the problem of representing information measures in terms of the moments of the underlying random variables. First, we derive polynomial approximations of the conditional expectation operator. We then apply these approximations to bound the best mean-square error achieved by a polynomial estimator -- referred to here as the PMMSE. In Gaussian channels, the PMMSE coincides with the minimum mean-square error (MMSE) if and only if the input is either Gaussian or c...
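One compact way to write the quantity being bounded (notation mine, not the paper's): for a degree-$n$ polynomial estimator,

$$\mathrm{PMMSE}_{n}(X\mid Y) \;=\; \min_{p:\ \deg p\le n}\ \mathbb{E}\!\left[\left(X-p(Y)\right)^{2}\right],$$

so $\mathrm{PMMSE}_{1}$ is the linear MMSE $\operatorname{Var}(X)-\operatorname{Cov}(X,Y)^{2}/\operatorname{Var}(Y)$, and $\mathrm{PMMSE}_{n}$ is non-increasing in $n$ and, under suitable moment conditions, converges to the MMSE as the degree grows.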


Information Measures, Inequalities and Performance Bounds for Parameter Estimation in Impulsive Noise Environments

September 3, 2016

85% Match
Jihad Fahs, Ibrahim Abou-Faycal
Information Theory

Recent studies found that many channels are affected by additive noise that is impulsive in nature and is best explained by heavy-tailed symmetric alpha-stable distributions. Dealing with impulsive noise environments comes with an added complexity with respect to the standard Gaussian environment: the alpha-stable probability density functions have an infinite second moment and the "nice" Hilbert space structure of the space of random variables having a finite second moment i...
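For orientation (a standard property, not quoted from the abstract): a symmetric $\alpha$-stable law has characteristic function

$$\varphi(t) \;=\; \exp\!\left(-\gamma^{\alpha}|t|^{\alpha}\right), \qquad 0<\alpha\le 2,$$

and for $\alpha<2$ its second moment is infinite, which is precisely what breaks the finite-variance Hilbert-space machinery the abstract refers to.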


Trimming the Independent Fat: Sufficient Statistics, Mutual Information, and Predictability from Effective Channel States

February 7, 2017

85% Match
Ryan G. James, John R. Mahoney, James P. Crutchfield
Statistical Mechanics
Information Theory
Chaotic Dynamics
Machine Learning

One of the most fundamental questions one can ask about a pair of random variables X and Y is the value of their mutual information. Unfortunately, this task is often stymied by the extremely large dimension of the variables. We might hope to replace each variable by a lower-dimensional representation that preserves the relationship with the other variable. The theoretically ideal implementation is the use of minimal sufficient statistics, where it is well-known that either X...


Kalman Filter from the Mutual Information Perspective

January 4, 2021

84% Match
Yarong Luo, Jianlang Hu, Chi Guo
Information Theory

The Kalman filter is the best linear unbiased state estimator. It can also be understood from the point of view of Bayesian estimation. However, this note gives a detailed derivation of the Kalman filter from the mutual information perspective for the first time. Then we extend this result to the R\'enyi mutual information. Finally, we draw the conclusion that the measurement update of the Kalman filter is the key step in minimizing the uncertainty of the state of the dynamical system.
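For reference, the measurement-update step referred to here, in standard textbook notation (not taken from the note itself), is

$$K_{k} = P_{k|k-1}H^{\top}\!\left(HP_{k|k-1}H^{\top}+R\right)^{-1}, \quad \hat{x}_{k|k} = \hat{x}_{k|k-1}+K_{k}\!\left(z_{k}-H\hat{x}_{k|k-1}\right), \quad P_{k|k} = \left(I-K_{k}H\right)P_{k|k-1},$$

and the note's conclusion is that this update is what reduces the uncertainty of the state estimate.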


On the Minimum Mean $p$-th Error in Gaussian Noise Channels and its Applications

July 6, 2016

84% Match
Alex Dytso, Ronit Bustin, Daniela Tuninetti, Natasha Devroye, ... , Shlomo Shamai (Shitz)
Information Theory

The problem of estimating an arbitrary random vector from its observation corrupted by additive white Gaussian noise, where the cost function is taken to be the Minimum Mean $p$-th Error (MMPE), is considered. The classical Minimum Mean Square Error (MMSE) is a special case of the MMPE. Several bounds, properties and applications of the MMPE are derived and discussed. The optimal MMPE estimator is found for Gaussian and binary input distributions. Properties of the MMPE as a ...
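One common way to write the cost in question (the paper's exact normalization may differ): for $Y=\sqrt{\mathsf{snr}}\,X+Z$ with standard Gaussian noise $Z$ independent of $X$,

$$\mathrm{MMPE}(X,\mathsf{snr},p) \;=\; \inf_{f}\ \mathbb{E}\!\left[\,\|X-f(Y)\|^{p}\,\right],$$

with the infimum over measurable estimators $f$; taking $p=2$ recovers the MMSE, which is the special case mentioned in the abstract.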
