ID: math/0409548

On mutual information, likelihood-ratios and estimation error for the additive Gaussian channel

September 28, 2004

Similar papers 2

On Regret of Parametric Mismatch in Minimum Mean Square Error Estimation

June 7, 2010

87% Match
Majid Fozunbal
Information Theory

This paper studies the effect of parametric mismatch in minimum mean square error (MMSE) estimation. In particular, we consider the problem of estimating the input signal from the output of an additive white Gaussian channel whose gain is fixed, but unknown. The input distribution is known, and the estimation process consists of two algorithms. First, a channel estimator blindly estimates the channel gain using past observations. Second, a mismatched MMSE estimator, optimized...
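For concreteness, a minimal Monte Carlo sketch of the kind of gain mismatch the abstract describes (the binary input, unit-variance noise, and the specific gains are illustrative assumptions made here, not taken from the paper): the true channel is Y = a·X + N, the estimator is built for an assumed gain, and the excess mean-square error over the matched conditional-mean estimator is the regret.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000
a_true = 1.0     # actual (unknown) channel gain
a_hat = 1.3      # gain assumed by the mismatched estimator

x = rng.choice([-1.0, 1.0], size=n)       # equiprobable binary input (illustrative choice)
y = a_true * x + rng.standard_normal(n)   # additive unit-variance Gaussian noise

# Conditional-mean estimator of X = +/-1 from Y = a*X + N with N ~ N(0,1): E[X|Y] = tanh(a*Y).
mse_matched = np.mean((x - np.tanh(a_true * y)) ** 2)
mse_mismatched = np.mean((x - np.tanh(a_hat * y)) ** 2)

print(f"matched MSE    ~ {mse_matched:.4f}")
print(f"mismatched MSE ~ {mse_mismatched:.4f}")
print(f"regret         ~ {mse_mismatched - mse_matched:.4f}")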

Derivatives of mutual information in Gaussian channels

March 4, 2023

86% Match
Minh-Toan Nguyen
Information Theory

The I-MMSE formula connects two important quantities in information theory and estimation theory. It states that in a Gaussian channel, the derivative of the mutual information is one-half of the minimum mean-squared error. Higher derivatives of the mutual information are related to estimation errors of higher moments; however, a general formula is unknown. In this paper, we derive a general formula for the derivatives of mutual information between inputs and outputs of multipl...
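A quick numerical check of the first-derivative (I-MMSE) relation quoted above, for the scalar channel Y = sqrt(snr)·X + N with standard Gaussian N and equiprobable X = ±1 (the input choice and the quadrature-based evaluation are assumptions for illustration): the finite-difference derivative of I(snr) should match mmse(snr)/2.

import numpy as np

# Gauss-Hermite rule: E[f(Z)] = (1/sqrt(pi)) * sum_i w_i * f(sqrt(2) * x_i) for Z ~ N(0,1)
nodes, weights = np.polynomial.hermite.hermgauss(80)
z = np.sqrt(2.0) * nodes
w = weights / np.sqrt(np.pi)

def mmse(snr):
    # For equiprobable X = +/-1, E[X|Y] = tanh(sqrt(snr)*Y); conditioning on X = 1
    # gives mmse(snr) = 1 - E[tanh^2(snr + sqrt(snr)*Z)].
    return 1.0 - np.sum(w * np.tanh(snr + np.sqrt(snr) * z) ** 2)

def mutual_info(snr):
    # I(snr) = log 2 - E[log(1 + exp(-2*snr - 2*sqrt(snr)*Z))], in nats.
    return np.log(2.0) - np.sum(w * np.log1p(np.exp(-2.0 * snr - 2.0 * np.sqrt(snr) * z)))

for snr in (0.25, 1.0, 4.0):
    h = 1e-4
    d_info = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
    print(f"snr={snr:4.2f}   dI/dsnr={d_info:.6f}   mmse/2={0.5 * mmse(snr):.6f}")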

Information and estimation in Fokker-Planck channels

February 13, 2017

86% Match
Andre Wibisono, Varun Jog, Po-Ling Loh
Information Theory
Statistics Theory

We study the relationship between information- and estimation-theoretic quantities in time-evolving systems. We focus on the Fokker-Planck channel defined by a general stochastic differential equation, and show that the time derivatives of entropy, KL divergence, and mutual information are characterized by estimation-theoretic quantities involving an appropriate generalization of the Fisher information. Our results vastly extend De Bruijn's identity and the classical I-MMSE r...
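For reference, the classical scalar form of De Bruijn's identity that the abstract says these results extend (standard statement, not taken from the paper itself):

\[
\frac{d}{dt}\, h\!\left(X + \sqrt{t}\,N\right) \;=\; \frac{1}{2}\, J\!\left(X + \sqrt{t}\,N\right), \qquad N \sim \mathcal{N}(0,1)\ \text{independent of } X,
\]

where h denotes differential entropy and J the Fisher information of the law of X + sqrt(t) N.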

Extensions of the I-MMSE Relationship to Gaussian Channels with Feedback and Memory

January 15, 2014

86% Match
Guangyue Han, Jian Song
Information Theory

Unveiling a fundamental link between information theory and estimation theory, the I-MMSE relationship by Guo, Shamai and Verdú (2005), together with its numerous extensions, has great theoretical significance and various practical applications. On the other hand, its influence to date has been restricted to channels without feedback or memory, due to the absence of extensions to such channels. In this paper, we propose extensions of the I-MMSE relationship to disc...

Mutual Information, Relative Entropy and Estimation Error in Semi-martingale Channels

April 18, 2017

86% Match
Jiantao Jiao, Kartik Venkat, Tsachy Weissman
Information Theory

Fundamental relations between information and estimation have been established in the literature for the continuous-time Gaussian and Poisson channels, in a long line of work starting from the classical representation theorems by Duncan and Kabanov respectively. In this work, we demonstrate that such relations hold for a much larger family of continuous-time channels. We introduce the family of semi-martingale channels where the channel output is a semi-martingale stochastic ...
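For context, Duncan's classical representation for the continuous-time white Gaussian channel dY_t = X_t dt + dW_t on [0, T] (standard form; the paper's semi-martingale setting generalizes this):

\[
I\!\left(X_0^T; Y_0^T\right) \;=\; \frac{1}{2}\,\mathbb{E}\!\int_0^T \left(X_t - \mathbb{E}\!\left[X_t \mid Y_0^t\right]\right)^2 dt .
\]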

Information, Estimation, and Lookahead in the Gaussian channel

February 8, 2013

86% Match
Kartik Venkat, Tsachy Weissman, ... , Shlomo Shamai
Information Theory

We consider mean squared estimation with lookahead of a continuous-time signal corrupted by additive white Gaussian noise. We show that the mutual information rate function, i.e., the mutual information rate as a function of the signal-to-noise ratio (SNR), does not, in general, determine the minimum mean squared error (MMSE) with fixed finite lookahead, in contrast to the special cases with 0 and infinite lookahead (filtering and smoothing errors), respectively, which were pre...
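In the notation suggested by the abstract (the precise symbols are an assumption here), the finite-lookahead MMSE for the channel dY_t = sqrt(snr)·X_t dt + dW_t is

\[
\mathrm{mmse}(\mathrm{snr}, d) \;=\; \lim_{T\to\infty} \frac{1}{T}\int_0^T \mathbb{E}\!\left[\left(X_t - \mathbb{E}\!\left[X_t \mid Y_0^{\,t+d}\right]\right)^2\right] dt,
\]

with d = 0 recovering the filtering error and d → ∞ the smoothing error.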

Distribution of Mutual Information

November 9, 2000

86% Match
Henry D. I. Abarbanel, Naoki Masuda, ... , Evren Tumer
Chaotic Dynamics

In the analysis of time series from nonlinear sources, mutual information (MI) is used as a nonlinear statistical criterion for the selection of an appropriate time delay in time delay reconstruction of the state space. MI is a statistic over the sets of sequences associated with the dynamical source, and we examine here the distribution of MI, thus going beyond the familiar analysis of its average alone. We give for the first time the distribution of MI for a standard, class...
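A minimal sketch of the delay-selection use of MI mentioned above, using a plug-in histogram estimator (the estimator, bin count, and test series are illustrative assumptions, not the paper's construction): one computes I(x_t ; x_{t+tau}) as a function of the delay tau and looks at how it falls off.

import numpy as np

def mi_delay(x, tau, bins=32):
    # Plug-in (histogram) estimate of I(x_t ; x_{t+tau}) in nats.
    a, b = x[:-tau], x[tau:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

# Illustrative series: a noisy sine; MI falls off as the delay grows.
t = np.arange(20_000)
x = np.sin(0.05 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)

for tau in (1, 10, 30, 60):
    print(f"tau={tau:3d}   MI={mi_delay(x, tau):.3f}")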

The Mutual Information in Random Linear Estimation

July 8, 2016

86% Match
Jean Barbier, Mohamad Dia, ... , Florent Krzakala
Information Theory
Mathematical Physics

We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections, a problem relevant to compressed sensing, sparse superposition codes, and code division multiple access, to name just a few. There have been a number of works considering the mutual information for this problem using the heuristic replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-type interpolatio...
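The observation model in this line of work is usually written as follows (the normalization of the measurement matrix is an assumption here; conventions differ across papers):

\[
\mathbf{y} \;=\; \boldsymbol{\Phi}\,\mathbf{x} + \sigma\,\mathbf{z}, \qquad \Phi_{\mu i} \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, 1/n), \quad \mathbf{z} \sim \mathcal{N}(0, \mathbf{I}_m),
\]

with the quantity of interest being the per-variable mutual information I(\mathbf{x}; \mathbf{y})/n in the large-system limit.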

Fundamental limits of low-rank matrix estimation: the non-symmetric case

February 1, 2017

86% Match
Léo Miolane
Probability

We consider the high-dimensional inference problem where the signal is a low-rank matrix corrupted by additive Gaussian noise. Given a probabilistic model for the low-rank matrix, we compute the limit, in the large-dimension setting, of the mutual information between the signal and the observations, as well as the matrix minimum mean square error, while the rank of the signal remains constant. This allows us to locate the information-theoretic threshold for this estim...
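The non-symmetric spiked model studied in this setting is commonly normalized as follows (the exact scaling is an assumption stated here only to fix ideas):

\[
\mathbf{Y} \;=\; \sqrt{\frac{\lambda}{n}}\,\mathbf{U}\mathbf{V}^{\top} + \mathbf{Z}, \qquad \mathbf{U}\in\mathbb{R}^{n\times k},\ \mathbf{V}\in\mathbb{R}^{m\times k},\ Z_{ij}\overset{\text{i.i.d.}}{\sim}\mathcal{N}(0,1),
\]

with the rank k held fixed while n, m grow, and the limits of I(\mathbf{U}\mathbf{V}^{\top}; \mathbf{Y})/n and of the matrix MMSE are the quantities computed.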

Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error

April 20, 2010

85% Match
Dongning Guo, Yihong Wu, ... , Sergio Verdú
Information Theory

Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all po...
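The concavity statement in the abstract reads, written out: for any SNR, input laws P_0 and P_1, and α ∈ [0, 1],

\[
\mathrm{mmse}\!\left(\alpha P_0 + (1-\alpha) P_1,\ \mathrm{snr}\right) \;\ge\; \alpha\,\mathrm{mmse}(P_0, \mathrm{snr}) + (1-\alpha)\,\mathrm{mmse}(P_1, \mathrm{snr}),
\]

intuitively because an estimator facing the mixture input does not know which component the input was drawn from, and so cannot do better than one that is told.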
