September 28, 2004
This paper considers the model of an arbitrarily distributed signal x observed through additive independent white Gaussian noise w, y=x+w. New relations between the minimum mean square error of the non-causal estimator and the likelihood ratio between y and w are derived. This is followed by an extended version of a recently derived relation between the mutual information I(x;y) and the minimum mean square error. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. The derivation of the results is based on the Malliavin calculus.
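As a point of reference, the classical finite-dimensional versions of these relations read as follows (notation assumed here for illustration, not taken from the paper): for $y = x + w$ with $w \sim \mathcal{N}(0, I_n)$ independent of $x$, the conditional-mean estimator is the gradient of the log-likelihood ratio $l(y) = \frac{d\mu_y}{d\mu_w}(y)$, and the de Bruijn identity ties differential entropy to Fisher information:
\[
E[x \mid y] = \nabla \log l(y), \qquad \frac{d}{dt}\, h\bigl(x + \sqrt{t}\, w\bigr) = \frac{1}{2}\, J\bigl(x + \sqrt{t}\, w\bigr).
\]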
Similar papers
December 23, 2004
This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It shows a new formula that connects the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output. That is, the derivative of the mutual information (nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This rel...
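For reference, the formula in question reads, in the scalar case and under the standard normalization $Y = \sqrt{\mathrm{snr}}\, X + N$ with $N \sim \mathcal{N}(0,1)$ independent of $X$:
\[
\frac{d}{d\,\mathrm{snr}}\, I\bigl(X; \sqrt{\mathrm{snr}}\, X + N\bigr) = \frac{1}{2}\, \mathrm{mmse}(\mathrm{snr}) = \frac{1}{2}\, E\Bigl[\bigl(X - E[X \mid Y]\bigr)^2\Bigr].
\]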
October 7, 2010
I present several new relations between mutual information (MI) and statistical estimation error for a system that can be regarded simultaneously as a communication channel and as an estimator of an input parameter. I first derive a second-order relation between MI and Fisher information (FI) that is valid for sufficiently narrow priors, but arbitrary channels. A second relation furnishes a lower bound on the MI in terms of the minimum mean-squared error (MMSE) on the Bayesian ...
October 30, 2011
Many of the classical and recent relations between information and estimation in the presence of Gaussian noise can be viewed as identities between expectations of random quantities. These include the I-MMSE relationship of Guo et al.; the relative entropy and mismatched estimation relationship of Verdú; the relationship between causal estimation and mutual information of Duncan, and its extension to the presence of feedback by Kadota et al.; the relationship between caus...
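As background, Duncan's identity, one of the classical relations referred to above, equates mutual information with the time-integrated causal MMSE for the continuous-time white-noise channel $dY_t = X_t\, dt + dW_t$ (unit-SNR normalization assumed here):
\[
I\bigl(X_0^T; Y_0^T\bigr) = \frac{1}{2} \int_0^T E\Bigl[\bigl(X_t - E[X_t \mid Y_0^t]\bigr)^2\Bigr]\, dt.
\]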
November 17, 2009
In continuation of a recent work on the statistical-mechanical analysis of minimum mean square error (MMSE) estimation in Gaussian noise via its relation to the mutual information (the I-MMSE relation), here we propose a simple and more direct relationship between optimum estimation and certain information measures (e.g., the information density and the Fisher information), which can be viewed as partition functions and hence are amenable to analysis using statistical-mecha...
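A sketch of the partition-function viewpoint in the scalar Gaussian case, under assumed notation and not as the paper's own derivation: for $Y = X + \sigma N$ with $N \sim \mathcal{N}(0,1)$, the output density $f_Y(y) = E_X\bigl[\varphi_\sigma(y - X)\bigr]$ plays the role of a partition function, the optimum estimator is its logarithmic derivative, and the Fisher information $J(Y)$ of the output determines the MMSE:
\[
E[X \mid Y = y] = y + \sigma^2 \frac{d}{dy} \log f_Y(y), \qquad \mathrm{mmse}(X \mid Y) = \sigma^2 - \sigma^4 J(Y).
\]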
September 30, 2006
The model considered is that of "signal plus white noise." Known connections between the noncausal filtering error and mutual information are combined with new ones involving the causal estimation error, in a general abstract setup. The results are shown to be invariant under a wide class of causality patterns; they are applied to the derivation of the causal estimation error of a Gaussian nonstationary filtering problem and to a multidimensional extension of the Yovits-Ja...
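One known connection of the kind being combined here, quoted as background for the continuous-time Gaussian channel under the standard normalization: the causal estimation error at a given SNR is the noncausal error averaged over all lower SNRs,
\[
\mathrm{cmmse}(\mathrm{snr}) = \frac{1}{\mathrm{snr}} \int_0^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, d\gamma.
\]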
February 29, 2008
We consider linear time-varying channels with additive white Gaussian noise. For a large class of such channels we derive rigorous estimates of the eigenvalues of the correlation matrix of the effective channel in terms of the sampled time-varying transfer function and, thus, provide a theoretical justification for a relationship that has been frequently observed in the literature. We then use this eigenvalue estimate to derive an estimate of the mutual information of the cha...
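As a numerical illustration of how an eigenvalue estimate of this kind feeds into a mutual-information estimate, here is a minimal sketch, assuming an i.i.d. Gaussian input and a toy channel matrix (the names and setup are illustrative, not the paper's construction): for a linear Gaussian channel $y = Hx + n$ the mutual information in nats is $\frac{1}{2} \sum_i \log(1 + P \lambda_i / \sigma^2)$, with $\lambda_i$ the eigenvalues of the correlation matrix $H^\top H$.

    import numpy as np

    # Minimal sketch (assumed setup): MI of y = H x + n with x ~ N(0, P I)
    # and n ~ N(0, sigma2 I), computed from the eigenvalues of H^T H.
    def mutual_information_nats(H, P=1.0, sigma2=1.0):
        corr = H.T @ H                          # effective channel correlation matrix
        eigvals = np.linalg.eigvalsh(corr)      # real eigenvalues of a symmetric matrix
        eigvals = np.clip(eigvals, 0.0, None)   # guard against tiny negative rounding
        return 0.5 * np.sum(np.log1p(P * eigvals / sigma2))

    rng = np.random.default_rng(0)
    H = rng.standard_normal((64, 32)) / np.sqrt(64)  # toy stand-in for the channel
    print(mutual_information_nats(H, P=10.0, sigma2=1.0))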
September 8, 2007
In recent years, infinite-dimensional methods have been introduced for estimation over Gaussian channels. The aim of this paper is to study the application of similar methods to Poisson channels. In particular, we compute the Bayesian estimator of a Poisson channel using the likelihood ratio and the discrete Malliavin gradient. This algorithm is suitable for numerical implementation via a Monte-Carlo scheme. As an application we provide a new proof of the formula obtained re...
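A minimal Monte-Carlo sketch of the posterior-mean (Bayesian) estimator for a Poisson channel, using the generic self-normalized scheme $E[X \mid Y = y] = E[X\, p(y \mid X)] / E[p(y \mid X)]$ rather than the paper's Malliavin-gradient algorithm; the Gamma prior and all names here are assumptions for illustration:

    import numpy as np

    # Generic Monte-Carlo posterior mean for Y | X = x ~ Poisson(x).
    # Not the paper's algorithm; prior and names are illustrative assumptions.
    def poisson_likelihood(y, x):
        # Unnormalized in y: the y! factor cancels in the ratio below.
        return np.exp(-x) * x ** y

    def posterior_mean(y, prior_sampler, n_samples=100_000, seed=0):
        rng = np.random.default_rng(seed)
        x = prior_sampler(rng, n_samples)   # draws from the prior on X
        w = poisson_likelihood(y, x)        # importance weights p(y | x)
        return np.sum(w * x) / np.sum(w)

    # Gamma(shape=2, rate=1) prior is conjugate, so the exact posterior mean
    # is (2 + y) / (1 + 1); the Monte-Carlo value should be close to 2.5.
    gamma_prior = lambda rng, n: rng.gamma(shape=2.0, scale=1.0, size=n)
    print(posterior_mean(y=3, prior_sampler=gamma_prior))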
June 24, 2015
Unveiling a fundamental link between information theory and estimation theory, the I-MMSE relation of Guo, Shamai and Verdú \cite{gu05}, together with its numerous extensions, has great theoretical significance and various practical applications. On the other hand, its influence to date has been restricted to channels without feedback or memory, due to the absence of extensions to such channels. In this paper, we propose extensions of the I-MMSE relation to discrete-tim...
December 29, 2008
We consider the problem of signal estimation (denoising) from a statistical-mechanical perspective, using a relationship between the minimum mean square error (MMSE) of estimating a signal and the mutual information between this signal and its noisy version. The paper consists of essentially two parts. In the first, we derive several statistical-mechanical relationships between a few important quantities in this problem area, such as the MMSE, the differential entropy, the ...
February 11, 2021
We consider the processing of statistical samples $X\sim P_\theta$ by a channel $p(y|x)$, and characterize how the statistical information from the samples for estimating the parameter $\theta\in\mathbb{R}^d$ can scale with the mutual information or capacity of the channel. We show that if the statistical model has a sub-Gaussian score function, then the trace of the Fisher information matrix for estimating $\theta$ from $Y$ can scale at most linearly with the mutual informat...