Similar papers
June 7, 2010
This paper studies the effect of parametric mismatch in minimum mean square error (MMSE) estimation. In particular, we consider the problem of estimating the input signal from the output of an additive white Gaussian channel whose gain is fixed, but unknown. The input distribution is known, and the estimation process consists of two algorithms. First, a channel estimator blindly estimates the channel gain using past observations. Second, a mismatched MMSE estimator, optimized...
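The estimator structure is truncated above, but the flavor of the mismatch penalty is easy to sketch. Assuming a standard Gaussian input, for which the MMSE estimator under an assumed gain a is the linear map E[X|Y] = a/(a^2+1)·Y, plugging a blindly estimated gain into that form in place of the true one gives (the gain values here are hypothetical, not from the paper):

    import numpy as np

    rng = np.random.default_rng(0)
    a_true, a_hat = 1.0, 0.5          # true channel gain vs. a (poor) blind estimate
    x = rng.standard_normal(1_000_000)             # Gaussian input, known distribution
    y = a_true * x + rng.standard_normal(x.size)   # unit-variance AWGN

    # MMSE estimator for a Gaussian input under an assumed gain a: E[X|Y] = a/(a^2+1)*Y.
    x_matched    = a_true / (a_true**2 + 1) * y    # knows the true gain
    x_mismatched = a_hat  / (a_hat**2  + 1) * y    # plugs in the blind estimate

    print("matched MSE:   ", np.mean((x - x_matched) ** 2))     # ~ 1/(1+a_true^2) = 0.50
    print("mismatched MSE:", np.mean((x - x_mismatched) ** 2))  # ~ 0.52, strictly larger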
March 4, 2023
The I-MMSE formula connects two important quantities in information theory and estimation theory. It states that in a Gaussian channel, the derivative of the mutual information with respect to the signal-to-noise ratio is one-half of the minimum mean-squared error. Higher derivatives of the mutual information are related to estimation errors of higher moments; however, a general formula is unknown. In this paper, we derive a general formula for the derivatives of mutual information between inputs and outputs of multipl...
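The Gaussian-input case gives a quick numeric sanity check of the first-order identity the abstract builds on: there I(snr) = ½ log(1+snr) and mmse(snr) = 1/(1+snr) in closed form, so a finite-difference derivative of I should equal mmse/2. A minimal sketch:

    import numpy as np

    def mutual_info(snr):
        # I(snr) for a standard Gaussian input over Y = sqrt(snr)*X + N, in nats
        return 0.5 * np.log1p(snr)

    def mmse(snr):
        # the corresponding minimum mean-squared error
        return 1.0 / (1.0 + snr)

    snr = np.linspace(0.1, 10.0, 50)
    h = 1e-5
    dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)  # central difference
    assert np.allclose(dI, 0.5 * mmse(snr), atol=1e-8)
    print("dI/dsnr matches mmse(snr)/2")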
February 13, 2017
We study the relationship between information- and estimation-theoretic quantities in time-evolving systems. We focus on the Fokker-Planck channel defined by a general stochastic differential equation, and show that the time derivatives of entropy, KL divergence, and mutual information are characterized by estimation-theoretic quantities involving an appropriate generalization of the Fisher information. Our results vastly extend De Bruijn's identity and the classical I-MMSE r...
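For reference, the classical identity being generalized here: with Y_t = X + \sqrt{t}\,Z, where Z is standard Gaussian and independent of X, De Bruijn's identity reads

    \frac{\mathrm{d}}{\mathrm{d}t}\, h\big(X + \sqrt{t}\,Z\big) \;=\; \frac{1}{2}\, J\big(X + \sqrt{t}\,Z\big),

where h is differential entropy and J(Y) = \mathbb{E}\big[\big(\partial_y \log f_Y(Y)\big)^2\big] is the Fisher information of the output density.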
January 15, 2014
Unveiling a fundamental link between information theory and estimation theory, the I-MMSE relationship of Guo, Shamai and Verdú (2005), together with its numerous extensions, has great theoretical significance and various practical applications. On the other hand, its influence to date has been restricted to channels without feedback or memory, owing to the absence of extensions to such channels. In this paper, we propose extensions of the I-MMSE relationship to disc...
April 18, 2017
Fundamental relations between information and estimation have been established in the literature for the continuous-time Gaussian and Poisson channels, in a long line of work starting from the classical representation theorems of Duncan and Kabanov, respectively. In this work, we demonstrate that such relations hold for a much larger family of continuous-time channels. We introduce the family of semi-martingale channels, where the channel output is a semi-martingale stochastic ...
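Duncan's representation theorem, the starting point cited here, states that for the continuous-time white Gaussian noise channel \mathrm{d}Y_t = X_t\,\mathrm{d}t + \mathrm{d}W_t,

    I\big(X_0^T; Y_0^T\big) \;=\; \frac{1}{2} \int_0^T \mathbb{E}\Big[\big(X_t - \mathbb{E}[X_t \mid Y_0^t]\big)^2\Big]\,\mathrm{d}t,

that is, the mutual information equals half the time-integrated causal (filtering) mean-squared error.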
February 8, 2013
We consider mean squared estimation with lookahead of a continuous-time signal corrupted by additive white Gaussian noise. We show that the mutual information rate function, i.e., the mutual information rate as a function of the signal-to-noise ratio (SNR), does not, in general, determine the minimum mean squared error (MMSE) with fixed finite lookahead, in contrast to the special cases of 0 and infinite lookahead (the filtering and smoothing errors, respectively), which were pre...
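One common way to write the quantity at issue: for a stationary input observed through the AWGN channel at level snr, the MMSE with lookahead d is

    \mathrm{mmse}(d, \mathrm{snr}) \;=\; \mathbb{E}\Big[\big(X_t - \mathbb{E}[X_t \mid Y_{-\infty}^{\,t+d}]\big)^2\Big],

which recovers the filtering error at d = 0 and the smoothing error as d \to \infty; the abstract concerns the intermediate regime of fixed finite d.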
November 9, 2000
In the analysis of time series from nonlinear sources, mutual information (MI) is used as a nonlinear statistical criterion for the selection of an appropriate time delay in time-delay reconstruction of the state space. MI is a statistic over the sets of sequences associated with the dynamical source, and we examine here the distribution of MI, thus going beyond the familiar analysis of its average alone. We give for the first time the distribution of MI for a standard, class...
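A minimal sketch of the selection procedure the abstract refers to, using a plug-in histogram estimate of MI and a synthetic noisy sine standing in for real data (the bin count and lag range are arbitrary choices):

    import numpy as np

    def mutual_information(x, y, bins=32):
        # Plug-in (histogram) estimate of I(X;Y) in nats.
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (bins, 1)
        py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, bins)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

    rng = np.random.default_rng(1)
    t = np.arange(20_000)
    s = np.sin(0.05 * t) + 0.1 * rng.standard_normal(t.size)  # toy time series

    delays = np.arange(1, 80)
    mi = [mutual_information(s[:-d], s[d:]) for d in delays]
    # The usual prescription takes the first local minimum of MI over the lag;
    # for this smooth toy series the minimum over the scanned range (near a
    # quarter period) coincides with it.
    print("selected delay:", int(delays[np.argmin(mi)]))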
July 8, 2016
We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections, a problem relevant in compressed sensing, sparse superposition codes, and code division multiple access, to name just a few. A number of works have considered the mutual information for this problem using the heuristic replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-type interpolatio...
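The measurement model common to these applications, under one widespread normalization (conventions vary across papers), is

    \mathbf{y} \;=\; \mathbf{A}\mathbf{x} + \sigma\,\mathbf{z}, \qquad A_{\mu i} \sim \mathcal{N}(0, 1/n) \ \text{i.i.d.},

with x drawn componentwise from a known prior and z standard Gaussian; the object studied is the mutual information I(\mathbf{x}; \mathbf{y}) per variable in the large-system limit.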
February 1, 2017
We consider the high-dimensional inference problem where the signal is a low-rank matrix corrupted by additive Gaussian noise. Given a probabilistic model for the low-rank matrix, we compute the limit, in the large-dimension setting, of the mutual information between the signal and the observations, as well as the matrix minimum mean square error, while the rank of the signal remains constant. This allows us to locate the information-theoretic threshold for this estim...
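A representative rank-one instance of this setting (the normalization shown is one common convention, not necessarily the paper's) is the spiked Wigner model

    \mathbf{Y} \;=\; \sqrt{\frac{\lambda}{n}}\, \mathbf{x}\mathbf{x}^{\top} + \mathbf{Z},

where x has i.i.d. entries from a known prior and Z is a symmetric matrix of Gaussian noise; the information-theoretic threshold is then a critical value of the signal strength \lambda.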
April 20, 2010
Consider the minimum mean-square error (MMSE) of estimating an arbitrary random variable from its observation contaminated by Gaussian noise. The MMSE can be regarded as a function of the signal-to-noise ratio (SNR) as well as a functional of the input distribution (of the random variable to be estimated). It is shown that the MMSE is concave in the input distribution at any given SNR. For a given input distribution, the MMSE is found to be infinitely differentiable at all po...
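To make the object of study concrete, here is a Monte Carlo sketch of mmse(snr) for one simple non-Gaussian input, equiprobable X = ±1, for which the conditional mean is E[X|Y] = tanh(\sqrt{snr}\,Y):

    import numpy as np

    rng = np.random.default_rng(0)

    def mmse_binary(snr, n=1_000_000):
        # X = +/-1 equiprobable observed through Y = sqrt(snr)*X + Z, Z ~ N(0,1)
        x = rng.choice([-1.0, 1.0], size=n)
        y = np.sqrt(snr) * x + rng.standard_normal(n)
        x_hat = np.tanh(np.sqrt(snr) * y)   # conditional mean E[X|Y]
        return np.mean((x - x_hat) ** 2)

    for snr in (0.25, 1.0, 4.0, 16.0):
        print(f"snr={snr:5.2f}  mmse={mmse_binary(snr):.4f}")  # smooth and decreasing in snr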