ID: math/0409548

On mutual information, likelihood-ratios and estimation error for the additive Gaussian channel

September 28, 2004


Similar papers 3

Differential Entropy of the Conditional Expectation under Additive Gaussian Noise

June 8, 2021

85% Match
Arda Atalik, Alper Köse, Michael Gastpar
Information Theory
Signal Processing

The conditional mean is a fundamental and important quantity whose applications include the theories of estimation and rate-distortion. It is also notoriously difficult to work with. This paper establishes novel bounds on the differential entropy of the conditional mean in the case of finite-variance input signals and additive Gaussian noise. The main result is a new lower bound in terms of the differential entropies of the input signal and the noisy observation. The main res...
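
As a point of reference for this setting (the notation $Y = X + N$, $N \sim \mathcal{N}(0,\sigma^2)$ below is an illustrative assumption, not necessarily the paper's), the conditional mean under additive Gaussian noise admits the classical Tweedie representation
$$\mathbb{E}[X \mid Y = y] = y + \sigma^2 \, \frac{d}{dy} \log p_Y(y),$$
and the quantity bounded in the paper is the differential entropy $h(\mathbb{E}[X \mid Y])$.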


Information Complexity and Estimation

August 4, 2011

85% Match
Dror Baron
Information Theory

We consider an input $x$ generated by an unknown stationary ergodic source $X$ that enters a signal processing system $J$, resulting in $w=J(x)$. We observe $w$ through a noisy channel, $y=z(w)$; our goal is to estimate $x$ from $y$, $J$, and knowledge of $f_{Y|W}$. This is universal estimation, because $f_X$ is unknown. We provide a formulation that describes a trade-off between information complexity and noise. Initial theoretical, algorithmic, and experimental evidence is pr...


On the Relationship between Mutual Information and Minimum Mean-Square Errors in Stochastic Dynamical Systems

October 5, 2007

85% Match
Francisco J. Piera, Patricio Parada
Information Theory

We consider a general stochastic input-output dynamical system with output evolving in time as the solution to a functional-coefficients It\^{o} stochastic differential equation, excited by an input process. This general class of stochastic systems encompasses not only the classical communication channel models, but also a wide variety of engineering systems appearing across a whole range of applications. For this general setting we find analogues of known relationships l...
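
The best-known relationship of this kind in continuous time is Duncan's theorem; in the simplest white-Gaussian-noise setting (illustrative notation, not the paper's general model), for $dY_t = X_t \, dt + dW_t$ with $W$ a standard Brownian motion,
$$I\big(X_0^T; Y_0^T\big) = \frac{1}{2} \int_0^T \mathbb{E}\Big[\big(X_t - \mathbb{E}[X_t \mid Y_0^t]\big)^2\Big] \, dt,$$
i.e., mutual information equals half the time-integrated causal MMSE.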


Gaussian Assumption: the Least Favorable but the Most Useful

November 20, 2012

85% Match
Sangwoo Park, Erchin Serpedin, Khalid Qaraqe
Information Theory

This paper focuses on three contributions. First, a connection between the result proposed by Stoica and Babu and two recent information-theoretic results, namely the worst additive noise lemma and the isoperimetric inequality for entropies, is illustrated. Second, information-theoretic and estimation-theoretic justifications for the fact that the Gaussian assumption leads to the largest Cram\'{e}r-Rao lower bound (CRLB) are presented. Third, a slight extension of this result to th...
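
A minimal sketch of the estimation-theoretic statement, in a scalar location model with assumed notation: for $Y = \theta + N$ with $\mathbb{E}[N] = 0$ and $\mathrm{Var}(N) = \sigma^2$, the Fisher information satisfies
$$\mathrm{Var}(\hat{\theta}) \ge \frac{1}{J(N)}, \qquad J(N) \ge \frac{1}{\sigma^2},$$
with equality in the second inequality if and only if $N$ is Gaussian, so the CRLB attains its largest value, $\sigma^2$, exactly under the Gaussian assumption.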


On MMSE Properties and I-MMSE Implications in Parallel MIMO Gaussian Channels

March 26, 2012

85% Match
Ronit Bustin, Miquel Payaro, ... , Shlomo Shamai Shitz
Information Theory

The scalar additive Gaussian noise channel has the "single crossing point" property between the minimum mean-square error (MMSE) in the estimation of the input given the channel output, assuming a Gaussian input to the channel, and the MMSE assuming an arbitrary input. This paper extends the result to the parallel MIMO additive Gaussian channel in three phases: i) The channel matrix is the identity matrix, and we limit the Gaussian input to a vector of Gaussian i.i.d. element...
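
For the scalar channel the ingredients are the standard I-MMSE ones (notation assumed here for illustration): with $Y = \sqrt{\mathrm{snr}}\, X + N$, $N \sim \mathcal{N}(0,1)$,
$$\frac{d}{d\,\mathrm{snr}} I(X;Y) = \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}), \qquad \mathrm{mmse}_G(\mathrm{snr}) = \frac{\sigma_X^2}{1 + \sigma_X^2\,\mathrm{snr}},$$
where $\mathrm{mmse}_G$ is the MMSE for a Gaussian input of variance $\sigma_X^2$; the single-crossing property states that $\mathrm{mmse}_G(\mathrm{snr}) - \mathrm{mmse}(\mathrm{snr})$ changes sign at most once as $\mathrm{snr}$ increases.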


Investigation of Alternative Measures for Mutual Information

February 2, 2022

85% Match
Bulut Kuskonmaz, Jaron Skovsted Gundersen, Rafal Wisniewski
Information Theory
Cryptography and Security

Mutual information $I(X;Y)$ is a useful quantity in information theory for estimating how much information the random variable $Y$ holds about the random variable $X$. One way to define the mutual information is by comparing the joint distribution of $X$ and $Y$ with the product of the marginals through the KL-divergence. If the two distributions are close to each other there will be almost no leakage of $X$ from $Y$, since the two variables are close to being independent. In t...
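
The KL-based definition referred to here is the standard one:
$$I(X;Y) = D_{\mathrm{KL}}\big(P_{XY} \,\big\|\, P_X \otimes P_Y\big) = \mathbb{E}_{P_{XY}}\!\left[\log \frac{p_{XY}(X,Y)}{p_X(X)\, p_Y(Y)}\right],$$
which is zero exactly when $X$ and $Y$ are independent.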


Information Geometric Approach to Bayesian Lower Error Bounds

January 15, 2018

85% Match
M. Ashok Kumar, Kumar Vijay Mishra
Information Theory

Information geometry describes a framework where probability densities can be viewed as differential-geometric structures. This approach has shown that the geometry in the space of probability distributions that are parameterized by their covariance matrix is linked to the fundamental concepts of estimation theory. In particular, prior work proposes a Riemannian metric - the distance between the parameterized probability distributions - that is equivalent to the Fisher Inform...
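
In its standard form, the metric in question is the Fisher information metric (illustrative notation): for a parametric family $\{p_\theta\}$,
$$g_{ij}(\theta) = \mathbb{E}_\theta\big[\partial_{\theta_i} \log p_\theta(X)\; \partial_{\theta_j} \log p_\theta(X)\big].$$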


Non-Linear Transformations of Gaussians and Gaussian-Mixtures with implications on Estimation and Information Theory

November 25, 2011

85% Match
Paolo Banelli
Information Theory
Probability
Statistics Theory

This paper investigates the statistical properties of non-linear transformations (NLT) of random variables, in order to establish useful tools for estimation and information theory. Specifically, the paper focuses on linear regression analysis of the NLT output and derives sufficient general conditions to establish when the input-output regression coefficient is equal to the \emph{partial} regression coefficient of the output with respect to an (additive) part of the input. A ...
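
For orientation (a sketch in assumed notation, not necessarily the paper's): the linear regression coefficient of the NLT output $g(X)$ on the input $X$ is
$$\alpha = \frac{\mathrm{Cov}\big(g(X), X\big)}{\mathrm{Var}(X)},$$
and for a Gaussian input $X \sim \mathcal{N}(\mu, \sigma^2)$, Stein's lemma gives $\mathrm{Cov}(g(X), X) = \sigma^2\, \mathbb{E}[g'(X)]$, hence $\alpha = \mathbb{E}[g'(X)]$ under mild regularity on $g$.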


Additivity of Information in Multilayer Networks via Additive Gaussian Noise Transforms

October 12, 2017

85% Match
Galen Reeves
Information Theory
Machine Learning

Multilayer (or deep) networks are powerful probabilistic models based on multiple stages of a linear transform followed by a non-linear (possibly random) function. In general, the linear transforms are defined by matrices and the non-linear functions are defined by information channels. These models have gained great popularity due to their ability to characterize complex probabilistic relationships arising in a wide variety of inference problems. The contribution of this pap...
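
In assumed notation, the generic structure described above is
$$x_\ell \sim P_\ell\big(\cdot \mid A_\ell\, x_{\ell-1}\big), \qquad \ell = 1, \dots, L,$$
where $x_0$ is the input, each $A_\ell$ is a matrix, and each $P_\ell$ is an information channel acting as the (possibly random) non-linearity of layer $\ell$.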


Mutual Information and Conditional Mean Prediction Error

July 26, 2014

85% Match
Clive G. Bowsher, Margaritis Voliotis
Information Theory
Probability
Statistics Theory
Biological Physics
Data Analysis, Statistics and Probability

Mutual information is fundamentally important for measuring statistical dependence between variables and for quantifying information transfer by signaling and communication mechanisms. It can, however, be challenging to evaluate for physical models of such mechanisms and to estimate reliably from data. Furthermore, its relationship to better known statistical procedures is still poorly understood. Here we explore new connections between mutual information and regression-based...
