Similar papers
March 28, 2016
We consider the estimation of an n-dimensional vector x from the knowledge of noisy and possibly non-linear element-wise measurements of xx^T, a very generic problem that contains, e.g., the stochastic 2-block model, submatrix localization, or the spike perturbation of random matrices. We use an interpolation method proposed by Guerra and later refined by Korada and Macris. We prove that the Bethe mutual information (related to the Bethe free energy and conjectured to be exact by...
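A concrete instance of the measurement model referred to here is the spiked Wigner setting (notation illustrative, not taken from the paper's truncated abstract):
\[ Y = \sqrt{\tfrac{\lambda}{n}}\, x x^{T} + W , \qquad W_{ij}=W_{ji}\sim\mathcal{N}(0,1), \]
possibly followed by an element-wise output channel $P_{\mathrm{out}}\big(y_{ij}\mid (xx^{T})_{ij}\big)$ that makes the measurements non-linear.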
February 20, 2019
In this paper, tight upper and lower bounds are derived on the weighted sum of minimum mean-squared errors for additive Gaussian noise channels. The bounds are obtained by constraining the input distribution to be close to a Gaussian reference distribution in terms of the Kullback-Leibler divergence. The distributions that attain these bounds are shown to be Gaussian, with covariance matrices defined implicitly via systems of matrix equations. Furthermore, the estimators...
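For orientation, the objects involved (in illustrative notation, not necessarily the paper's) are the additive Gaussian noise channel and its minimum mean-squared error,
\[ Y = X + N,\quad N\sim\mathcal{N}(0,\Sigma_N),\qquad \mathrm{mmse}(X\mid Y)=\mathbb{E}\big[\|X-\mathbb{E}[X\mid Y]\|^{2}\big], \]
with the input law constrained to satisfy $D_{\mathrm{KL}}\big(P_X\,\|\,\mathcal{N}(\mu_G,\Sigma_G)\big)\le\epsilon$ for a Gaussian reference distribution.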
March 11, 2009
Within the framework of linear vector Gaussian channels with arbitrary signaling, closed-form expressions for the Jacobian of the minimum mean square error and Fisher information matrices with respect to arbitrary parameters of the system are calculated in this paper. Capitalizing on prior research where the minimum mean square error and Fisher information matrices were linked to information-theoretic quantities through differentiation, closed-form expressions for the Hessian...
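A representative identity from this line of work, stated here only as background and in our own notation, is the gradient formula for the model $y=Hx+n$ with $n\sim\mathcal{N}(0,I)$ independent of $x$,
\[ \nabla_{H}\, I(x;y) = H\,E,\qquad E=\mathbb{E}\big[(x-\mathbb{E}[x\mid y])(x-\mathbb{E}[x\mid y])^{T}\big], \]
in nats; the Jacobians and Hessians computed in the paper differentiate such MMSE and Fisher information matrices one step further with respect to system parameters.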
June 21, 2021
Mutual information has become increasingly prominent in deep learning research. It has been shown that mutual information can serve as an effective objective function for building robust deep learning models. Most of this work relies on estimation methods to approximate the true mutual information. This technical report delivers an extensive study of the definitions and properties of mutual information. The article then reviews the current drawbacks of ...
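For reference, the quantity that these estimation methods approximate is the standard mutual information,
\[ I(X;Y)=\mathbb{E}_{p(x,y)}\!\left[\log\frac{p(x,y)}{p(x)\,p(y)}\right]=D_{\mathrm{KL}}\big(p(x,y)\,\|\,p(x)p(y)\big)=H(X)-H(X\mid Y), \]
which generally cannot be computed in closed form when the joint density of the network's representations is unknown.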
July 7, 2020
Information theory provides a fundamental framework for the quantification of information flows through channels, formally Markov kernels. However, quantities such as mutual information and conditional mutual information do not necessarily reflect the causal nature of such flows. We argue that this is often the result of conditioning based on sigma algebras that are not associated with the given channels. We propose a version of the (conditional) mutual information based on f...
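The classical definition that serves as the starting point here (the paper's modification of the conditioning is not reproduced in this truncated abstract) is the conditional mutual information
\[ I(X;Y\mid Z)=\mathbb{E}_{Z}\Big[D_{\mathrm{KL}}\big(p(x,y\mid Z)\,\|\,p(x\mid Z)\,p(y\mid Z)\big)\Big]=H(X\mid Z)-H(X\mid Y,Z). \]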
October 2, 2009
In this paper, we establish connections among the fundamental limitations in feedback communication, estimation, and feedback control over Gaussian channels, from a unifying perspective for information, estimation, and control. The optimal feedback communication system over a Gaussian channel necessarily employs the Kalman filter (KF) algorithm, and hence can be transformed into an estimation system and a feedback control system over the same channel. It follows that the informati...
April 27, 2014
Fundamental relations between information and estimation have been established in the literature for the discrete-time Gaussian and Poisson channels. In this work, we demonstrate that such relations hold for a much larger class of observation models. We introduce the natural family of discrete-time Lévy channels where the distribution of the output conditioned on the input is infinitely divisible. For Lévy channels, we establish new representations relating the mutual inf...
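The prototype of such representations is the Gaussian I-MMSE relation of Guo, Shamai and Verdú: for $Y_{\mathrm{snr}}=\sqrt{\mathrm{snr}}\,X+N$ with $N\sim\mathcal{N}(0,1)$ independent of $X$,
\[ \frac{d}{d\,\mathrm{snr}}\, I(X;Y_{\mathrm{snr}})=\frac{1}{2}\,\mathrm{mmse}(\mathrm{snr})=\frac{1}{2}\,\mathbb{E}\big[(X-\mathbb{E}[X\mid Y_{\mathrm{snr}}])^{2}\big] \]
(in nats); the Lévy-channel results announced here generalize relations of this type beyond the Gaussian and Poisson cases.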
March 15, 2024
We derive a general upper bound on mutual information in terms of the Fisher information. The bound may be further used to derive a lower bound on the Bayesian quadratic cost. These two provide alternatives to the Efroimovich and van Trees inequalities that are also useful for classes of prior distributions where the latter give trivial bounds. We illustrate the usefulness of our bounds with a case study in quantum phase estimation. Here, they allow us to adapt to mutu...
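For comparison, the scalar van Trees (Bayesian Cramér-Rao) inequality that the new bounds are meant to complement reads
\[ \mathbb{E}\big[(\hat\theta(X)-\theta)^{2}\big]\;\ge\;\frac{1}{\mathbb{E}_{\pi}[\,\mathcal{I}(\theta)\,]+\mathcal{I}(\pi)},\qquad \mathcal{I}(\theta)=\mathbb{E}\!\left[\big(\partial_{\theta}\log p(X\mid\theta)\big)^{2}\right],\quad \mathcal{I}(\pi)=\mathbb{E}\!\left[\big(\partial_{\theta}\log\pi(\theta)\big)^{2}\right], \]
which becomes uninformative, for instance, when the Fisher information $\mathcal{I}(\pi)$ of the prior is infinite or undefined, one regime in which alternative bounds are needed.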
April 18, 2023
Fundamental limitations or performance trade-offs/limits are important properties and constraints of control and filtering systems. Among various trade-off metrics, total information rate, which characterizes the sensitivity trade-offs and average performance of control and filtering systems, is conventionally studied by using the (differential) entropy rate and the Kolmogorov-Bode formula. In this paper, by extending the famous I-MMSE (mutual information -- minimum mean-square e...
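One classical form of the sensitivity trade-off in question, stated informally and under the usual stability and relative-degree assumptions, is the discrete-time Bode integral: for a stabilized loop with sensitivity function $S$ and unstable open-loop poles $p_i$,
\[ \frac{1}{2\pi}\int_{-\pi}^{\pi}\log\big|S(e^{j\omega})\big|\,d\omega=\sum_{i}\log|p_i| , \]
so reducing sensitivity in one frequency band necessarily amplifies it elsewhere.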
August 12, 2010
We derive an approximate expression for mutual information in a broad class of discrete-time stationary channels with continuous input, under the constraint of vanishing input amplitude or power. The approximation describes the input by its covariance matrix, while the channel properties are described by the Fisher information matrix. This separation of input and channel properties allows us to analyze the optimality conditions in a convenient way. We show that input correlat...
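As a sanity check on the flavor of such approximations, specializing to Gaussian noise recovers a familiar expansion: for $Y=\sqrt{\mathrm{snr}}\,X+N$ with a zero-mean input of covariance $\Sigma_X$ and $N\sim\mathcal{N}(0,I)$, the I-MMSE relation gives
\[ I(X;Y)=\frac{\mathrm{snr}}{2}\,\mathrm{tr}(\Sigma_X)+o(\mathrm{snr}),\qquad \mathrm{snr}\to 0 , \]
so at vanishing power the mutual information depends on the input only through its covariance, consistent with the separation of input covariance and channel properties described above; how the channel's Fisher information matrix enters the general non-Gaussian case is specified in the paper itself.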