July 14, 2001
The mutual information of a single-layer perceptron with $N$ Gaussian inputs and $P$ deterministic binary outputs is studied by numerical simulations. The relevant parameters of the problem are the ratio between the number of output and input units, $\alpha = P/N$, and those describing the two-point correlations between inputs. The main motivation of this work is to compare the replica computation of the mutual information with an analytical solution valid up to $\alpha \sim O(1)$. The main results are: (1) the simulations support the validity of the analytical prediction, and (2) they verify a previously proposed conjecture that the replica solution interpolates well between small and large values of $\alpha$.
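Since the outputs are deterministic functions of the inputs, the mutual information reduces to the output entropy, $I(\sigma;\tau) = H(\tau)$, which is exactly what a direct simulation can estimate. Below is a minimal Monte Carlo sketch of such a setup, not the authors' code; the sign activation, the uniform pairwise input correlation $\rho$, and all parameter values are illustrative assumptions.

```python
import numpy as np

def perceptron_mi(N=12, P=4, rho=0.2, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    # Gaussian inputs with uniform pairwise correlation rho between units.
    cov = (1 - rho) * np.eye(N) + rho * np.ones((N, N))
    sigma = rng.multivariate_normal(np.zeros(N), cov, size=n_samples)
    # Fixed Gaussian synaptic weights, one row per output unit.
    J = rng.standard_normal((P, N)) / np.sqrt(N)
    # Deterministic binary outputs: tau_i = sign(J_i . sigma), coded as 0/1.
    tau = (sigma @ J.T > 0).astype(np.int64)
    # Encode each P-bit output pattern as an integer and estimate H(tau).
    codes = tau @ (1 << np.arange(P))
    freq = np.bincount(codes, minlength=2**P) / n_samples
    freq = freq[freq > 0]
    return -np.sum(freq * np.log2(freq))  # MI in bits (equals H(tau) here)

print(f"estimated MI: {perceptron_mi():.3f} bits (at most P = 4 bits)")
```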
Similar papers
January 15, 1999
We calculate the mutual information (MI) of a two-layered neural network with noiseless, continuous inputs and binary, stochastic outputs under several assumptions on the synaptic efficiencies. The interesting regime corresponds to the limit where the numbers of input and output units are both large but their ratio is kept fixed at a value $\alpha$. We first present a solution for the MI using the replica technique with a replica symmetric (RS) ansatz. Then we find an exact sol...
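For orientation, here is a minimal sketch, an assumption-laden illustration rather than the paper's computation, of the quantity involved in the simplest case of a single stochastic binary output, where $I(\sigma;\tau) = H(\tau) - \langle H(\tau|\sigma)\rangle$; the logistic transfer function and the inverse temperature $\beta$ are assumptions.

```python
import numpy as np

def h2(p):
    """Binary entropy in bits, clipped for numerical safety."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def stochastic_output_mi(N=100, beta=2.0, n_samples=100_000, seed=0):
    rng = np.random.default_rng(seed)
    sigma = rng.standard_normal((n_samples, N))   # continuous, noiseless inputs
    J = rng.standard_normal(N) / np.sqrt(N)       # fixed synaptic efficacies
    p = 1.0 / (1.0 + np.exp(-beta * (sigma @ J))) # P(tau = 1 | sigma)
    # I(sigma; tau) = H(tau) - <H(tau | sigma)>, both from the same samples.
    return h2(p.mean()) - h2(p).mean()

print(f"MI of one stochastic binary output: {stochastic_output_mi():.4f} bits")
```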
February 19, 2002
In a previous report we have evaluated analytically the mutual information between the firing rates of N independent units and a set of continuous+discrete stimuli, for finite N and in the limit of large noise. Here, we extend the analysis to the case of two interconnected populations, where input units are linked to output ones via Gaussian weights and a threshold-linear transfer function. We evaluate the information carried by M output units about continuous+discrete corre...
July 30, 2001
In a previous report we have evaluated analytically the mutual information between the firing rates of N independent units and a set of multi-dimensional continuous+discrete stimuli, for a finite population size and in the limit of large noise. Here, we extend the analysis to the case of two interconnected populations, where input units activate output ones via Gaussian weights and a threshold-linear transfer function. We evaluate the information carried by a population of M ...
July 8, 2016
We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections, a problem relevant in compressed sensing, sparse superposition codes, or code division multiple access, to cite just a few. There have been a number of works considering the mutual information for this problem using the heuristic replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-type interpolatio...
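For orientation (a standard fact, not a statement of this paper's proof), the model in question is a noisy linear observation, and the vector I-MMSE relation of Guo, Shamai and Verdú ties its mutual information to the minimal mean-square error; in generic notation:

$$
\mathbf{y} = \mathbf{A}\mathbf{x} + \mathbf{z}, \qquad
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\big(\mathbf{x};\, \sqrt{\mathrm{snr}}\,\mathbf{A}\mathbf{x} + \mathbf{z}\big)
= \tfrac{1}{2}\, \mathbb{E}\big[\big\| \mathbf{A}\mathbf{x} - \mathbb{E}[\mathbf{A}\mathbf{x} \mid \mathbf{y}] \big\|^{2}\big],
$$

with $\mathbf{z}$ a standard Gaussian noise vector.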
June 13, 2016
Factorizing low-rank matrices has many applications in machine learning and statistics. For probabilistic models in the Bayes-optimal setting, a general expression for the mutual information has been proposed using heuristic statistical physics computations, and proven in a few specific cases. Here, we show how to rigorously prove the conjectured formula for the symmetric rank-one case. This allows one to express the minimal mean-square error and to characterize the detectability p...
July 15, 2002
The performance of a lossy data compression scheme for uniformly biased Boolean messages is investigated via methods of statistical mechanics. Inspired by a formal similarity to the storage capacity problem in neural network research, we utilize a perceptron whose transfer function is appropriately designed to compress and decode the messages. Employing the replica method, we analytically show that our scheme can achieve the optimal performance known i...
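For reference, the "optimal performance" here is presumably the Shannon rate-distortion bound for a biased Boolean (Bernoulli($p$)) source under Hamming distortion, $R(D) = H_2(p) - H_2(D)$; a short textbook-level check, not code from the paper:

```python
import numpy as np

def h2(x):
    """Binary entropy in bits."""
    x = np.clip(x, 1e-12, 1 - 1e-12)
    return -(x * np.log2(x) + (1 - x) * np.log2(1 - x))

def rate_distortion(p, D):
    """Shannon rate-distortion function of a Bernoulli(p) source
    under Hamming distortion, valid for 0 <= D <= min(p, 1 - p)."""
    return float(max(h2(p) - h2(D), 0.0))

# The benchmark a perceptron-based compressor is measured against:
print(f"R(D) = {rate_distortion(0.2, 0.05):.3f} bits/symbol")
```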
January 23, 2003
Recent studies have explored theoretically the ability of populations of neurons to carry information about a set of stimuli, both in the case of purely discrete or purely continuous stimuli, and in the case of multidimensional continuous angular and discrete correlates, in the presence of additional quenched disorder in the distribution. An analytical expression for the mutual information has been obtained in the limit of large noise by means of the replica trick. Here we show t...
May 11, 2007
Using analytical methods of statistical mechanics, we analyse the typical behaviour of a multiple-input multiple-output (MIMO) Gaussian channel with binary inputs under LDPC network coding and joint decoding. The saddle-point equations for the replica symmetric solution are found for particular realizations of this channel, including small and large numbers of transmitters and receivers. In particular, we examine the cases of a single transmitter, a single receiver and the sy...
December 24, 2007
Precise characterization of the mutual information of MIMO systems is required to assess the throughput of wireless communication channels in the presence of Rician fading and spatial correlation. Here, we present an asymptotic approach that approximates the distribution of the mutual information as a Gaussian distribution, providing both the average achievable rate and the outage probability. More precisely, the mean and variance of the mutual information of t...
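As a hedged numerical illustration of this kind of Gaussian approximation, in the simplest i.i.d. Rayleigh case (without the Rician component or spatial correlation treated in the paper; all parameter values are arbitrary), one can sample the channel mutual information and fit a Gaussian to obtain the mean rate and an outage estimate:

```python
import numpy as np
from math import erf, sqrt

def mimo_mi_samples(n_t=4, n_r=4, snr=10.0, n_trials=20_000, seed=0):
    """Sample I = log2 det(I + (snr/n_t) H H^H) over random channels."""
    rng = np.random.default_rng(seed)
    mi = np.empty(n_trials)
    for k in range(n_trials):
        # Complex Gaussian channel matrix with unit-variance entries.
        H = (rng.standard_normal((n_r, n_t))
             + 1j * rng.standard_normal((n_r, n_t))) / sqrt(2)
        G = np.eye(n_r) + (snr / n_t) * (H @ H.conj().T)
        mi[k] = np.linalg.slogdet(G)[1] / np.log(2)
    return mi

mi = mimo_mi_samples()
mu, sd = mi.mean(), mi.std()
print(f"mean rate ~ {mu:.2f} bit/s/Hz, std ~ {sd:.2f}")
# Outage probability at a target rate R from the fitted Gaussian:
R = 8.0
p_out = 0.5 * (1 + erf((R - mu) / (sd * sqrt(2))))
print(f"Gaussian-approx outage P(I < {R}) ~ {p_out:.3f}")
```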
January 20, 2017
We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code division multiple access. There have been a number of works considering the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli type...