February 19, 2002
In a previous report we evaluated analytically the mutual information between the firing rates of N independent units and a set of continuous+discrete stimuli, for finite N and in the limit of large noise. Here we extend the analysis to the case of two interconnected populations, where input units are linked to output units via Gaussian weights and a threshold-linear transfer function. We evaluate the information carried by M output units about the continuous+discrete correlates by means of saddle-point equations under the assumption of replica symmetry. Within this limitation, we analyze the dependence of the information on the ratio M/N, on the selectivity of the input units, and on the level of the output noise. We show analytically, and confirm numerically, that in the limit of a linear transfer function and of a small ratio between output and input noise, the output information approaches asymptotically the information carried by the input layer. Finally, we show that the information loss in the output depends only weakly on the structure of the stimuli, and mainly on the position of the threshold nonlinearity and on the ratio between input and output noise.
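As a concrete illustration of the asymptotic claim in the linear limit, here is a minimal numerical sketch: with a linear transfer function (the threshold dropped) and Gaussian noise at both layers, the mutual information of each layer has a closed log-det form, and the output information approaches the input information as the output-to-input noise ratio shrinks. All sizes, noise levels, and the identity signal covariance are illustrative assumptions, not the paper's actual parameters.

    # Minimal sketch: linear two-layer Gaussian channel (threshold dropped).
    import numpy as np

    rng = np.random.default_rng(0)
    N, M = 50, 100                    # input / output population sizes (assumed)
    sigma_in, sigma_out = 1.0, 0.05   # input and output noise levels (assumed)

    # Gaussian feedforward weights, variance 1/N
    W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(M, N))

    # Signal covariance of the noiseless input rates (illustrative choice)
    C_s = np.eye(N)

    def gaussian_mi(C_signal, C_noise):
        """I = 0.5 * log2 det(I + C_noise^{-1} C_signal), Gaussian channel."""
        A = np.linalg.solve(C_noise, C_signal)
        _, logdet = np.linalg.slogdet(np.eye(len(A)) + A)
        return 0.5 * logdet / np.log(2.0)

    # Information in the input layer: signal against input noise
    I_in = gaussian_mi(C_s, sigma_in**2 * np.eye(N))

    # The output layer sees W @ (signal + input noise) + output noise
    I_out = gaussian_mi(W @ C_s @ W.T,
                        sigma_in**2 * (W @ W.T) + sigma_out**2 * np.eye(M))

    print(f"I_in  = {I_in:.2f} bits")
    print(f"I_out = {I_out:.2f} bits  (-> I_in as sigma_out/sigma_in -> 0)")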
Similar papers
July 30, 2001
In a previous report we evaluated analytically the mutual information between the firing rates of N independent units and a set of multi-dimensional continuous+discrete stimuli, for a finite population size and in the limit of large noise. Here, we extend the analysis to the case of two interconnected populations, where input units activate output ones via Gaussian weights and a threshold-linear transfer function. We evaluate the information carried by a population of M ...
January 23, 2003
Recent studies have explored theoretically the ability of populations of neurons to carry information about a set of stimuli, both in the case of purely discrete or purely continuous stimuli and in the case of multidimensional continuous angular and discrete correlates, in the presence of additional quenched disorder in the distribution. An analytical expression for the mutual information has been obtained in the limit of large noise by means of the replica trick. Here we show t...
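For readers unfamiliar with it, the replica trick mentioned here rests on the standard identity (written schematically; which partition function Z is averaged, and over which quenched variables, is specific to the paper's model):

\[
\langle \ln Z \rangle \;=\; \lim_{n\to 0}\,\frac{\langle Z^{\,n}\rangle - 1}{n},
\]

where \(\langle Z^{\,n}\rangle\) is computed for integer n and analytically continued to n \to 0; the replica-symmetric ansatz assumes the dominant saddle point is invariant under permutations of the n replicas.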
July 14, 2001
The mutual information of a single-layer perceptron with $N$ Gaussian inputs and $P$ deterministic binary outputs is studied by numerical simulations. The relevant parameters of the problem are the ratio between the numbers of output and input units, $\alpha = P/N$, and those describing the two-point correlations between inputs. The main motivation of this work is the comparison between the replica computation of the mutual information and an analytical solution valid u...
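Because the outputs here are deterministic functions of the inputs, the mutual information reduces to the output entropy, I(x; s) = H(s), which can be estimated by sampling when P is small. A minimal sketch along those lines (the sizes, the coupling scaling, and the correlation structure C_ij = rho + (1-rho)*delta_ij are illustrative assumptions, not the paper's exact setup):

    # Minimal sketch: MI of a deterministic binary perceptron = output entropy.
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(1)
    N, P = 20, 5          # alpha = P/N = 0.25 (assumed values)
    rho = 0.3             # two-point input correlation (assumed form)

    # Correlated Gaussian inputs: C_ij = rho + (1 - rho) * delta_ij
    C = np.full((N, N), rho) + (1.0 - rho) * np.eye(N)
    L = np.linalg.cholesky(C)
    J = rng.normal(size=(P, N)) / np.sqrt(N)   # random couplings

    n_samples = 200_000
    x = rng.normal(size=(n_samples, N)) @ L.T
    s = (x @ J.T > 0).astype(np.uint8)         # deterministic binary outputs

    # Empirical entropy of the P-bit output words equals I(x; s) here
    counts = Counter(map(tuple, s))
    q = np.array(list(counts.values())) / n_samples
    H = -np.sum(q * np.log2(q))
    print(f"I(x; s) ~= H(s) = {H:.3f} bits (upper bound P = {P} bits)")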
January 15, 1999
We calculate the mutual information (MI) of a two-layered neural network with noiseless, continuous inputs and binary, stochastic outputs under several assumptions on the synaptic efficiencies. The interesting regime corresponds to the limit where the number of both input and output units is large but their ratio is kept fixed at a value $\alpha$. We first present a solution for the MI using the replica technique with a replica symmetric (RS) ansatz. Then we find an exact sol...
March 13, 2001
In a recent study the initial rise of the mutual information between the firing rates of N neurons and a set of p discrete stimuli has been evaluated analytically, under the assumption that neurons fire independently of one another in response to each stimulus and that each conditional distribution of firing rates is Gaussian. Yet real stimuli or behavioural correlates are high-dimensional, with both discrete and continuously varying features. Moreover, the Gaussian approximation implies ...
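In the notation typically used for this problem (the symbols below are schematic, not the paper's), the quantity under study is

\[
I(\mathbf{r}; s) \;=\; \sum_{s=1}^{p} P(s)\int d^{N}r\; p(\mathbf{r}\mid s)\,\log_2\frac{p(\mathbf{r}\mid s)}{\sum_{s'}P(s')\,p(\mathbf{r}\mid s')},
\qquad
p(\mathbf{r}\mid s)\;=\;\prod_{i=1}^{N}\frac{1}{\sqrt{2\pi\sigma^{2}}}\;e^{-(r_i-\eta_i(s))^{2}/2\sigma^{2}},
\]

where the factorized conditional distribution encodes the independence assumption and the Gaussian form is the approximation that this abstract sets out to relax.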
October 18, 2000
We evaluate the mutual information between the input and the output of a two-layer network in the case of a noisy and non-linear analogue channel. When the non-linearity is small with respect to the variability in the noise, we derive an exact expression for the contribution to the mutual information given by the non-linear term, at first order in perturbation theory. Finally, we show how the calculation can be simplified by means of a diagrammatic expansion. Our r...
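Schematically (the notation below is assumed for illustration, not taken from the paper), the expansion treats a channel whose output is a linear response h plus a weak nonlinearity of strength \epsilon,

\[
\xi \;=\; h + \epsilon\, g(h) + \nu,
\qquad
I_\epsilon(\xi;\eta) \;=\; I_0 + \epsilon\, I_1 + \epsilon^{2} I_2 + \dots,
\]

with \nu the noise; I_0 is the exactly solvable linear Gaussian-channel term, and the leading correction is the first-order contribution the abstract refers to.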
March 1, 2002
In a recent work we introduced a novel approach to studying the effect of weak non-linearities in the transfer function on the information transmitted by an analogue channel, by means of a perturbative diagrammatic expansion. Here we extend the analysis to all orders in perturbation theory, which allows us to relax any constraint on the magnitude of the expansion parameter and to establish the rules for easily calculating the contribution at any order. As an example ...
September 19, 1997
The information that a pattern of firing in the output layer of a feedforward network of threshold-linear neurons conveys about the network's inputs is considered. A replica-symmetric solution is found to be stable for all but small amounts of noise. The region of instability depends on the contribution of the threshold and the sparseness: for distributed pattern distributions, the unstable region extends to higher noise variances than for very sparse distributions, for which...
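The threshold-linear transfer function in question is

\[
g(h) \;=\; [\,h-\theta\,]_{+} \;=\;
\begin{cases}
h-\theta, & h>\theta,\\
0, & h\le\theta,
\end{cases}
\]

so that raising the threshold \theta silences more units and makes the output distribution sparser; this is the sense in which threshold and sparseness enter the stability result quoted above.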
June 13, 2016
Factorizing low-rank matrices has many applications in machine learning and statistics. For probabilistic models in the Bayes-optimal setting, a general expression for the mutual information has been proposed using heuristic statistical-physics computations, and proven in a few specific cases. Here, we show how to rigorously prove the conjectured formula for the symmetric rank-one case. This allows one to express the minimal mean-square error and to characterize the detectability p...
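A common formulation of the symmetric rank-one case (assumed here for concreteness; the paper's exact scaling conventions may differ) is the spiked Wigner model

\[
Y \;=\; \sqrt{\frac{\lambda}{n}}\,\mathbf{x}\mathbf{x}^{\mathsf{T}} + W,
\qquad
x_i \stackrel{\text{i.i.d.}}{\sim} P_0,
\quad
W_{ij}=W_{ji}\sim\mathcal{N}(0,1),
\]

for which the conjectured formula expresses the asymptotic mutual information per variable, \lim_{n\to\infty} I(\mathbf{X};Y)/n, as the extremum of a scalar replica-symmetric potential.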
January 31, 2002
The capacity with which a system of independent neuron-like units represents a given set of stimuli is studied by calculating the mutual information between the stimuli and the neural responses. Both discrete noiseless and continuous noisy neurons are analyzed. In both cases, the information grows monotonically with the number of neurons considered. Under the assumption that neurons are independent, the mutual information rises linearly from zero, and approaches exponentially...
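A minimal toy sketch of the discrete noiseless case (the binary random tuning and equiprobable stimuli are illustrative assumptions): because the responses are deterministic, I(s; r) = H(r), which grows with the number of neurons considered and saturates at log2(p).

    # Minimal sketch: MI of N noiseless binary neurons about p stimuli.
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(2)
    p, N_max = 16, 40                              # assumed sizes
    tuning = rng.integers(0, 2, size=(N_max, p))   # r_i(s) in {0, 1}

    for n in (1, 2, 4, 8, 16, 32, 40):
        # Response word evoked by each stimulus across the first n neurons
        words = Counter(map(tuple, tuning[:n].T))
        q = np.array(list(words.values())) / p
        I = -np.sum(q * np.log2(q))                # I(s; r) = H(r)
        print(f"N = {n:2d}: I = {I:.3f} bits (ceiling log2 p = {np.log2(p):.3f})")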