Similar papers
December 1, 2023
We explore a few common models of how correlations affect information. The main model considered is the Shannon mutual information $I(S:R_1,\cdots, R_i)$ over distributions with the marginals $P_{S,R_i}$ fixed for each $i$, with the analogy in which $S$ is the stimulus and the $R_i$'s are neurons. We work out basic models in detail, using algebro-geometric tools to write down discriminants that separate distributions with distinct qualitative behaviours in the probability simplex in...
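As a rough illustration of fixing the marginals $P_{S,R_i}$ while varying the correlations, here is a minimal sketch (binary stimulus, two noisy-copy neurons; the fidelity $q$ and the whole construction are my assumptions, not the paper's): two joints with identical stimulus-response marginals but different noise correlations yield different $I(S:R_1,R_2)$.

```python
import numpy as np
from itertools import product

def mutual_info(joint):
    # I(S; R1, R2) in bits for a joint array p[s, r1, r2]
    p_s = joint.sum(axis=(1, 2))
    p_r = joint.sum(axis=0)          # joint marginal p(r1, r2)
    mi = 0.0
    for s, r1, r2 in product(range(2), repeat=3):
        p = joint[s, r1, r2]
        if p > 0:
            mi += p * np.log2(p / (p_s[s] * p_r[r1, r2]))
    return mi

q = 0.8  # assumed fidelity of each neuron's noisy copy of the stimulus

# Case 1: R1, R2 conditionally independent given S
indep = np.zeros((2, 2, 2))
for s, r1, r2 in product(range(2), repeat=3):
    f = lambda r: q if r == s else 1 - q
    indep[s, r1, r2] = 0.5 * f(r1) * f(r2)

# Case 2: R1 = R2 (maximal noise correlation), same marginals P(S, R_i)
corr = np.zeros((2, 2, 2))
for s, r in product(range(2), repeat=2):
    corr[s, r, r] = 0.5 * (q if r == s else 1 - q)

print(mutual_info(indep), mutual_info(corr))  # the two cases differ
```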
October 23, 2021
Biological neural systems encode and transmit information as patterns of activity tracing complex trajectories in high-dimensional state-spaces, inspiring alternative paradigms of information processing. Heteroclinic networks, naturally emerging in artificial neural systems, are networks of saddles in state-space that provide a transparent approach to generating complex trajectories via controlled switches among interconnected saddles. External signals induce specific switching...
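A standard toy realisation of such a saddle network (not from this paper) is the May-Leonard generalised Lotka-Volterra system; the sketch below, with illustrative parameters, integrates it and records which saddle the trajectory is dwelling near, making the switching sequence visible.

```python
import numpy as np

# May-Leonard system: dx_i/dt = x_i (1 - x_i - a*x_{i+1} - b*x_{i-1}).
# With 0 < a < 1 < b and a + b > 2, trajectories approach a heteroclinic
# cycle among the saddles e1, e2, e3, dwelling near each saddle for
# progressively longer times. Parameters are illustrative.
a, b = 0.8, 1.3

def rhs(x):
    growth = 1.0 - x - a * np.roll(x, -1) - b * np.roll(x, 1)
    return x * growth

x = np.array([0.6, 0.2, 0.1])
dt, steps = 0.01, 100_000
dominant = []
for t in range(steps):
    x = x + dt * rhs(x)                      # forward-Euler integration
    if t % 1000 == 0:
        dominant.append(int(np.argmax(x)))   # which saddle we are near
print(dominant)  # runs of 0/1/2 trace the switching sequence
```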
January 8, 2015
Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units which in many cases allow direct comparison between differe...
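A toy check of the "sensitive to any relationship, not only linear effects" claim (my example, not the paper's): with $X$ uniform on $\{-1,0,1\}$ and $Y = X^2$, the covariance vanishes while the mutual information does not.

```python
import numpy as np

# X uniform on {-1, 0, 1}, Y = X**2: a purely nonlinear dependence that
# the correlation coefficient misses but mutual information detects.
joint = {(-1, 1): 1/3, (0, 0): 1/3, (1, 1): 1/3}
p_x = {-1: 1/3, 0: 1/3, 1: 1/3}
p_y = {0: 1/3, 1: 2/3}

ex = sum(p * x for (x, _), p in joint.items())
ey = sum(p * y for (_, y), p in joint.items())
cov = sum(p * (x - ex) * (y - ey) for (x, y), p in joint.items())
mi = sum(p * np.log2(p / (p_x[x] * p_y[y])) for (x, y), p in joint.items())

print(f"covariance = {cov:.3f}, mutual information = {mi:.3f} bits")
# covariance = 0.000, mutual information = 0.918 bits
```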
April 1, 2001
The information processing abilities of a multilayer neural network with a number of hidden units scaling as the input dimension are studied using statistical mechanics methods. The mapping from the input layer to the hidden units is performed by general symmetric Boolean functions whereas the hidden layer is connected to the output by either discrete or continuous couplings. Introducing an overlap in the space of Boolean functions as an order parameter, the storage capacity is f...
July 15, 2009
We study the characteristics of weak signal detection by a recurrent neuronal network with plastic synaptic coupling. It is shown that in the presence of an asynchronous component in synaptic transmission, the network acquires selectivity with respect to the frequency of weak periodic stimuli. For non-periodic frequency-modulated stimuli, the response is quantified by the mutual information between input (signal) and output (network's activity), and is optimized by synaptic d...
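To make the quantification concrete, here is a minimal plug-in (histogram) estimate of the mutual information between paired input and output samples; the Gaussian test signal and the bin count are assumptions for illustration, not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(0)

def plugin_mi(x, y, bins=8):
    """Plug-in MI estimate in bits from paired samples (biased upward)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

# Stand-in channel: noisy transmission of a Gaussian signal
signal = rng.standard_normal(50_000)
output = signal + 0.5 * rng.standard_normal(50_000)
print(plugin_mi(signal, output), "bits (true value: 0.5*log2(5) ~ 1.16)")
```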
April 15, 2020
We consider the problem of estimating a rank-one nonsymmetric matrix under additive white Gaussian noise. The matrix to estimate can be written as the outer product of two vectors and we look at the special case in which both vectors are uniformly distributed on spheres. We prove a replica-symmetric formula for the average mutual information between these vectors and the observations in the high-dimensional regime. This goes beyond previous results which considered vectors wi...
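A quick numerical sketch of the observation model (my parametrisation of the signal-to-noise scaling and the spherical normalisation, not necessarily the paper's exact conventions): generate the spiked matrix and check the overlap of the top singular vectors with the planted spike.

```python
import numpy as np

# Observe Y = sqrt(snr/n) * u v^T + Z with u, v uniform on spheres of
# radius sqrt(n), sqrt(m) and Z i.i.d. standard Gaussian noise.
rng = np.random.default_rng(1)
n, m, snr = 400, 300, 16.0

u = rng.standard_normal(n); u *= np.sqrt(n) / np.linalg.norm(u)
v = rng.standard_normal(m); v *= np.sqrt(m) / np.linalg.norm(v)
Y = np.sqrt(snr / n) * np.outer(u, v) + rng.standard_normal((n, m))

U, s, Vt = np.linalg.svd(Y, full_matrices=False)
overlap_u = abs(U[:, 0] @ u) / np.linalg.norm(u)
overlap_v = abs(Vt[0] @ v) / np.linalg.norm(v)
print(overlap_u, overlap_v)  # near 1 well above the spectral threshold
```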
July 27, 2008
Modeling communication channels as thermal systems results in Hamiltonians which are an explicit function of the temperature. The first two authors have recently generalized the second law of thermodynamics to encompass systems with temperature-dependent energy levels, $dQ = T\,dS + \langle d\mathcal{E}/dT\rangle\, dT$, where $\langle\cdot\rangle$ denotes averaging over the Boltzmann distribution, recomputing the mutual information and other main properties of the popular Gaussian channel. Here the mutual infor...
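One hedged reading of the stated law, with the averaging made explicit (the notation and units with $k_B = 1$ are my assumptions): for Boltzmann weights over temperature-dependent levels $E_i(T)$,
\[
  p_i(T) = \frac{e^{-E_i(T)/T}}{Z(T)}, \qquad
  \Big\langle \frac{d\mathcal{E}}{dT} \Big\rangle = \sum_i p_i(T)\,\frac{dE_i(T)}{dT},
\]
so that
\[
  dQ = T\,dS + \Big\langle \frac{d\mathcal{E}}{dT} \Big\rangle dT,
\]
i.e. the extra term charges the heat budget for the explicit temperature dependence of the energy levels themselves.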
September 22, 1998
Here we analyze synaptic transmission from an information-theoretic perspective. We derive closed-form expressions for the lower bounds on the capacity of a simple model of a cortical synapse under two explicit coding paradigms. Under the ``signal estimation'' paradigm, we assume the signal to be encoded in the mean firing rate of a Poisson neuron. The performance of an optimal linear estimator of the signal then provides a lower bound on the capacity for signal estimation. U...
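A simplified numerical version of the signal-estimation idea (my per-sample Gaussian form $-\tfrac{1}{2}\log_2(1-\rho^2)$ stands in for the paper's closed-form spectral expressions): rate-modulate a Poisson neuron, reconstruct the signal with the optimal linear estimator, and bound the information from the residual correlation.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 100_000

# Slow Gaussian signal modulates the firing rate (mean ~20 spikes/bin)
signal = rng.standard_normal(T)
rate = 20.0 * np.exp(0.3 * signal - 0.045)
counts = rng.poisson(rate)

# Optimal linear (least-squares) estimate of the signal from the counts
c = np.cov(signal, counts)
est = c[0, 1] / c[1, 1] * (counts - counts.mean())

rho = np.corrcoef(signal, est)[0, 1]
print(-0.5 * np.log2(1 - rho**2),
      "bits/bin (lower bound under a Gaussian assumption)")
```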
May 26, 2003
The time evolution of an exactly solvable layered feedforward neural network with three-state neurons that optimizes the mutual information is studied for arbitrary synaptic noise (temperature). Detailed stationary temperature-capacity and capacity-activity phase diagrams are obtained. The model exhibits pattern retrieval, pattern-fluctuation retrieval and spin-glass phases. It is found that there is an improved performance in the form of both a larger critical capacity and i...
March 16, 2007
One of the fundamental characteristics of a nonlinear system is how it transfers correlations in its inputs to correlations in its outputs. This is particularly important in the nervous system, where correlations between spiking neurons are prominent. Using linear response and asymptotic methods for pairs of unconnected integrate-and-fire (IF) neurons receiving white noise inputs, we show that this correlation transfer depends on the output spike firing rate in a strong, ster...
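A sketch of the setup (parameters illustrative, not the paper's): two unconnected leaky IF neurons share a fraction $c$ of their white-noise input; comparing spike-count correlations at subthreshold versus suprathreshold drive shows how correlation transfer depends on the output firing rate.

```python
import numpy as np

rng = np.random.default_rng(3)

def lif_pair_count_corr(mu, c=0.3, sigma=1.0, dt=0.1, tau=20.0,
                        vth=1.0, vreset=0.0, T=400_000, win=500):
    """Spike-count correlation of two unconnected LIF neurons whose
    white-noise inputs share a fraction c of their variance."""
    v = np.zeros(2)
    counts = np.zeros((T // win, 2))
    for t in range(T):
        shared = rng.standard_normal()
        noise = np.sqrt(c) * shared + np.sqrt(1 - c) * rng.standard_normal(2)
        v += dt / tau * (mu - v) + sigma * np.sqrt(dt / tau) * noise
        spiked = v >= vth
        counts[t // win] += spiked
        v[spiked] = vreset
    rate = counts.mean() / (win * dt)        # spikes per ms
    rho = np.corrcoef(counts.T)[0, 1]        # count correlation per window
    return rate, rho

for mu in (0.8, 1.2):  # subthreshold vs. suprathreshold drive
    rate, rho = lif_pair_count_corr(mu)
    print(f"mu={mu}: rate={1000*rate:.1f} Hz, count corr={rho:.3f}")
```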