Similar papers
February 1, 2002
We study the capacity with which a system of independent neuron-like units represents a given set of stimuli. We assume that each neuron provides a fixed amount of information, and that the information provided by different neurons has a random overlap. We derive analytically the dependence of the mutual information between the set of stimuli and the neural responses on the number of units sampled. For a large set of stimuli, the mutual information rises linearly with the num...
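As a concrete complement to this abstract, here is a minimal numerical sketch, not the paper's analytical derivation, of how overlapping information caps what a population conveys. It assumes deterministic binary units, each responding to a random half of M equiprobable stimuli, so that I(S;R) reduces to the entropy of the partition the response vectors induce on the stimulus set; M, N_max and the tuning matrix are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 1024             # number of equiprobable stimuli
N_max = 30           # largest population size sampled

# Each neuron responds (1) to a random half of the stimuli: a fixed amount
# of information per neuron, with random overlap across neurons.
tuning = rng.integers(0, 2, size=(N_max, M))

def mutual_info(responses):
    """I(S;R) in bits for deterministic neurons: the entropy of the
    partition that the response vectors induce on the stimulus set."""
    _, counts = np.unique(responses.T, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

for n in range(1, N_max + 1):
    # rises roughly linearly, then saturates at log2(M) = 10 bits
    print(n, round(mutual_info(tuning[:n]), 2))
```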
September 21, 2004
Motivated by recent studies of population coding in theoretical neuroscience, we examine the optimality of a recently described form of stochastic resonance known as suprathreshold stochastic resonance, which occurs in populations of noisy threshold devices such as models of sensory neurons. Using the mutual information measure, it is shown numerically that for a random input signal, the optimal threshold distribution contains singularities. For large enough noise, this distr...
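A minimal numerical sketch of suprathreshold stochastic resonance under the standard assumptions (a Gaussian input signal and N identical threshold devices with independent Gaussian noise, the output being the count of devices that fire); it reproduces the characteristic peak of mutual information at nonzero noise, not the paper's singular optimal threshold distributions. N, theta and the noise levels are illustrative.

```python
import numpy as np
from scipy.stats import norm, binom

N = 15                        # identical threshold devices
theta = 0.0                   # all thresholds at the signal mean
xs = np.linspace(-5, 5, 801)
px = norm.pdf(xs)
px /= px.sum()                # discretized standard-Gaussian signal

def ssr_mutual_info(sigma):
    """I(X; n) in bits, where n counts the devices with x + noise > theta."""
    p_fire = norm.sf((theta - xs) / sigma)             # P(a device fires | x)
    ns = np.arange(N + 1)
    pnx = binom.pmf(ns[None, :], N, p_fire[:, None])   # P(n | x), binomial
    pn = px @ pnx                                      # marginal P(n)
    h_n = -np.sum(pn * np.log2(pn, where=pn > 0, out=np.zeros_like(pn)))
    h_n_x = -np.sum(px[:, None] * pnx *
                    np.log2(pnx, where=pnx > 0, out=np.zeros_like(pnx)))
    return h_n - h_n_x

for sigma in (0.1, 0.3, 0.6, 1.0, 2.0):
    print(sigma, round(ssr_mutual_info(sigma), 3))     # peaks at intermediate noise
```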
July 15, 2011
This paper introduces several fundamental concepts in information theory from the perspective of their origins in engineering. Understanding such concepts is important in neuroscience for two reasons. Simply applying formulae from information theory without understanding the assumptions behind their definitions can lead to erroneous results and conclusions. Furthermore, this century will see a convergence of information theory and neuroscience; information theory will expand ...
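To make one of these basic definitions concrete, a short sketch computing the capacity of a binary symmetric channel, C = 1 - H_b(eps), which is exactly the kind of formula whose assumptions (discrete memoryless channel, known crossover probability) matter in the way the abstract warns about; the value of eps is an arbitrary example.

```python
import numpy as np

def h_bits(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Binary symmetric channel with crossover probability eps:
# capacity C = 1 - H_b(eps), achieved by a uniform input distribution.
eps = 0.1
print(round(1.0 - h_bits([eps, 1 - eps]), 4))   # ~0.531 bits per channel use
```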
April 25, 2019
We investigated interactions within chimera states in a phase oscillator network with two coupled subpopulations. To quantify interactions within and between these subpopulations, we estimated the corresponding (delayed) mutual information, which in general quantifies the capacity, i.e. the maximum rate at which a sender's information can be recovered at the receiver with vanishingly low error probability. After verifying their equivalence with est...
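The paper relies on its own estimators; as a generic illustration of the quantity involved, a plug-in (histogram-based) delayed mutual information estimator for two scalar time series looks roughly like this, with the lag, the bin count and the toy signals being assumptions of the example.

```python
import numpy as np

def delayed_mutual_info(x, y, lag, bins=16):
    """Plug-in estimate (bits) of I(x_t ; y_{t+lag}) from a 2-D histogram."""
    if lag > 0:
        a, b = x[:-lag], y[lag:]
    elif lag < 0:
        a, b = x[-lag:], y[:lag]
    else:
        a, b = x, y
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

# toy usage: y is a noisy copy of x delayed by 5 samples, so MI peaks at lag 5
rng = np.random.default_rng(1)
x = rng.standard_normal(10_000)
y = np.roll(x, 5) + 0.5 * rng.standard_normal(10_000)
print(max(range(10), key=lambda L: delayed_mutual_info(x, y, L)))  # -> 5
```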
December 7, 1999
A self-control mechanism for the dynamics of a three-state fully-connected neural network is studied through the introduction of a time-dependent threshold. The self-adapting threshold is a function of both the neural and the pattern activity in the network. The time evolution of the order parameters is obtained on the basis of a recently developed dynamical recursive scheme. In the limit of low activity the mutual information is shown to be the relevant parameter in order to...
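A hedged toy sketch of this kind of self-adapting threshold, for a ternary {-1, 0, +1} fully connected network with low-activity Hebbian patterns; the particular rule theta_t = c * sqrt(a * q_t), with q_t the instantaneous neural activity, is an illustrative stand-in for the paper's self-control scheme, and all parameters are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
N, P, a, c = 2000, 20, 0.1, 2.0   # units, patterns, pattern activity, threshold gain

# low-activity ternary patterns xi in {-1, 0, +1} with P(xi != 0) = a
xi = rng.choice([-1, 0, 1], size=(P, N), p=[a / 2, 1 - a, a / 2])
J = (xi.T @ xi).astype(float) / (a * N)   # Hebbian couplings
np.fill_diagonal(J, 0.0)

sigma = xi[0] * (rng.random(N) > 0.1)     # pattern 0 with 10% of sites zeroed
for t in range(10):
    q = np.mean(sigma.astype(float) ** 2)        # neural activity
    theta = c * np.sqrt(a * q)                   # self-adapting threshold (illustrative)
    h = J @ sigma
    sigma = np.sign(h) * (np.abs(h) > theta)     # three-state update
    m = (xi[0] @ sigma) / (a * N)                # overlap with the stored pattern
    print(t, round(float(theta), 3), round(float(m), 3))
```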
January 8, 2015
Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units which in many cases allow direct comparison between differe...
January 7, 2020
We consider a channel with a binary input X being corrupted by a continuous-valued noise that results in a continuous-valued output Y. An optimal binary quantizer is used to quantize the continuous-valued output Y to the final binary output Z to maximize the mutual information I(X; Z). We show that when the ratio of the channel conditional densities r(y) = P(Y = y | X = 0) / P(Y = y | X = 1) is a strictly increasing/decreasing function of y, then a quantizer having a single threshold can m...
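To make the setting concrete, a small sketch for one standard instance, a binary input mapped to means -1 and +1 with unit-variance Gaussian noise, where the likelihood ratio r(y) is monotone and a single threshold therefore suffices; the prior p0 and the search grid are assumptions of the example, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

p0 = 0.5                                   # P(X = 0); illustrative prior

def i_xz(t):
    """I(X; Z) in bits for the single-threshold quantizer Z = [Y > t]."""
    q0 = norm.sf(t + 1.0)                  # P(Z = 1 | X = 0), mean -1
    q1 = norm.sf(t - 1.0)                  # P(Z = 1 | X = 1), mean +1
    pz1 = p0 * q0 + (1 - p0) * q1

    def h(p):                              # binary entropy in bits
        return 0.0 if p <= 0 or p >= 1 else -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    return h(pz1) - p0 * h(q0) - (1 - p0) * h(q1)

ts = np.linspace(-3, 3, 601)
best = ts[np.argmax([i_xz(t) for t in ts])]
print(round(float(best), 3))               # -> 0.0 for this symmetric channel
```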
September 30, 1999
The inclusion of a threshold in the dynamics of layered neural networks with variable activity is studied at arbitrary temperature. In particular, the effects on the retrieval quality of a self-controlled threshold, obtained by forcing the neural activity to stay equal to the activity of the stored patterns during the whole retrieval process, are compared with those of a threshold chosen externally for every loading and every temperature through optimisation of the mutual infor...
April 26, 2023
Continuous attractor neural networks (CANNs) form an appealing conceptual model for the storage of information in the brain. However, a drawback of CANNs is that they require finely tuned interactions. Here we study the effect of quenched noise in the interactions on the coding of positional information within CANNs. Using the replica method, we compute the Fisher information for a network with position-dependent input and recurrent connections composed of a short-range (in space)...
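The replica computation itself is analytic; as a simple point of reference, the standard feedforward baseline, the Fisher information J(x) = sum_i f_i'(x)^2 / f_i(x) of independent Poisson units with Gaussian tuning curves, takes only a few lines. N, width and rate are illustrative, and this is not the paper's recurrent-network calculation.

```python
import numpy as np

N, width, rate = 64, 0.1, 20.0
centers = np.linspace(0.0, 1.0, N, endpoint=False)   # preferred positions

def fisher_info(x):
    """J(x) for independent Poisson neurons with Gaussian tuning curves."""
    d = x - centers
    f = rate * np.exp(-d ** 2 / (2 * width ** 2))    # tuning curves f_i(x)
    # f'(x)^2 / f(x) simplifies to (d / width^2)^2 * f(x) for Gaussian tuning
    return float(np.sum((d / width ** 2) ** 2 * f))

print(round(fisher_info(0.5), 1))
```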
April 8, 2013
Given the constant rise in the quantity and quality of data obtained from neural systems on many scales, ranging from the molecular to the systems level, information-theoretic analyses have become increasingly necessary in the neurosciences over the past few decades. Such analyses can provide deep insights into the functionality of such systems, as well as a rigorous mathematical theory and quantitative measures of information processing in both healthy and diseased states of neural systems. This ch...