ID: cond-mat/0202315

Replica symmetric evaluation of the information transfer in a two-layer network in presence of continuous and discrete stimuli

February 19, 2002


Similar papers (page 3)

Asymptotic Mutual Information Statistics of Separately-Correlated Rician Fading MIMO Channels

December 24, 2007

80% Match
Giorgio Taricco
Information Theory

Precise characterization of the mutual information of MIMO systems is required to assess the throughput of wireless communication channels in the presence of Rician fading and spatial correlation. Here, we present an asymptotic approach that allows us to approximate the distribution of the mutual information as a Gaussian distribution in order to provide both the average achievable rate and the outage probability. More precisely, the mean and variance of the mutual information of t...
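
To make the use of such a Gaussian approximation concrete: once the asymptotic mean and variance of the mutual information are known (the quantities the paper derives; the values below are placeholders), the outage probability at a target rate follows from the Gaussian CDF. A minimal sketch in Python:

import math

def outage_probability(mu, sigma, rate):
    """P(I < rate) under the Gaussian approximation I ~ N(mu, sigma^2)."""
    return 0.5 * (1.0 + math.erf((rate - mu) / (sigma * math.sqrt(2.0))))

# Illustrative numbers only; the paper derives mu and sigma asymptotically
# from the Rician fading and spatial-correlation parameters.
print(outage_probability(mu=12.0, sigma=1.5, rate=10.0))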


Additivity of Information in Multilayer Networks via Additive Gaussian Noise Transforms

October 12, 2017

80% Match
Galen Reeves
Information Theory
Machine Learning

Multilayer (or deep) networks are powerful probabilistic models based on multiple stages of a linear transform followed by a non-linear (possibly random) function. In general, the linear transforms are defined by matrices and the non-linear functions are defined by information channels. These models have gained great popularity due to their ability to characterize complex probabilistic relationships arising in a wide variety of inference problems. The contribution of this pap...
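
As a minimal illustration of this model class (the tanh nonlinearity, the Gaussian channel, and the dimensions are assumptions for the sketch, not the paper's choices): each stage applies a fixed matrix followed by a noisy componentwise function.

import numpy as np

rng = np.random.default_rng(0)

def stage(x, W, noise_std=0.1):
    """One layer: linear transform, then a nonlinearity seen through an additive Gaussian channel."""
    return np.tanh(W @ x) + noise_std * rng.normal(size=W.shape[0])

# Two-stage (deep) model: x -> W1, nonlinearity -> W2, nonlinearity
x = rng.normal(size=50)
W1 = rng.normal(size=(30, 50)) / np.sqrt(50)
W2 = rng.normal(size=(10, 30)) / np.sqrt(30)
y = stage(stage(x, W1), W2)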


The Geometry of Information Coding in Correlated Neural Populations

February 1, 2021

80% Match
Rava Azeredo da Silveira, Fred Rieke
Neurons and Cognition

Neurons in the brain represent information in their collective activity. The fidelity of this neural population code depends on whether and how variability in the response of one neuron is shared with other neurons. Two decades of studies have investigated the influence of these noise correlations on the properties of neural coding. We provide an overview of the theoretical developments on the topic. Using simple, qualitative and general arguments, we discuss, categorize, and...
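
A standard quantity in this literature is the linear Fisher information, through which noise correlations enter via the covariance matrix. A hedged sketch (uniform correlations and flat tuning derivatives are illustrative choices, not the review's model):

import numpy as np

def linear_fisher_information(df, cov):
    """I_F = f'(s)^T C^{-1} f'(s): linear Fisher information at stimulus s."""
    return df @ np.linalg.solve(cov, df)

n = 100
df = np.ones(n)                                  # tuning-curve derivatives
c = 0.1                                          # uniform noise correlation
cov = (1 - c) * np.eye(n) + c * np.ones((n, n))
print(linear_fisher_information(df, cov))        # shared noise caps the information near 1/c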


Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation

January 20, 2017

80% Match
Jean Barbier, Nicolas Macris, ..., Florent Krzakala
Information Theory
Disordered Systems and Neural Networks
Mathematical Physics

We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code division multiple access. There have been a number of works considering the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli type...
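
For orientation, a simplified version of the approximate message-passing iteration analyzed here, with a soft-threshold denoiser and a residual-based threshold rule (both are common illustrative choices, not necessarily the paper's):

import numpy as np

def soft(u, t):
    """Componentwise soft threshold."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def amp(A, y, alpha=1.5, n_iter=30):
    """AMP for y = A x + noise, with A an m x n matrix of i.i.d. N(0, 1/m) entries."""
    m, n = A.shape
    x, z = np.zeros(n), y.copy()
    for _ in range(n_iter):
        pseudo = x + A.T @ z                      # effective AWGN observation of x
        tau = alpha * np.sqrt(np.mean(z ** 2))    # threshold from residual energy
        x = soft(pseudo, tau)
        # Onsager correction keeps the effective noise Gaussian across iterations
        z = y - A @ x + z * (n / m) * np.mean(np.abs(pseudo) > tau)
    return x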


Adaptive thresholds for layered neural networks with synaptic noise

May 24, 2006

80% Match
D. Bolle, R. Heylen
Disordered Systems and Neural Networks
Statistical Mechanics

The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of layered feedforward neural network models with synaptic noise. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of...
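
A rough sketch of the self-control rule (the low-activity form theta(t) = c(a) * Delta(t) with c(a) = sqrt(-2 ln a) appears in this line of work, but treat the exact form here as an assumption for illustration):

import math

def self_control_threshold(cross_talk_std, activity):
    """Adaptive threshold theta(t) = c(a) * Delta(t): c(a) = sqrt(-2 ln a) for
    pattern activity a, Delta(t) the width of the cross-talk noise at step t."""
    return math.sqrt(-2.0 * math.log(activity)) * cross_talk_std

# Illustrative: activity a = 0.01, cross-talk width Delta(t) = 0.3
print(self_control_threshold(0.3, 0.01))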


Information-Theoretic Bounds and Approximations in Neural Population Coding

November 4, 2016

80% Match
Wentao Huang, Kechen Zhang
Information Theory
Machine Learning

While Shannon's mutual information has widespread applications in many disciplines, for practical applications it is often difficult to calculate its value accurately for high-dimensional variables because of the curse of dimensionality. This paper is focused on effective approximation methods for evaluating mutual information in the context of neural population coding. For large but finite neural populations, we derive several information-theoretic asymptotic bounds and appr...
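
The flavor of these approximations is the classical Fisher-information asymptotic for large populations (stated here for orientation, not as the paper's exact bound):

I(X;\Theta) \;\approx\; H(\Theta) + \frac{1}{2}\,\mathbb{E}_{\theta}\!\left[ \ln \det \frac{J(\theta)}{2\pi e} \right],

where J(\theta) is the Fisher information matrix of the population response and H(\Theta) the stimulus entropy.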


Adaptive thresholds for neural networks with synaptic noise

August 2, 2007

80% Match
D. Bolle, R. Heylen
Disordered Systems and Neural Networks
Statistical Mechanics

The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require a different method to be solved numerically. In both cases it is shown that, if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall pr...


Information transmission in a two-step cascade: Interplay of activation and repression

September 27, 2021

80% Match
Tuhin Subhra Roy, Mintu Nandi, Ayan Biswas, ..., Suman K Banik
Molecular Networks

We present an information-theoretic formalism to study signal transduction in four architectural variants of a model two-step cascade with increasing input population. Our results categorize these four types into two classes depending upon the effect played out by activation and repression on mutual information, net synergy, and signal-to-noise ratio. Within the Gaussian framework and using the linear noise approximation, we derive the analytic expressions for these metrics t...
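
Within the Gaussian framework such metrics reduce to log-determinants of covariance blocks. A minimal sketch (the covariance values are placeholders, not the paper's cascade model):

import numpy as np

def gaussian_mi(cov, a, b):
    """I(A;B) for jointly Gaussian blocks: 0.5 * ln(det C_AA * det C_BB / det C_AB)."""
    d = lambda idx: np.linalg.det(cov[np.ix_(idx, idx)])
    return 0.5 * np.log(d(a) * d(b) / d(a + b))

# Variables: s (input signal), x and y (cascade outputs); illustrative covariance
C = np.array([[1.0, 0.6, 0.5],
              [0.6, 1.0, 0.4],
              [0.5, 0.4, 1.0]])
i_sx, i_sy = gaussian_mi(C, [0], [1]), gaussian_mi(C, [0], [2])
i_sxy = gaussian_mi(C, [0], [1, 2])
net_synergy = i_sxy - i_sx - i_sy     # positive: synergy; negative: redundancy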


Methods for Estimating Neural Information

December 20, 2022

80% Match
James V Stone
Neurons and Cognition
Information Theory

Estimating the Shannon information associated with individual neurons is a non-trivial problem. Three key methods used to estimate the mutual information between neuron inputs and outputs are described, and a list of further readings is provided.
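
A common baseline in this setting is the naive plug-in (histogram) estimator, sketched below; it is positively biased at finite sample sizes, which is what corrected methods address.

import numpy as np

def plugin_mi(x, y, bins=10):
    """Naive plug-in estimate of I(X;Y) in bits from binned samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = x + rng.normal(size=5000)          # a dependent input-output pair
print(plugin_mi(x, y))                 # true value is 0.5 bits here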


Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems

November 11, 2014

80% Match
Adam B. Barrett
Information Theory
Neurons and Cognition

To fully characterize the information that two `source' variables carry about a third `target' variable, one must decompose the total information into redundant, unique and synergistic components, i.e. obtain a partial information decomposition (PID). However, Shannon's theory of information does not provide formulae to fully determine these quantities. Several recent studies have begun addressing this. Some possible definitions for PID quantities have been proposed, and some ...
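
One of the proposed definitions examined in this line of work is minimum-mutual-information (MMI) redundancy, from which the remaining PID terms follow by subtraction; for jointly Gaussian variables everything reduces to log-determinants. A sketch with an illustrative covariance:

import numpy as np

def gaussian_mi(cov, a, b):
    """I(A;B) for jointly Gaussian blocks, via log-determinants."""
    d = lambda idx: np.linalg.det(cov[np.ix_(idx, idx)])
    return 0.5 * np.log(d(a) * d(b) / d(a + b))

# Target t and sources x, y; covariance values are placeholders
C = np.array([[1.0, 0.7, 0.3],
              [0.7, 1.0, 0.2],
              [0.3, 0.2, 1.0]])
i_tx, i_ty = gaussian_mi(C, [0], [1]), gaussian_mi(C, [0], [2])
i_txy = gaussian_mi(C, [0], [1, 2])

redundancy = min(i_tx, i_ty)                  # MMI redundancy definition
unique_x, unique_y = i_tx - redundancy, i_ty - redundancy
synergy = i_txy - redundancy - unique_x - unique_y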
