ID: cond-mat/0202315

Replica symmetric evaluation of the information transfer in a two-layer network in presence of continuous and discrete stimuli

February 19, 2002


Similar papers

How do correlations shape the landscape of information?

December 1, 2023

79% Match
Ching-Peng Huang
Information Theory
Algebraic Geometry
Biological Physics
Neurons and Cognition

We explore a few common models of how correlations affect information. The main model considered is the Shannon mutual information $I(S:R_1,\cdots, R_i)$ over distributions with marginals $P_{S,R_i}$ fixed for each $i$, with the analogy in which $S$ is the stimulus and the $R_i$'s are neurons. We work out basic models in detail, using algebro-geometric tools to write down discriminants that separate distributions with distinct qualitative behaviours in the probability simplex in...
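The quantity at the heart of this abstract, the Shannon mutual information between a stimulus and its responses, can be computed directly for a small discrete joint distribution. A minimal sketch (the 2x2 distribution below is hypothetical, not from the paper):

```python
import numpy as np

def mutual_information(p_joint):
    """Shannon mutual information I(S;R) in bits from a joint
    distribution p_joint[s, r] over stimulus S and response R."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_s = p_joint.sum(axis=1, keepdims=True)   # stimulus marginal
    p_r = p_joint.sum(axis=0, keepdims=True)   # response marginal
    mask = p_joint > 0                          # skip zero-probability cells
    return float((p_joint[mask] *
                  np.log2(p_joint[mask] / (p_s @ p_r)[mask])).sum())

# Illustrative 2-stimulus, 2-response joint distribution:
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(round(mutual_information(p), 3))  # → 0.278
```

For a product distribution (independent S and R) the same function returns 0, which is the qualitative boundary the discriminants in the abstract carve out of the probability simplex.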


Stochastic facilitation in heteroclinic communication channels

October 23, 2021

79% Match
Giovanni Sirio Carmantini, Fabio Schittler Neves, ... , Serafim Rodrigues
Neurons and Cognition
Neural and Evolutionary Comp...

Biological neural systems encode and transmit information as patterns of activity tracing complex trajectories in high-dimensional state-spaces, inspiring alternative paradigms of information processing. Heteroclinic networks, naturally emerging in artificial neural systems, are networks of saddles in state-space that provide a transparent approach to generate complex trajectories via controlled switches among interconnected saddles. External signals induce specific switching...


Estimating Information-Theoretic Quantities

January 8, 2015

79% Match
Robin A. A. Ince, Simon R. Schultz, Stefano Panzeri
Neurons and Cognition

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units which in many cases allow direct comparison between differe...
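A central practical issue in estimating such quantities from neural data is limited-sampling bias: the naive plug-in entropy estimator is biased downward, and the classical Miller-Madow correction adds back $(K-1)/(2N\ln 2)$ bits. A toy sketch of that idea (standard textbook estimators, not the specific methods surveyed in the paper):

```python
import numpy as np
from collections import Counter

def plugin_entropy(samples):
    """Naive (plug-in) entropy estimate in bits from discrete samples."""
    counts = np.array(list(Counter(samples).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the first-order Miller-Madow bias
    correction (K - 1) / (2 N ln 2), K = number of observed symbols."""
    k = len(set(samples))
    n = len(samples)
    return plugin_entropy(samples) + (k - 1) / (2 * n * np.log(2))

rng = np.random.default_rng(0)
data = rng.integers(0, 8, size=200)   # 200 samples, uniform over 8 symbols
print(plugin_entropy(data) <= miller_madow_entropy(data))
```

The correction is always non-negative, so the corrected estimate sits closer to the true entropy (here at most 3 bits) than the raw plug-in value when symbols are undersampled.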


Multilayer neural networks with extensively many hidden units

April 1, 2001

79% Match
Michal Rosen-Zvi, Andreas Engel, Ido Kanter
Statistical Mechanics
Disordered Systems and Neura...

The information processing abilities of a multilayer neural network with a number of hidden units scaling as the input dimension are studied using statistical mechanics methods. The mapping from the input layer to the hidden units is performed by general symmetric Boolean functions whereas the hidden layer is connected to the output by either discrete or continuous couplings. Introducing an overlap in the space of Boolean functions as order parameter, the storage capacity is f...


Signal processing in local neuronal circuits based on activity-dependent noise and competition

July 15, 2009

79% Match
Vladislav Volman, Herbert Levine
Neurons and Cognition

We study the characteristics of weak signal detection by a recurrent neuronal network with plastic synaptic coupling. It is shown that in the presence of an asynchronous component in synaptic transmission, the network acquires selectivity with respect to the frequency of weak periodic stimuli. For non-periodic frequency-modulated stimuli, the response is quantified by the mutual information between input (signal) and output (network's activity), and is optimized by synaptic d...


High-dimensional rank-one nonsymmetric matrix decomposition: the spherical case

April 15, 2020

79% Match
Clément Luneau, Nicolas Macris, Jean Barbier
Probability
Information Theory
Mathematical Physics

We consider the problem of estimating a rank-one nonsymmetric matrix under additive white Gaussian noise. The matrix to estimate can be written as the outer product of two vectors and we look at the special case in which both vectors are uniformly distributed on spheres. We prove a replica-symmetric formula for the average mutual information between these vectors and the observations in the high-dimensional regime. This goes beyond previous results which considered vectors wi...
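The estimation problem described here can be mocked up numerically: plant a rank-one spike in Gaussian noise and recover the planted vectors from the leading singular vectors. The dimensions and signal strength below are arbitrary illustrative choices, not parameters from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, snr = 500, 400, 3.0            # illustrative sizes and signal strength

# Ground-truth spike: two unit vectors, uniform on their spheres.
u = rng.standard_normal(n); u /= np.linalg.norm(u)
v = rng.standard_normal(m); v /= np.linalg.norm(v)

# Observation: rank-one signal plus additive white Gaussian noise,
# scaled so the signal stands above the noise singular-value bulk.
Y = snr * np.sqrt(n) * np.outer(u, v) + rng.standard_normal((n, m))

# Leading singular vectors as estimates of (u, v); overlap near 1
# means near-perfect recovery, overlap near 0 means failure.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
overlap_u = abs(U[:, 0] @ u)
overlap_v = abs(Vt[0] @ v)
print(overlap_u > 0.85 and overlap_v > 0.85)
```

The replica-symmetric formula of the paper characterizes exactly how the mutual information, and hence the achievable overlap, depends on the signal-to-noise ratio in the high-dimensional limit; the sketch only probes one point of that curve.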


Carnot in the Information Age: Discrete Symmetric Channels

July 27, 2008

79% Match
Ido Kanter, Ori Shental, ... , Nadav Yacov
Statistical Mechanics
Information Theory

Modeling communication channels as thermal systems results in Hamiltonians which are an explicit function of the temperature. The first two authors have recently generalized the second law of thermodynamics to encompass systems with temperature-dependent energy levels, $dQ = T\,dS + \langle d\mathcal{E}/dT\rangle\,dT$, where $\langle\cdot\rangle$ denotes averaging over the Boltzmann distribution, recomputing the mutual information and other main properties of the popular Gaussian channel. Here the mutual infor...
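For the simplest discrete symmetric channel, the binary symmetric channel with flip probability $p$, the mutual information at a uniform input reduces to the closed form $C = 1 - H_2(p)$. A quick check of that standard textbook formula (this is background, not the temperature-dependent treatment of the paper):

```python
import numpy as np

def h2(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with flip probability p,
    achieved by a uniform input: C = 1 - H2(p)."""
    return 1.0 - h2(p)

print(round(bsc_capacity(0.1), 3))  # → 0.531
```

At $p = 1/2$ the output is independent of the input and the capacity vanishes; at $p = 0$ the channel is noiseless and carries one full bit per use.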


Synaptic Transmission: An Information-Theoretic Perspective

September 22, 1998

79% Match
A. Manwani, C. Koch (California Institute of Technology, Pasadena)
Neurons and Cognition

Here we analyze synaptic transmission from an information-theoretic perspective. We derive closed-form expressions for the lower-bounds on the capacity of a simple model of a cortical synapse under two explicit coding paradigms. Under the "signal estimation" paradigm, we assume the signal to be encoded in the mean firing rate of a Poisson neuron. The performance of an optimal linear estimator of the signal then provides a lower bound on the capacity for signal estimation. U...


A layered neural network with three-state neurons optimizing the mutual information

May 26, 2003

79% Match
D. Bolle, R. Erichsen Jr., W. K. Theumann
Disordered Systems and Neura...
Statistical Mechanics

The time evolution of an exactly solvable layered feedforward neural network with three-state neurons and optimizing the mutual information is studied for arbitrary synaptic noise (temperature). Detailed stationary temperature-capacity and capacity-activity phase diagrams are obtained. The model exhibits pattern retrieval, pattern-fluctuation retrieval and spin-glass phases. It is found that there is an improved performance in the form of both a larger critical capacity and i...


Universal properties of correlation transfer in integrate-and-fire neurons

March 16, 2007

79% Match
Eric Shea-Brown, Kresimir Josic, ... , Brent Doiron
Neurons and Cognition
Disordered Systems and Neura...

One of the fundamental characteristics of a nonlinear system is how it transfers correlations in its inputs to correlations in its outputs. This is particularly important in the nervous system, where correlations between spiking neurons are prominent. Using linear response and asymptotic methods for pairs of unconnected integrate-and-fire (IF) neurons receiving white noise inputs, we show that this correlation transfer depends on the output spike firing rate in a strong, ster...
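The correlation-transfer setup can be sketched with a bare-bones simulation: two unconnected leaky integrate-and-fire neurons driven by partially shared white noise, with output similarity measured as spike-count correlation. All parameters below are illustrative choices, not values from the paper:

```python
import numpy as np

def lif_spikes(mu, noise, dt=2e-4, tau=0.02, v_th=1.0, v_reset=0.0):
    """0/1 spike train of a leaky integrate-and-fire neuron driven by a
    constant mean input mu plus a white-noise trace (Euler-Maruyama)."""
    v = 0.0
    out = np.zeros(len(noise), dtype=int)
    sqdt = np.sqrt(dt)
    for i, xi in enumerate(noise):
        v += dt / tau * (mu - v) + xi * sqdt
        if v >= v_th:               # threshold crossing: spike and reset
            v = v_reset
            out[i] = 1
    return out

def count_correlation(mu, c=0.3, sigma=0.5, T=30.0, dt=2e-4, win=0.05):
    """Spike-count correlation of two unconnected LIF neurons that share
    a fraction c of their input-noise variance (illustrative parameters)."""
    rng = np.random.default_rng(3)
    n = int(T / dt)
    shared = np.sqrt(c) * sigma * rng.standard_normal(n)
    s1 = lif_spikes(mu, shared + np.sqrt(1 - c) * sigma * rng.standard_normal(n), dt=dt)
    s2 = lif_spikes(mu, shared + np.sqrt(1 - c) * sigma * rng.standard_normal(n), dt=dt)
    k = int(win / dt)               # spike-count window in bins
    c1 = s1[: n // k * k].reshape(-1, k).sum(axis=1)
    c2 = s2[: n // k * k].reshape(-1, k).sum(axis=1)
    return float(np.corrcoef(c1, c2)[0, 1])

rho = count_correlation(mu=1.2)
print(-1.0 <= rho <= 1.0)
```

Sweeping the mean drive `mu` changes the output firing rate and, per the paper's result, the measured count correlation; the sketch fixes a single drive for brevity.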
