January 19, 2004
July 29, 2005
The retrieval abilities of spatially uniform attractor networks can be measured by the average overlap between patterns and neural states. We found, however, that metric networks with local connections can carry information structured in blocks, without any global overlap, in the form of block attractors. We propose a way to measure this block information, related to the fluctuations of the overlap. The phase diagram, with the transition from local to global information, shows that the st...
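The global overlap mentioned in this abstract is the standard order parameter of attractor networks, m = (1/N) Σ_i ξ_i s_i. A minimal sketch (with illustrative values, not taken from the paper) of how it is computed for a binary ±1 network:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                      # number of binary (+/-1) neurons
xi = rng.choice([-1, 1], N)   # a stored pattern

# A network state partially aligned with the pattern:
# flip 10% of the pattern's spins to emulate retrieval noise.
s = xi.copy()
flip = rng.choice(N, N // 10, replace=False)
s[flip] *= -1

# Global overlap (order parameter): m = (1/N) * sum_i xi_i * s_i
m = float(xi @ s) / N
print(round(m, 2))  # -> 0.8 after flipping 10% of the spins
```

The block information measure proposed in the paper, based on fluctuations of this overlap, is not reproduced here; the sketch only illustrates the baseline quantity that vanishes for block-structured retrieval.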
July 26, 1999
This contribution reviews the parallel dynamics of Q-Ising neural networks for various architectures: extremely diluted asymmetric, layered feedforward, extremely diluted symmetric, and fully connected. Using a probabilistic signal-to-noise ratio analysis that takes into account all feedback correlations, which depend strongly on the architecture, the evolution of the distribution of the local field is found. This leads to a recursive scheme determining the complete t...
November 28, 2019
We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4. The latter is known to be able to Hebbian-store a number of patterns scaling as N^{P-1}, where N denotes the number of constituent binary neurons with P-wise interactions. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing for a number of ...
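The dense associative memory of order P referenced here is commonly written with energy E(s) = -Σ_μ (ξ^μ · s)^P (the Krotov-Hopfield form). A hedged sketch, with hypothetical sizes chosen to stay far below the N^{P-1} saturation regime, showing greedy energy-descent retrieval:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 4                  # neurons; interaction order P=4 as in the abstract
M = 50                         # stored patterns, well below N^{P-1}
XI = rng.choice([-1, 1], (M, N))

def energy(s):
    # Dense-memory energy: E(s) = -sum_mu (xi_mu . s)^P
    return -float(np.sum((XI @ s).astype(float) ** P))

# Start from a 10%-corrupted copy of pattern 0 and run greedy spin flips.
s = XI[0].copy()
s[rng.choice(N, N // 10, replace=False)] *= -1
for _ in range(3):                       # a few sweeps suffice at low load
    for i in range(N):
        e_before = energy(s)
        s[i] *= -1                       # try flipping spin i
        if energy(s) > e_before:         # energy increased: undo the flip
            s[i] *= -1

overlap = float(XI[0] @ s) / N
print(overlap)  # overlap with the clean pattern; approaches 1 at low load
```

Far below saturation the signal term (ξ^0 · s)^P dominates the cross-talk, so the greedy dynamics restores the stored pattern; the duality with contrastive-divergence features claimed in the paper is not modeled here.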
February 12, 2016
Learning in neural networks poses peculiar challenges when using discretized rather than continuous synaptic states. The choice of discrete synapses is motivated by biological reasoning and experiments, and possibly by hardware implementation considerations as well. In this paper we extend a previous large deviations analysis which unveiled the existence of peculiar dense regions in the space of synaptic states which account for the possibility of learning efficiently in net...
March 27, 2005
In this paper we show that, during the retrieval process in a binary symmetric Hebb neural network, spatially localized states can be observed when the connectivity of the network is distance-dependent and a constraint on the activity of the network is imposed, forcing different levels of activity in the retrieval and learning states. This asymmetry in the activity during retrieval and learning is found to be a sufficient condition for observing spatially localiz...
April 16, 2006
We studied autoassociative networks in which synapses are noisy on a time scale much shorter than that of the neuron dynamics. In our model, presynaptic noise causes postsynaptic depression, as recently observed in neurobiological systems. This results in a nonequilibrium condition in which the network's sensitivity to an external stimulus is enhanced. In particular, the fixed points are qualitatively modified, and the system may easily escape from the attractors. As a resul...
February 13, 2012
The problem of neural network association is to retrieve a previously memorized pattern from its noisy version using a network of neurons. An ideal neural network should include three components simultaneously: a learning algorithm, a large pattern retrieval capacity and resilience against noise. Prior works in this area usually improve one or two aspects at the cost of the third. Our work takes a step forward in closing this gap. More specifically, we show that by forcing ...
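The association problem described here, retrieving a memorized pattern from a noisy version, is classically solved by Hebbian learning with sign-threshold dynamics. A minimal sketch under illustrative parameters (this is the standard Hopfield baseline, not the paper's own method):

```python
import numpy as np

rng = np.random.default_rng(3)
N, M = 500, 10                       # neurons and stored patterns (low load)
XI = rng.choice([-1, 1], (M, N))

# Hebbian weight matrix with zero diagonal: W_ij = (1/N) sum_mu xi_i xi_j
W = (XI.T @ XI).astype(float) / N
np.fill_diagonal(W, 0.0)

# Corrupt a stored pattern with 15% flipped bits, then iterate
# synchronous sign updates until the state stops changing.
s = XI[0].copy()
s[rng.choice(N, int(0.15 * N), replace=False)] *= -1
for _ in range(20):
    new = np.sign(W @ s).astype(int)
    new[new == 0] = 1                # break ties deterministically
    if np.array_equal(new, s):
        break
    s = new

overlap = float(XI[0] @ s) / N
print(overlap)  # overlap with the clean pattern
```

At this load (M/N = 0.02, far below the ~0.138 capacity of the Hebb rule) retrieval from 15% noise succeeds; the paper's contribution is precisely to improve on the capacity/noise/learnability trade-off that this baseline exhibits.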
December 2, 2015
Threshold-linear networks are a common class of firing rate models that describe recurrent interactions among neurons. Unlike their linear counterparts, these networks generically possess multiple stable fixed points (steady states), making them viable candidates for memory encoding and retrieval. In this work, we characterize stable fixed points of general threshold-linear networks with constant external drive, and discover constraints on the co-existence of fixed points inv...
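The fixed points characterized in this abstract satisfy x* = [W x* + b]_+, where [·]_+ is the threshold-linear (ReLU) nonlinearity and b is the constant external drive. A small sketch with hypothetical weights, finding a stable fixed point by forward iteration (convergent here because the weights are contractive):

```python
import numpy as np

# Threshold-linear rate dynamics: dx/dt = -x + [W x + b]_+ .
# A fixed point satisfies x* = [W x* + b]_+ .
W = np.array([[0.0, -0.4],
              [-0.4, 0.0]])   # mutual inhibition (illustrative weights)
b = np.array([1.0, 0.5])      # constant external drive

x = np.zeros(2)
for _ in range(200):
    x = np.maximum(W @ x + b, 0.0)   # forward iteration to a fixed point

# Verify the fixed-point condition.
print(np.allclose(x, np.maximum(W @ x + b, 0.0)))  # -> True
```

With both units above threshold, the fixed point solves the linear system x = W x + b; the paper's general constraints govern which combinations of such fixed points can co-exist across different support sets.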
September 19, 1997
The information that a pattern of firing in the output layer of a feedforward network of threshold-linear neurons conveys about the network's inputs is considered. A replica-symmetric solution is found to be stable for all but small amounts of noise. The region of instability depends on the contribution of the threshold and the sparseness: for distributed pattern distributions, the unstable region extends to higher noise variances than for very sparse distributions, for which...
June 28, 2016
We analyse the possible dynamical states emerging for two symmetrically pulse coupled populations of leaky integrate-and-fire neurons. In particular, we observe broken symmetry states in this set-up: namely, breathing chimeras, where one population is fully synchronized and the other is in a state of partial synchronization (PS) as well as generalized chimera states, where both populations are in PS, but with different levels of synchronization. Symmetric macroscopic states a...
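The building block of this setup is the pulse-coupled leaky integrate-and-fire neuron, dv/dt = a - v plus instantaneous pulses at the other neurons' spike times. A minimal single-population sketch with illustrative parameters (the paper's two-population chimera setup is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100                     # neurons, all-to-all pulse coupling
a, g = 1.3, 0.2             # suprathreshold drive (a > 1) and coupling strength
dt, steps = 1e-3, 20000     # Euler step and horizon (illustrative values)

v = rng.uniform(0.0, 1.0, N)   # membrane potentials, threshold at 1, reset to 0
spike_counts = np.zeros(N)

for _ in range(steps):
    fired = v >= 1.0
    v[fired] = 0.0                    # reset neurons that crossed threshold
    spike_counts += fired
    v += dt * (a - v)                 # leaky integration of dv/dt = a - v
    v += (g / N) * fired.sum()        # instantaneous excitatory pulses

rate = spike_counts.mean() / (steps * dt)
print(rate)  # mean firing rate; sustained since a > 1
```

In the paper, two such populations are coupled symmetrically with distinct intra- and inter-population strengths, and the degree of synchronization of each population distinguishes the breathing and generalized chimera states.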