January 19, 2004
March 24, 2004
We study extremely diluted spin models of neural networks in which the connectivity evolves in time, although adiabatically slowly compared to the neurons, according to stochastic equations which on average aim to reduce frustration. The (fast) neurons and (slow) connectivity variables equilibrate separately, but at different temperatures. Our model is exactly solvable in equilibrium. We obtain phase diagrams upon making the condensed ansatz (i.e. recall of one pattern). Thes...
August 26, 1994
Using an asymmetric associative network with synchronous updating, it is possible to recall a sequence of patterns. To obtain a stable sequence generation with a large storage capacity, we introduce a threshold that eliminates the contribution of weakly correlated patterns. For this system we find a set of evolution equations for the overlaps of the states with the patterns to be recognized. We solve these equations in the limit of the stationary cycle, and obtain the critica...
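The basic mechanism described above, sequence recall through asymmetric Hebbian couplings under synchronous updating, can be illustrated with a minimal sketch (this is an assumption-laden toy, without the paper's threshold mechanism; the sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 500, 3
xi = rng.choice([-1, 1], size=(p, N))   # random binary patterns

# asymmetric "sequence" couplings: pattern mu drives pattern mu+1 (cyclically)
W = sum(np.outer(xi[(mu + 1) % p], xi[mu]) for mu in range(p)) / N

s = xi[0].astype(float)                 # start the network on pattern 0
for _ in range(2):                      # two synchronous (parallel) updates
    s = np.sign(W @ s)

overlaps = xi @ s / N                   # overlap of the state with each pattern
print(overlaps)                         # after two steps the state sits on pattern 2
```

Each synchronous update pushes the state from pattern mu toward pattern mu+1; the threshold introduced in the paper suppresses the crosstalk from weakly correlated patterns that limits the storage capacity of this bare version.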
November 9, 2005
A stationary state replica analysis for a dual neural network model that interpolates between a fully recurrent symmetric attractor network and a strictly feed-forward layered network, studied by Coolen and Viana, is extended in this work to account for finite dilution of the recurrent Hebbian interactions between binary Ising units within each layer. Gradual dilution is found to suppress part of the phase transitions that arise from the competition between recurrent and feed...
August 11, 2009
The synchronous dynamics and the stationary states of a recurrent attractor neural network model with synapses competing between symmetric sequence processing and Hebbian pattern reconstruction are studied in this work, allowing for the presence of a self-interaction for each unit. Phase diagrams of stationary states are obtained, exhibiting phases of retrieval, symmetric and period-two cyclic states, as well as correlated and frozen-in states, in the absence of noise. The frozen...
February 10, 2017
Exactly solvable neural network models with asymmetric weights are rare, and exact solutions are available only in some mean-field approaches. In this article we find exact analytical solutions of an asymmetric spin-glass-like model of arbitrary size and we perform a complete study of its dynamical and statistical properties. The network has discrete-time evolution equations, binary firing rates and can be driven by noise with any distribution. We find analytical expressions ...
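The class of model described above, discrete-time evolution, binary firing rates, asymmetric couplings, stochastic updates, can be simulated directly; the following sketch uses Glauber-type parallel dynamics with Gaussian asymmetric couplings (the coupling statistics, noise form, and parameters are illustrative assumptions, not the paper's exact model):

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 100, 50
# asymmetric random couplings: J[i, j] independent of J[j, i]
J = rng.normal(0, 1 / np.sqrt(N), size=(N, N))
np.fill_diagonal(J, 0)

beta = 2.0                                   # inverse noise level
s = rng.choice([-1, 1], size=N).astype(float)
traj = np.empty((T, N))
for t in range(T):
    h = J @ s                                # local fields
    p_up = 1 / (1 + np.exp(-2 * beta * h))   # Glauber flip probability for s_i = +1
    s = np.where(rng.random(N) < p_up, 1.0, -1.0)  # parallel stochastic update
    traj[t] = s
```

Because the weights are asymmetric there is no Hamiltonian and no detailed balance, which is precisely why exact solutions for such dynamics are rare.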
July 6, 2004
We investigate the properties of an autoassociative network of threshold-linear units whose synaptic connectivity is spatially structured and asymmetric. Since the methods of equilibrium statistical mechanics cannot be applied to such a network due to the lack of a Hamiltonian, we approach the problem through a signal-to-noise analysis, that we adapt to spatially organized networks. The conditions are analyzed for the appearance of stable, spatially non-uniform profiles of ac...
May 12, 2005
The thermodynamic and retrieval properties of the Blume-Emery-Griffiths neural network with synchronous updating and variable dilution are studied using replica mean-field theory. Several forms of dilution are allowed by pruning the different types of couplings present in the Hamiltonian. The appearance and properties of two-cycles are discussed. Capacity-temperature phase diagrams are derived for several values of the pattern activity. The results are compared with those for...
November 13, 2006
We extend the statistical neurodynamics to study the transient dynamics of sequence-processing neural networks with finite dilution; the theoretical results are supported by extensive numerical simulations. We find that the order-parameter equations are completely equivalent to those of the Generating Functional Method, which means that the crosstalk noise is normally distributed even when the retrieval process fails. In order to verify the Gaussian assumption of...
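The Gaussian-crosstalk assumption discussed above can be checked numerically in a much simpler setting than the diluted sequence-processing network of the paper; the sketch below measures the crosstalk part of the local fields in a fully connected Hopfield-style network condensed on one pattern (N, p, and the seed are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 1000, 100                 # load alpha = p / N = 0.1
xi = rng.choice([-1, 1], size=(p, N))
s = xi[0]                        # condense the network onto pattern 0

m = xi[1:] @ s / N               # overlaps with the p - 1 non-retrieved patterns
noise = xi[1:].T @ m             # crosstalk contribution to each local field

# under the Gaussian assumption: mean ~ 0, variance ~ alpha = (p - 1) / N
print(noise.mean(), noise.var())
```

The empirical variance comes out close to the load alpha, as the signal-to-noise picture predicts; the point of the Generating Functional comparison in the paper is that this normality survives finite dilution and even failed retrieval.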
December 6, 2001
Starting from the mutual information, we present a method to find a Hamiltonian for a fully connected neural network model with an arbitrary, finite number of neuron states, Q. For small initial correlations between the neurons and the patterns it leads to optimal retrieval performance. For binary neurons, Q=2, and biased patterns we recover the Hopfield model. For three-state neurons, Q=3, we recover the recently introduced Blume-Emery-Griffiths network Hamiltonian...
January 27, 2012
Short-term synaptic depression and facilitation have been found to greatly influence the performance of autoassociative neural networks. However, only partial results, focused for instance on the computation of the maximum storage capacity at zero temperature, have been obtained to date. In this work, we extended the study of the effect of these synaptic mechanisms on autoassociative neural networks to more realistic and general conditions, including the presence of noise in ...
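The short-term depression and facilitation mechanisms referred to above are commonly modeled with Tsodyks-Markram-style resource and release variables; the sketch below shows how the effective synaptic strength depresses under a regular spike train (parameter values and the update ordering are illustrative assumptions, not the paper's specific model):

```python
import numpy as np

# Tsodyks-Markram-style variables (illustrative parameters, times in ms):
# x = fraction of available resources, u = utilization (release probability)
tau_rec, tau_fac, U = 200.0, 600.0, 0.2
x, u = 1.0, U
spikes = np.arange(0.0, 500.0, 25.0)   # regular presynaptic spike train

efficacy = []
t_last = 0.0
for t in spikes:
    dt = t - t_last
    x = 1 - (1 - x) * np.exp(-dt / tau_rec)   # resources recover between spikes
    u = U + (u - U) * np.exp(-dt / tau_fac)   # facilitation decays between spikes
    u = u + U * (1 - u)                       # facilitation jump at the spike
    efficacy.append(u * x)                    # effective synaptic strength u * x
    x = x * (1 - u)                           # resources consumed by release
    t_last = t
```

With these parameters depression dominates, so the efficacy decays over the train; it is this activity-dependent weakening and strengthening of the Hebbian couplings that modifies the storage and retrieval properties studied in the paper.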