January 19, 2004
Similar papers 2
December 17, 1999
The mutual information, I, of the three-state neural network can be obtained exactly for the mean-field architecture, as a function of three macroscopic parameters: the overlap, the neural activity and the {\em activity-overlap}, i.e. the overlap restricted to the active neurons. We perform an expansion of I in the overlap and the activity-overlap, around their values for neurons almost independent of the patterns. From this expansion we obtain an expression for a Hamiltonian...
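As a hedged aside, the three macroscopic parameters named in this abstract are conventionally defined, for three-state neurons $\sigma_i \in \{-1,0,+1\}$ and patterns $\xi_i$ of activity $a$, roughly as follows (the exact normalizations here are an assumption based on standard three-state notation, not taken from the abstract itself):

$$ m=\frac{1}{aN}\sum_i \xi_i\,\sigma_i, \qquad q=\frac{1}{N}\sum_i \sigma_i^2, \qquad n=\frac{1}{aN}\sum_i \xi_i^2\,\sigma_i^2 $$

where $m$ is the overlap, $q$ the neural activity, and $n$ the activity-overlap, i.e. the overlap restricted to sites where the pattern is active.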
March 20, 2003
A recent dynamic mean-field theory for sequence processing in fully connected neural networks of Hopfield-type (Düring, Coolen and Sherrington, 1998) is extended and analyzed here for a symmetrically diluted network with finite connectivity near saturation. Equations for the dynamics and the stationary states are obtained for the macroscopic observables and the precise equivalence is established with the single-pattern retrieval problem in a layered feed-forward network with ...
August 2, 2007
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of both layered feedforward and fully connected neural network models with synaptic noise. These two types of architectures require different methods to be solved numerically. In both cases it is shown that, if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall pr...
May 24, 2006
The inclusion of a macroscopic adaptive threshold is studied for the retrieval dynamics of layered feedforward neural network models with synaptic noise. It is shown that if the threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the stored patterns, adapting itself automatically in the course of the recall process, an autonomous functioning of the network is guaranteed. This self-control mechanism considerably improves the quality of...
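A minimal sketch of the idea behind such a self-control threshold: in a sparse 0/1 network, set the firing threshold from the measured cross-talk noise and the pattern activity, so the network tunes itself during recall. The covariance learning rule and the prefactor sqrt(2 ln(1/a)) below are assumptions drawn from the sparse-coding literature, not taken from this abstract; all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P, a = 1000, 20, 0.05          # neurons, stored patterns, pattern activity (illustrative)

# Sparse 0/1 patterns with mean activity a, covariance-rule couplings (an assumption)
xi = (rng.random((P, N)) < a).astype(float)
J = ((xi - a).T @ (xi - a)) / (N * a * (1 - a))
np.fill_diagonal(J, 0.0)

s = xi[0].copy()                   # start the recall on pattern 0
h = J @ s                          # local fields

# Self-control-style threshold: estimate the cross-talk variance empirically
# from the fields of neurons that should stay silent, then scale it by the
# low-activity factor sqrt(2 ln(1/a)) (illustrative choice of prefactor).
D = h[xi[0] == 0].var()
theta = np.sqrt(2.0 * D * np.log(1.0 / a))

s_new = (h > theta).astype(float)  # fire only above the adaptive threshold

m = (s_new @ xi[0]) / xi[0].sum()  # retrieval overlap with the stored pattern
q = s_new.mean()                   # resulting network activity
```

With the threshold tracking the noise like this, the network activity stays pinned near the pattern activity a while the overlap stays high, which is the qualitative effect the abstract describes.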
September 30, 1999
The inclusion of a threshold in the dynamics of layered neural networks with variable activity is studied at arbitrary temperature. In particular, the effects on the retrieval quality of a self-controlled threshold obtained by forcing the neural activity to stay equal to the activity of the stored patterns during the whole retrieval process, are compared with those of a threshold chosen externally for every loading and every temperature through optimisation of the mutual infor...
May 10, 2014
In this work we solve the dynamics of pattern-diluted associative networks, evolving via sequential Glauber update. We derive dynamical equations for the order parameters that quantify the simultaneous pattern recall of the system, and analyse the nature and stability of the stationary solutions by means of linear stability analysis as well as Monte Carlo simulations. We investigate the parallel retrieval capabilities of the system in different regions of the phase space, in...
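For readers unfamiliar with the update rule, sequential Glauber dynamics can be sketched as follows for a plain Hopfield network (the pattern dilution of the paper is omitted here; all sizes and the inverse temperature are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 3                      # neurons and stored patterns (illustrative)
beta = 10.0                        # inverse temperature (low noise)

# Random binary patterns and Hebbian couplings J_ij = (1/N) sum_mu xi^mu_i xi^mu_j
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def glauber_sweep(s, J, beta, rng):
    """One sequential Glauber sweep: visit spins one at a time and
    set s_i = +1 with probability 1 / (1 + exp(-2 beta h_i))."""
    for i in rng.permutation(len(s)):
        h = J[i] @ s                                   # local field at site i
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
        s[i] = 1 if rng.random() < p_up else -1
    return s

# Start near pattern 0 (flip 10% of its spins) and let the network relax
s = xi[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
s[flip] *= -1
for _ in range(20):
    s = glauber_sweep(s, J, beta, rng)

m = (s @ xi[0]) / N                # overlap order parameter with pattern 0
```

The overlap m is exactly the kind of macroscopic order parameter whose dynamical equations the paper derives; the Monte Carlo runs mentioned in the abstract amount to averaging trajectories like this one.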
April 16, 2013
We consider the multitasking associative network in the low-storage limit and we study its phase diagram with respect to the noise level $T$ and the degree $d$ of dilution in pattern entries. We find that the system is characterized by a rich variety of stable states, including pure states, parallel retrieval states, hierarchically organized states and symmetric mixtures (remarkably, both even and odd), whose complexity increases as the number of patterns $P$ grows. The ana...
July 6, 2001
The retrieval behavior and thermodynamic properties of symmetrically diluted Q-Ising neural networks are derived and studied in replica-symmetric mean-field theory generalizing earlier works on either the fully connected or the symmetrical extremely diluted network. Capacity-gain parameter phase diagrams are obtained for the Q=3, Q=4 and $Q=\infty$ state networks with uniformly distributed patterns of low activity in order to search for the effects of a gradual dilution of th...
May 12, 1998
It is well known that a sparsely coded network in which the activity level is extremely low has intriguing equilibrium properties. In the present work, we study the dynamical properties of a neural network designed to store sparsely coded sequential patterns rather than static ones. Applying the theory of statistical neurodynamics, we derive the dynamical equations governing the retrieval process which are described by some macroscopic order parameters such as the overlap. It...
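The standard trick for storing sequential rather than static patterns is an asymmetric Hebbian rule that maps each pattern onto its successor. A minimal sketch, using dense ±1 patterns and noiseless parallel updates instead of the sparse coding and statistical neurodynamics of the paper (all sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 300, 4                      # neurons and sequence length (illustrative)

# Patterns to be recalled as a cyclic sequence xi^0 -> xi^1 -> ... -> xi^0
xi = rng.choice([-1, 1], size=(P, N))

# Asymmetric Hebbian rule: J_ij = (1/N) sum_mu xi^{mu+1}_i xi^mu_j,
# so presenting pattern mu drives the network toward pattern mu+1.
J = (np.roll(xi, -1, axis=0).T @ xi) / N

s = xi[0].copy()
overlaps = []
for t in range(1, P + 1):
    h = J @ s
    s = np.where(h >= 0, 1, -1)                 # noiseless synchronous update
    overlaps.append((s @ xi[t % P]) / N)        # overlap with the expected next pattern
```

Each step hands the network on to the next pattern in the sequence; the macroscopic overlaps tracked here play the same role as the order parameters in the statistical-neurodynamics equations the abstract refers to.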
May 10, 2018
We use numerical simulations to study the possible limit behaviors of synchronous discrete-time deterministic recurrent neural networks composed of N binary neurons, as a function of the network's level of dilution and asymmetry. The network dilution measures the fraction of neuron couples that are connected, and the network asymmetry measures to what extent the underlying connectivity matrix is asymmetric. For each given neural network, we study the dynamical evolution of all the ...
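Since the dynamics here are deterministic on a finite state space, every trajectory ends in a limit cycle (a fixed point being a cycle of length one), which can be detected by recording visited states. A minimal sketch for one diluted, fully asymmetric network (sizes and parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 12                             # small network so trajectories stay short (illustrative)
dilution = 0.5                     # fraction of neuron couples that are connected

# Fully asymmetric random couplings: J_ij drawn independently of J_ji,
# with a random fraction `dilution` of entries kept.
J = rng.normal(size=(N, N)) * (rng.random((N, N)) < dilution)
np.fill_diagonal(J, 0.0)

def limit_cycle(J, s0):
    """Iterate s(t+1) = sign(J s(t)) until a state repeats;
    return (transient length, cycle length)."""
    seen = {}
    s = s0.copy()
    t = 0
    while True:
        key = tuple(s)
        if key in seen:
            return seen[key], t - seen[key]
        seen[key] = t
        s = np.where(J @ s >= 0, 1, -1)
        t += 1

s0 = rng.choice([-1, 1], size=N)
transient, cycle = limit_cycle(J, s0)
```

Sweeping `dilution` and an analogous asymmetry parameter, and repeating this for all (or many sampled) initial states, yields the statistics of transient and cycle lengths that the paper analyzes.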