January 19, 2004
The dynamics and the stationary states of an exactly solvable three-state layered feed-forward neural network model with asymmetric synaptic connections, finite dilution and low pattern activity are studied, extending a recent work on a recurrent network. Detailed phase diagrams are obtained for the stationary states and for the time evolution of the retrieval overlap with a single pattern. It is shown that the network develops instabilities for low thresholds and that network performance improves gradually with increasing threshold up to an optimal stage. The robustness to synaptic noise is checked, and the effects of dilution and of a variable threshold on the information content of the network are also established.
Similar papers
May 26, 2003
The time evolution of an exactly solvable layered feedforward neural network with three-state neurons that optimizes the mutual information is studied for arbitrary synaptic noise (temperature). Detailed stationary temperature-capacity and capacity-activity phase diagrams are obtained. The model exhibits pattern retrieval, pattern-fluctuation retrieval and spin-glass phases. It is found that there is an improved performance in the form of both a larger critical capacity and i...
June 5, 1998
The influence of a macroscopic time-dependent threshold on the retrieval process of three-state extremely diluted neural networks is examined. If the threshold is chosen appropriately as a function of the noise and the pattern activity of the network, adapting itself in the course of the time evolution, it guarantees an autonomous functioning of the network. It is found that this self-control mechanism considerably improves the retrieval quality, especially in the limit of low ...
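The self-control mechanism described above can be illustrated with a minimal sketch. This is not the paper's derivation: the update rule for a three-state (+1, 0, -1) neuron driven by a local field, and an illustrative adaptation rule in which the threshold tracks the cross-talk noise width with a low-activity prefactor sqrt(-2 ln a), are stated here as assumptions; the function names and the specific prefactor are ours.

```python
import math

def three_state_update(h, theta):
    """Three-state neuron: fire +1/-1 only if the local field h
    exceeds the threshold theta in magnitude, otherwise stay at 0."""
    if h > theta:
        return 1
    if h < -theta:
        return -1
    return 0

def self_control_threshold(a, noise_width):
    """Illustrative self-adapting threshold (assumed form): scale the
    current cross-talk noise width by sqrt(-2 ln a), where a is the
    pattern activity. Larger noise or lower activity -> higher theta,
    suppressing spurious nonzero states."""
    return math.sqrt(-2.0 * math.log(a)) * noise_width

# A weak field below the adapted threshold leaves the neuron silent;
# a strong field drives it to +1.
theta = self_control_threshold(a=0.1, noise_width=1.0)
print(three_state_update(0.5, theta), three_state_update(3.0, theta))
```

The point of the sketch is only the feedback loop: the threshold is recomputed each time step from macroscopic quantities (noise, activity), so no external tuning is needed during retrieval.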
August 14, 2002
The time evolution of the extremely diluted Blume-Emery-Griffiths neural network model is studied, and a detailed equilibrium phase diagram is obtained exhibiting pattern retrieval, fluctuation retrieval and self-sustained activity phases. It is shown that saddle-point solutions associated with fluctuation overlaps considerably slow down the flow of the network states towards the retrieval fixed points. A comparison of the performance with other three-state networks is also p...
October 20, 1998
The principle of adaptation in a noisy retrieval environment is extended here to a diluted attractor neural network of Q-state neurons trained with noisy data. The network is adapted to an appropriate noisy training overlap and training activity, which are determined self-consistently by the optimized retrieval attractor overlap and activity. The optimized storage capacity and the corresponding retrieval overlap are considerably enhanced by an adequate threshold in the states....
April 18, 1997
The categorization properties of an attractor network of three-state neurons which infers three-state concepts from examples are studied. The evolution equations governing the parallel dynamics at zero temperature for the overlap between the state of the network and the examples, the state of the network and the concepts as well as the neuron activity are discussed in the limit of extreme dilution. A transition from a retrieval region to a categorization region is found when ...
March 8, 1999
The subject of study is a neural network with binary neurons, randomly diluted synapses and variable pattern activity. We look at the system with parallel updating, using a probabilistic approach to solve the one-step dynamics with one condensed pattern. We derive restrictions on the storage capacity and the mutual information content occurring during the retrieval process. Special focus is on the constraints on the threshold for optimal performance. We also look at the effect ...
December 7, 1999
A self-control mechanism for the dynamics of a three-state fully-connected neural network is studied through the introduction of a time-dependent threshold. The self-adapting threshold is a function of both the neural and the pattern activity in the network. The time evolution of the order parameters is obtained on the basis of a recently developed dynamical recursive scheme. In the limit of low activity the mutual information is shown to be the relevant parameter in order to...
July 19, 2005
The three-state Ising neural network with synchronous updating and variable dilution is discussed starting from the appropriate Hamiltonians. The thermodynamic and retrieval properties are examined using replica mean-field theory. Capacity-temperature phase diagrams are derived for several values of the pattern activity and different gradations of dilution, and the information content is calculated. The results are compared with those for sequential updating. The effect of se...
July 1, 2005
The dynamics and the stationary states for the competition between pattern reconstruction and asymmetric sequence processing are studied here in an exactly solvable feed-forward layered neural network model of binary units and patterns near saturation. Earlier work by Coolen and Sherrington on parallel dynamics far from saturation is extended here to account for finite stochastic noise due to a Hebbian and a sequential learning rule. Phase diagrams are obtained with station...
February 6, 2002
In this paper we discuss the dynamical properties of extremely diluted, non-monotonic neural networks. Assuming parallel updating and the Hebb prescription for the synaptic connections, a flow equation for the macroscopic overlap is derived. A rich dynamical phase diagram is obtained, showing a stable retrieval phase as well as cycle-two and chaotic behavior. Numerical simulations were performed, showing good agreement with analytical results. Furthermore, the simulatio...
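The idea of a flow equation for the macroscopic overlap can be made concrete with a simple iteration. The sketch below uses not this paper's non-monotonic model but the classic Derrida-Gardner-Zippelius map for an extremely diluted network of standard binary neurons under parallel updating, m_{t+1} = erf(m_t / sqrt(2*alpha)), chosen because its form is well known; the function names and parameter values are illustrative.

```python
import math

def overlap_flow(m, alpha):
    """One parallel-update step of the retrieval overlap m for an
    extremely diluted binary network at storage load alpha
    (Derrida-Gardner-Zippelius form)."""
    return math.erf(m / math.sqrt(2.0 * alpha))

def iterate(m0, alpha, steps=200):
    """Iterate the flow equation until (approximately) a fixed point."""
    m = m0
    for _ in range(steps):
        m = overlap_flow(m, alpha)
    return m

# Below the critical load (alpha_c = 2/pi ~ 0.64) the overlap flows to
# a nonzero retrieval fixed point; above it, retrieval is lost and the
# overlap flows to zero.
print(iterate(0.5, 0.2))   # retrieval phase: large fixed-point overlap
print(iterate(0.5, 1.0))   # beyond capacity: overlap decays to ~0
```

Richer dynamics such as the cycle-two and chaotic phases reported above arise when the monotonic transfer function inside the map is replaced by a non-monotonic one, which the same iteration scheme accommodates directly.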