November 25, 2022
We consider dense associative neural networks trained by a teacher (i.e., with supervision) and we investigate their computational capabilities analytically, via the statistical mechanics of spin glasses, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset, the network storage and the noise, that is valid in the limit of large netw...
January 1, 2020
We consider the storage properties of temporal patterns, i.e. cycles of finite length, in neural networks represented by (generally asymmetric) spin glasses defined on random graphs. Inspired by the observation that dynamics on sparse systems have more basins of attraction than those of densely connected ones, we consider the attractors of a greedy dynamics in sparse topologies, taken as a proxy for the stored memories. We enumerate them using numerical simulation...
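A minimal sketch of the kind of greedy dynamics described above: flip one misaligned spin at a time until a fixed point is reached, and count the distinct attractors found from random initial conditions. The network size, mean degree and Gaussian couplings below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20                      # number of binary neurons (illustrative)
K = 3                       # mean degree of the sparse random graph (illustrative)

# sparse symmetric random couplings on a diluted graph, zero diagonal
J = rng.normal(size=(N, N)) * (rng.random((N, N)) < K / N)
J = np.triu(J, 1)
J = J + J.T

def greedy_fixed_point(s, J):
    """Flip single spins while any flip lowers the energy; return the attractor."""
    s = s.copy()
    while True:
        fields = J @ s
        unstable = np.where(s * fields < 0)[0]   # spins misaligned with their local field
        if unstable.size == 0:
            return s                             # no energy-lowering flip remains
        s[unstable[0]] *= -1                     # greedy single-spin flip

# enumerate distinct attractors reached from many random initial conditions
attractors = {tuple(greedy_fixed_point(rng.choice([-1, 1], N), J))
              for _ in range(200)}
print(len(attractors))
```

Since each flip strictly lowers the energy, the dynamics always terminates, and the set of fixed points found this way serves as a proxy for the stored memories.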
July 20, 2014
Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, error-correcting codes) and complex theoretical models at the focus of scientific investigation. On the research side, neural networks are studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In particular, in theoretical physics, the key instrument for the quantitative analysis of neural networks is statistica...
December 2, 2019
Recently, Hopfield and Krotov introduced the concept of {\em dense associative memories} [DAM] (close to spin glasses with $P$-wise interactions, in disordered statistical-mechanics jargon): they proved a number of remarkable features of these networks and suggested their use to (partially) explain the success of the new generation of Artificial Intelligence. Among these properties, thanks to a remarkable ante-litteram analysis by Baldi \& Venkatesh, it is known that these ne...
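A minimal sketch of a dense associative memory with $P$-wise interactions in the spirit of the construction above: the energy $E(s) = -\sum_\mu (\xi^\mu \cdot s)^P$ with even $P$, relaxed by zero-temperature single-spin flips. The sizes, the load and the retrieval protocol below are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, P = 50, 5, 4          # neurons, stored patterns, interaction order (illustrative)

xi = rng.choice([-1, 1], size=(M, N))   # random binary patterns

def energy(s):
    """Dense-associative-memory energy: -sum_mu (xi_mu . s)^P, even P."""
    return -np.sum((xi @ s) ** P)

def retrieve(s, sweeps=10):
    """Zero-temperature asynchronous dynamics: accept only energy-lowering flips."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s_flip = s.copy()
            s_flip[i] *= -1
            if energy(s_flip) < energy(s):
                s = s_flip
    return s

# corrupt pattern 0 on 10 of 50 sites, then retrieve it
noisy = xi[0].copy()
noisy[:10] *= -1
recalled = retrieve(noisy)
print(np.mean(recalled == xi[0]))   # fraction of correctly recalled bits
```

At this low load the $P$-wise energy term of the condensed pattern dominates the crosstalk, so the corrupted configuration relaxes back to the stored memory.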
March 8, 1999
The subject of study is a neural network with binary neurons, randomly diluted synapses and variable pattern activity. We look at the system under parallel updating, using a probabilistic approach to solve the one-step dynamics with one condensed pattern. We derive restrictions on the storage capacity and the mutual information content occurring during the retrieval process. Special focus is on the constraints on the threshold for optimal performance. We also look at the effect ...
July 30, 2001
A model of the columnar functional organization of neocortical association areas is studied. The neuronal network is composed of many Hebbian autoassociators, or modules, each of which interacts with a relatively small number of the others. Every module encodes and stores a number of elementary percepts, or features. Memory items, or patterns, are particular combinations of features sparsely distributed over the multi-modular network. Any feature stored in any module can be inv...
May 12, 1998
It is well known that a sparsely coded network, in which the activity level is extremely low, has intriguing equilibrium properties. In the present work, we study the dynamical properties of a neural network designed to store sparsely coded sequential patterns rather than static ones. Applying the theory of statistical neurodynamics, we derive the dynamical equations governing the retrieval process, which are described by macroscopic order parameters such as the overlap. It...
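Sequence storage of the kind studied here can be sketched, in its dense-pattern analogue, with asymmetric Hebbian couplings $J_{ij} = \frac{1}{N}\sum_\mu \xi_i^{\mu+1}\xi_j^\mu$ that map each pattern onto its successor under parallel updating. The paper treats the sparsely coded version; the sizes and the dense $\pm 1$ patterns below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
N, M = 200, 3               # neurons, length of the stored cycle (illustrative)

xi = rng.choice([-1, 1], size=(M, N))   # sequence of patterns xi^0 -> xi^1 -> ...

# asymmetric Hebbian couplings mapping each pattern onto its successor (cyclically)
J = np.einsum('mi,mj->ij', np.roll(xi, -1, axis=0), xi) / N

# parallel (synchronous) update: each step should advance the sequence by one pattern
s = xi[0].copy()
for t in range(1, M + 1):
    s = np.sign(J @ s)
    overlap = np.mean(s * xi[t % M])    # overlap with the expected next pattern
    print(t, overlap)
```

The overlap with the expected pattern stays close to one at each step, so after $M$ synchronous updates the network has traversed the full cycle and returned to the initial pattern.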
April 6, 2017
The brain must robustly store a large number of memories, corresponding to the many events encountered over a lifetime. However, the number of memory states in existing neural network models either grows weakly with network size or recall fails catastrophically with vanishingly little noise. We construct an associative content-addressable memory with exponentially many stable states and robust error-correction. The network possesses expander graph connectivity on a restricted...
November 25, 2022
We consider dense associative neural networks trained with no supervision and we investigate their computational capabilities analytically, via a statistical-mechanics approach, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset and the network storage, valid in the limit of large network size and structureless dat...
July 6, 2001
The retrieval behavior and thermodynamic properties of symmetrically diluted Q-Ising neural networks are derived and studied within replica-symmetric mean-field theory, generalizing earlier work on either the fully connected or the symmetric extremely diluted network. Capacity-gain-parameter phase diagrams are obtained for the $Q=3$, $Q=4$ and $Q=\infty$ state networks with uniformly distributed patterns of low activity, in order to search for the effects of a gradual dilution of th...