April 17, 2000
We study a simple extended model of oscillator neural networks capable of storing sparsely coded phase patterns, in which information is encoded both in the mean firing rate and in the timing of spikes. Applying the methods of statistical neurodynamics, we theoretically investigate the model's associative memory capability by evaluating its maximum storage capacities and deriving its basins of attraction. It is shown that, as in the Hopfield model, the storage ca...
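A minimal sketch of retrieval dynamics for phase-coded patterns, assuming a standard Kuramoto-type oscillator network with Hebbian phase coupling (the sparsely coded, rate-plus-timing variant studied in the paper adds degrees of freedom not modelled here); all parameters are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 200, 3                     # oscillators, stored phase patterns

    xi = rng.uniform(0.0, 2.0 * np.pi, size=(P, N))   # random phase patterns

    # Hebbian coupling for phases: C_ij = (1/N) sum_mu exp(i(xi_i^mu - xi_j^mu)),
    # giving amplitudes K_ij = |C_ij| and phase lags phi_ij = arg(C_ij).
    C = (np.exp(1j * xi).T @ np.exp(-1j * xi)) / N

    def step(theta, dt=0.1):
        # Kuramoto-type retrieval dynamics:
        # dtheta_i/dt = sum_j K_ij sin(theta_j - theta_i + phi_ij)
        diff = theta[None, :] - theta[:, None] + np.angle(C)
        return theta + dt * (np.abs(C) * np.sin(diff)).sum(axis=1)

    theta = xi[0] + rng.normal(0.0, 0.8, size=N)      # noisy cue of pattern 0
    for _ in range(300):
        theta = step(theta)

    m = abs(np.exp(1j * (theta - xi[0])).mean())      # phase overlap with pattern 0
    print("retrieval overlap:", m)                    # ~1 on successful retrieval

For a single stored pattern the pattern is an exact fixed point of these dynamics; at low load the crosstalk between patterns only slightly degrades the final overlap.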
December 21, 2000
We estimate the critical capacity of the zero-temperature Hopfield model using a novel, rigorous method. For a large number of neurons, the probability of having a stable fixed point is one when $\alpha\le 0.113$. This result improves on all rigorous results in the literature, and the relationship obtained here between the capacity $\alpha$ and the retrieval errors for small $\alpha$ coincides with replica-calculation results.
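The claimed regime can be probed numerically. A rough sketch, assuming the standard Hebbian construction and deterministic asynchronous updates (sizes and sweep counts are illustrative, and finite-size effects blur the transition):

    import numpy as np

    def retrieval_overlap(N, alpha, sweeps=30, seed=0):
        """Overlap between a stored pattern and the fixed point reached by
        zero-temperature dynamics started at that pattern, at load alpha = P/N."""
        rng = np.random.default_rng(seed)
        P = max(1, int(alpha * N))
        xi = rng.choice([-1, 1], size=(P, N))
        J = (xi.T @ xi) / N
        np.fill_diagonal(J, 0.0)               # no self-coupling
        s = xi[0].copy()
        for _ in range(sweeps):                # asynchronous sweeps
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1
        return (s @ xi[0]) / N                 # m ~ 1: retrieval, m ~ 0: lost

    for a in (0.05, 0.11, 0.14, 0.20):
        print(a, retrieval_overlap(1000, a))

Below the critical load the final overlap stays close to 1 with only a small retrieval error, consistent with the small-$\alpha$ error curve mentioned above.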
July 8, 2024
It has recently been shown that a learning transition occurs when a Hopfield network stores examples generated as superpositions of random features, whereby new attractors corresponding to those features appear in the model. In this work we reveal that the network also develops attractors corresponding to previously unseen examples generated from the same set of features. We explain this surprising behaviour in terms of spurious states of the learned features: we argue that, in...
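A hedged sketch of the setup as described: examples are signs of superpositions of random binary features, only the examples are stored, and a feature (never stored itself) is used as a cue. Whether the feature is retrieved depends on the load; the parameters below are purely illustrative:

    import numpy as np

    rng = np.random.default_rng(0)
    N, F, M, S = 1000, 20, 600, 3      # neurons, features, stored examples, features/example

    feats = rng.choice([-1, 1], size=(F, N))
    # Examples are signs of superpositions of S randomly chosen features (S odd: no ties).
    examples = np.array([np.sign(feats[rng.choice(F, size=S, replace=False)].sum(axis=0))
                         for _ in range(M)])

    J = (examples.T @ examples) / N    # Hebbian storage of the examples only
    np.fill_diagonal(J, 0.0)

    def converge(s, sweeps=30):
        s = s.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1
        return s

    s = converge(feats[0])             # cue with a feature, never stored directly
    print("feature overlap:", (s @ feats[0]) / N)   # ~1 past the learning transition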
March 13, 2014
Recent advances in associative memory design, through structured pattern sets and graph-based inference algorithms, have enabled reliable learning and recall of an exponential number of patterns. Although these designs correct external errors in recall, they assume neurons that compute noiselessly, in contrast to the highly variable neurons in brain regions thought to operate associatively, such as the hippocampus and olfactory cortex. Here we consider associative memories with no...
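The paper's specific graph-based construction is not reproduced here; as a generic illustration of recall with noisy (rather than noiseless) neurons, one can replace deterministic updates with stochastic Glauber updates at inverse temperature $\beta$ in a plain Hebbian network:

    import numpy as np

    rng = np.random.default_rng(0)
    N, P, beta = 500, 25, 4.0      # beta = 1/T sets the internal noise level

    xi = rng.choice([-1, 1], size=(P, N))
    J = (xi.T @ xi) / N
    np.fill_diagonal(J, 0.0)

    def glauber_recall(cue, sweeps=50):
        """Recall with noisy neurons: each update is stochastic (Glauber),
        flipping against the local field with probability controlled by beta."""
        s = cue.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                h = J[i] @ s
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
                s[i] = 1 if rng.random() < p_up else -1
        return s

    cue = xi[0] * np.where(rng.random(N) < 0.1, -1, 1)  # 10% external errors
    s = glauber_recall(cue)
    print("overlap:", (s @ xi[0]) / N)   # below 1: residual internally-driven errors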
December 15, 2023
The statistical mechanics of spin glasses is one of the main strands toward a comprehension of information processing by neural networks and learning machines. Pursuing this approach, at the fairly standard replica-symmetric level of description, Hebbian attractor networks with multi-node interactions (often called Dense Associative Memories) have recently been shown to outperform their classical pairwise counterparts in a number of tasks, from their robustness against adversaria...
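A minimal sketch of a Dense Associative Memory with $p$-body Hebbian interactions, assuming the common polynomial energy $E(s) = -\sum_\mu (\xi^\mu \cdot s)^p$ (normalizations vary across the literature); note that $P$ here exceeds the pairwise Hopfield capacity:

    import numpy as np

    rng = np.random.default_rng(0)
    N, P, p = 200, 400, 3      # P = 2N patterns: beyond the pairwise capacity ~0.14*N

    xi = rng.choice([-1, 1], size=(P, N))

    def recall(cue, sweeps=20):
        """Zero-temperature dynamics for E(s) = -sum_mu (xi^mu . s)^p:
        each spin takes the value that lowers E (p = 2 recovers the Hopfield rule)."""
        s = cue.copy()
        for _ in range(sweeps):
            for i in rng.permutation(N):
                O = xi @ s - xi[:, i] * s[i]                 # overlaps without spin i
                h = np.sum((O + xi[:, i]) ** p - (O - xi[:, i]) ** p)
                s[i] = 1 if h >= 0 else -1                   # h > 0 favors s_i = +1
        return s

    cue = xi[0] * np.where(rng.random(N) < 0.2, -1, 1)       # 20% corrupted cue
    s = recall(cue)
    print("overlap:", (s @ xi[0]) / N)                       # ~1 despite P = 2N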
May 29, 2024
We discuss prototype formation in the Hopfield network. Typically, Hebbian learning with highly correlated states leads to degraded memory performance. We show that this type of learning can instead lead to prototype formation, where unlearned states emerge as representatives of large correlated subsets of states, alleviating capacity woes. This process has similarities to prototype learning in human cognition. We provide a substantial literature review of prototype learning in associativ...
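A hedged sketch of the phenomenon: store only noisy copies of a single prototype with plain Hebbian learning and observe that the dynamics converges to the unlearned prototype rather than to any stored copy (parameters are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    N, M, flip = 500, 40, 0.15     # 40 correlated states: noisy copies of one prototype

    proto = rng.choice([-1, 1], size=N)                     # never stored itself
    states = proto * np.where(rng.random((M, N)) < flip, -1, 1)

    J = (states.T @ states) / N                             # plain Hebbian learning
    np.fill_diagonal(J, 0.0)

    s = states[0].copy()                                    # cue with one stored copy
    for _ in range(30):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1

    print("overlap with prototype:", (s @ proto) / N)       # ~1: the prototype emerges
    print("overlap with cued state:", (s @ states[0]) / N)  # < 1: the copy is not retained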
April 25, 2017
Recent studies point to the potential storage of a large number of patterns in the celebrated Hopfield associative memory model, well beyond the limits obtained previously. We investigate the properties of these new fixed points and find that they are unstable under small perturbations, and are therefore of limited value as associative memories. Moreover, a large-deviations approach shows that errors introduced into the original patterns induce additional errors and inc...
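The paper's large-deviations analysis is not reproduced here, but the stability test it motivates is easy to sketch: perturb a candidate fixed point by a small fraction of spin flips and check whether zero-temperature dynamics returns to it:

    import numpy as np

    def basin_probe(J, s_star, flip_frac, sweeps=30, rng=None):
        """Flip a fraction of spins of a candidate fixed point s_star, run
        zero-temperature dynamics, and report the return overlap with s_star:
        ~1 means the perturbation is corrected, << 1 means the point is fragile."""
        rng = rng or np.random.default_rng(0)
        N = len(s_star)
        s = s_star * np.where(rng.random(N) < flip_frac, -1, 1)
        for _ in range(sweeps):
            for i in rng.permutation(N):
                s[i] = 1 if J[i] @ s >= 0 else -1
        return (s @ s_star) / N

    # Example: probe a stored pattern of a Hebbian network at low load.
    rng = np.random.default_rng(1)
    N, P = 500, 25
    xi = rng.choice([-1, 1], size=(P, N))
    J = (xi.T @ xi) / N
    np.fill_diagonal(J, 0.0)
    print(basin_probe(J, xi[0], flip_frac=0.05, rng=rng))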
November 2, 2022
The aim of this thesis is to compare the capacities of different neural network models. We start by analysing the problem-solving capacity of a single perceptron using a simple combinatorial argument. After some observations on the storage capacity of a basic network known as an associative memory, we introduce a powerful statistical-mechanical approach to calculate its capacity in the Hopfield model, where the result depends on the training rule. With the aim of finding a more general definitio...
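The combinatorial argument for the single perceptron is presumably Cover's function-counting theorem, which gives in closed form the number of linearly separable dichotomies of $P$ points in general position in $\mathbb{R}^N$:

    from math import comb

    def separable_fraction(P, N):
        """Cover (1965): of the 2**P dichotomies of P points in general position
        in R^N, C(P, N) = 2 * sum_{k=0}^{N-1} binom(P-1, k) are linearly separable."""
        c = 2 * sum(comb(P - 1, k) for k in range(N))
        return c / 2 ** P

    # The sharp transition at P = 2N that defines the perceptron capacity alpha_c = 2:
    N = 50
    for P in (50, 100, 150):
        print(P / N, separable_fraction(P, N))

At $P = 2N$ the fraction is exactly $1/2$, and it drops steeply to 0 beyond that point as $N$ grows, which is the capacity transition the combinatorial argument exposes.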
February 19, 2021
Motivated by biological considerations, we study sparse neural maps from an input layer to a target layer with sparse activity, and specifically the problem of storing $K$ input-target associations $(x,y)$, or memories, when the target vectors $y$ are sparse. We prove mathematically that $K$ undergoes a phase transition and that, in general and somewhat paradoxically, sparsity in the target layer increases the storage capacity of the map. The target vectors can be chosen arb...
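A rough numerical sketch of the setting, assuming each target unit is trained independently with the classical perceptron rule on dense $\pm 1$ inputs and sparse $0/1$ targets (the paper's exact map and proof technique may differ; parameters are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, K, f = 200, 200, 400, 0.05   # input dim, target dim, memories, target sparsity

    X = rng.choice([-1, 1], size=(K, n))
    Y = (rng.random((K, m)) < f).astype(float)    # sparse 0/1 target vectors

    # Each target unit is an independent perceptron; train with the perceptron rule.
    W = np.zeros((m, n))
    b = np.zeros(m)
    for _ in range(200):                          # epochs
        pred = ((X @ W.T + b) > 0).astype(float)
        err = Y - pred                            # +1: missed spike, -1: false spike
        if not err.any():
            break
        W += err.T @ X
        b += err.sum(axis=0)

    stored = (((X @ W.T + b) > 0).astype(float) == Y).all(axis=1).mean()
    print("fraction stored exactly:", stored)     # -> 1 when K is below capacity

Sweeping $f$ downward at fixed $n$, $m$ shows the effect the theorem formalizes: sparser targets let the map store more associations before the stored fraction collapses.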
November 28, 2019
In this work we develop analytical techniques to investigate a broad class of associative neural networks in the high-storage regime. These techniques translate the original statistical-mechanical problem into an analytical-mechanical one, which amounts to solving a set of partial differential equations rather than following the canonical probabilistic route. We test the method on the classical Hopfield model, where the cost function includes only two-body interactions (i.e....
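For reference, the two-body cost function in question is, in the standard notation (normalizations may differ from the paper's),

$$ H_N(s) = -\frac{1}{2N}\sum_{\mu=1}^{P}\sum_{i\neq j}\xi_i^{\mu}\xi_j^{\mu}\, s_i s_j $$

where $\xi^\mu \in \{-1,+1\}^N$ are the $P$ stored patterns and $s \in \{-1,+1\}^N$ is the network configuration.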