November 22, 2011
We introduce a bipartite, diluted and frustrated network as a sparse restricted Boltzmann machine and show its thermodynamic equivalence to an associative working memory able to retrieve multiple patterns in parallel without falling into the spurious states typical of classical neural networks. We focus on systems processing in parallel a finite (up to logarithmic growth in the volume) number of patterns, mirroring the low-level storage of standard Amit-Gutfreund-Sompolinsky theory. Results obtained through statistical mechanics, the signal-to-noise technique and Monte Carlo simulations are in overall perfect agreement and carry interesting biological insights. Indeed, these associative networks open new perspectives on the multitasking features expressed by complex systems, e.g. neural and immune networks.
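The parallel retrieval described here can be illustrated with a minimal toy simulation (a sketch under our own assumptions, not the paper's actual code): patterns with entries in {-1, 0, +1}, Hebbian couplings, and zero-temperature sequential dynamics. For clarity we give two diluted patterns disjoint supports, so the network can align with both at once.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 2

# Diluted patterns with entries in {-1, 0, +1}; for a clean illustration the
# two patterns have disjoint supports (each is zero where the other is active).
xi = np.zeros((P, N))
xi[0, : N // 2] = rng.choice([-1, 1], size=N // 2)
xi[1, N // 2 :] = rng.choice([-1, 1], size=N // 2)

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling.
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Start from a noisy mixture of the patterns and relax with zero-temperature
# sequential updates: sigma_i <- sign(sum_j J_ij sigma_j).
sigma = np.sign(xi.sum(axis=0) + 0.3 * rng.standard_normal(N))
sigma[sigma == 0] = 1
for _ in range(10):
    for i in rng.permutation(N):
        h = J[i] @ sigma
        if h != 0:
            sigma[i] = np.sign(h)

# Mattis overlaps restricted to each pattern's support: both stay close to 1,
# i.e. the two patterns are recalled simultaneously, not as a spurious mixture.
overlaps = []
for mu in range(P):
    support = xi[mu] != 0
    overlaps.append(float((xi[mu, support] * sigma[support]).mean()))
print(overlaps)
```

With overlapping supports or nonzero temperature the picture is richer (hierarchical and symmetric mixture states appear), but the disjoint-support case already shows the multitasking mechanism.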
Similar papers
August 8, 2023
A modern challenge of Artificial Intelligence is learning multiple patterns at once (i.e. parallel learning). While this cannot be accomplished by standard Hebbian associative neural networks, in this paper we show how the Multitasking Hebbian Network (a variation on the theme of the Hopfield model working on sparse data-sets) is naturally able to perform this complex task. We focus on systems processing in parallel a finite (up to logarithmic growth in the size of the network) a...
May 22, 2012
In this work, we first revise some extensions of the standard Hopfield model in the low-storage limit, namely the correlated-attractor case and the multitasking case recently introduced by the authors. The former is based on a modification of the Hebbian prescription that induces a coupling between consecutive patterns, an effect tuned by a parameter $a$. In the latter, dilution is introduced in pattern entries, in such a way that a fraction $d$ of them is...
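The two coupling prescriptions can be sketched as follows. This is a hedged reconstruction in our own notation (the symmetric consecutive-pattern term of strength $a$ and the entrywise dilution with probability $d$ are our reading of the abstract, not code from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, a, d = 100, 4, 0.3, 0.25

# Binary patterns xi^mu in {-1, +1}^N.
xi = rng.choice([-1, 1], size=(P, N)).astype(float)

# Correlated-attractor couplings: plain Hebbian term plus a symmetric
# coupling of strength a between consecutive patterns (mu, mu+1).
J_corr = xi.T @ xi
for mu in range(P - 1):
    J_corr += a * (np.outer(xi[mu], xi[mu + 1]) + np.outer(xi[mu + 1], xi[mu]))
J_corr /= N
np.fill_diagonal(J_corr, 0.0)

# Multitasking variant: set a fraction d of each pattern's entries to zero,
# then apply the plain Hebbian rule to the diluted patterns.
mask = rng.random((P, N)) >= d          # keep an entry with probability 1 - d
xi_dil = xi * mask
J_multi = (xi_dil.T @ xi_dil) / N
np.fill_diagonal(J_multi, 0.0)

zero_fraction = float((xi_dil == 0).mean())
print(J_corr.shape, zero_fraction)
```

Both matrices are symmetric by construction; only the multitasking variant supports simultaneous retrieval of several patterns, since dilution frees neurons to serve different patterns.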
April 14, 2014
We use belief-propagation techniques to study the equilibrium behavior of a bipartite spin-glass, with interactions between two sets of $N$ and $P = \alpha N$ spins. Each spin has a finite degree, i.e.\ number of interaction partners in the opposite set; an equivalent view is then of a system of $N$ neurons storing $P$ diluted patterns. We show that in a large part of the parameter space of noise, dilution and storage load, delimited by a critical surface, the network behaves...
May 10, 2014
In this work we solve the dynamics of pattern diluted associative networks, evolving via sequential Glauber update. We derive dynamical equations for the order parameters, that quantify the simultaneous pattern recall of the system, and analyse the nature and stability of the stationary solutions by means of linear stability analysis as well as Monte Carlo simulations. We investigate the parallel retrieval capabilities of the system in different regions of the phase space, in...
November 17, 2022
In this paper we investigate the equilibrium properties of bidirectional associative memories (BAMs). Introduced by Kosko in 1988 as a generalization of the Hopfield model to a bipartite structure, the simplest architecture is defined by two layers of neurons, with synaptic connections only between units of different layers: even without internal connections within each layer, information storage and retrieval are still possible through the reverberation of neural activities ...
February 28, 2013
Associative network models featuring multi-tasking properties have been introduced recently and studied in the low load regime, where the number $P$ of simultaneously retrievable patterns scales with the number $N$ of nodes as $P\sim \log N$. In addition to their relevance in artificial intelligence, these models are increasingly important in immunology, where stored patterns represent strategies to fight pathogens and nodes represent lymphocyte clones. They allow us to under...
July 17, 2023
We study bi-directional associative neural networks that, exposed to noisy examples of an extensive number of random archetypes, learn the latter (with or without the presence of a teacher) when the supplied information is enough: in this setting, learning is heteroassociative -- involving couples of patterns -- and it is achieved by reverberating the information extracted from the examples through the layers of the network. By adapting Guerra's interpolation technique, we pro...
March 15, 2017
Restricted Boltzmann Machines are key tools in Machine Learning and are described by the energy function of bipartite spin-glasses. From a statistical mechanical perspective, they share the same Gibbs measure as Hopfield networks for associative memory. In this equivalence, weights in the former play the role of patterns in the latter. As Boltzmann machines usually require real weights to be trained with gradient-descent-like methods, while Hopfield networks typically store binary pat...
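The Gibbs-measure equivalence mentioned here can be sketched by integrating out Gaussian hidden units; the following is a minimal version in our own notation (binary visible spins $\sigma_i$, standard-Gaussian hidden units $z_\mu$, patterns/weights $\xi^\mu$), not necessarily the conventions of the paper:

\begin{aligned}
Z &= \sum_{\{\sigma\}} \prod_{\mu=1}^{P} \int \frac{dz_\mu}{\sqrt{2\pi}}\,
     e^{-z_\mu^2/2}\,
     \exp\!\Big(\frac{\beta}{\sqrt{N}} \sum_{i=1}^{N} \xi_i^\mu \sigma_i z_\mu\Big) \\
  &= \sum_{\{\sigma\}} \exp\!\Big(\frac{\beta^2}{2N} \sum_{\mu=1}^{P}
     \Big(\sum_{i=1}^{N} \xi_i^\mu \sigma_i\Big)^{2}\Big),
\end{aligned}

which is the partition function of a Hopfield network with Hebbian couplings $J_{ij} = \frac{1}{N}\sum_\mu \xi_i^\mu \xi_j^\mu$ at effective inverse temperature $\beta^2$: the RBM weights $\xi^\mu$ act exactly as the stored patterns.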
April 16, 2013
We consider the multitasking associative network in the low-storage limit and we study its phase diagram with respect to the noise level $T$ and the degree $d$ of dilution in pattern entries. We find that the system is characterized by a rich variety of stable states, among which pure states, parallel retrieval states, hierarchically organized states and symmetric mixtures (remarkably, both even and odd), whose complexity increases as the number of patterns $P$ grows. The ana...
April 7, 2023
Since the last two decades of the past century, pioneering studies in the field of statistical physics have focused on developing models of neural networks that could display memory storage and retrieval. Though many associative memory models were easy to handle and still quite effective in explaining the basic memory retrieval processes in the brain, they were not satisfactory from a biological point of view. It became clear to scientists that a biologically...