ID: 1404.3654

Extensive load in multitasking associative networks

April 14, 2014

Peter Sollich, Daniele Tantari, Alessia Annibale, Adriano Barra
Condensed Matter
Disordered Systems and Neural Networks

We use belief-propagation techniques to study the equilibrium behavior of a bipartite spin-glass, with interactions between two sets of $N$ and $P = \alpha N$ spins. Each spin has a finite degree, i.e.\ number of interaction partners in the opposite set; an equivalent view is then of a system of $N$ neurons storing $P$ diluted patterns. We show that in a large part of the parameter space of noise, dilution and storage load, delimited by a critical surface, the network behaves as an extensive parallel processor, retrieving all $P$ patterns {\it in parallel} without falling into the spurious states that arise from pattern cross-talk and are typical of the structural glassiness built into the network. Our approach allows us to consider effects beyond those studied in replica theory so far, including pattern asymmetry and heterogeneous dilution. Parallel extensive retrieval is more robust for homogeneous degree distributions, and is not disrupted by biases in the distributions of the spin-glass links.

Similar papers

Associative networks with diluted patterns: dynamical analysis at low and medium load

May 10, 2014

90% Match
Silvia Bartolucci, Alessia Annibale
Disordered Systems and Neural Networks

In this work we solve the dynamics of pattern-diluted associative networks, evolving via sequential Glauber updates. We derive dynamical equations for the order parameters that quantify the simultaneous pattern recall of the system, and analyse the nature and stability of the stationary solutions by means of linear stability analysis as well as Monte Carlo simulations. We investigate the parallel retrieval capabilities of the system in different regions of the phase space, in...
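The sequential Glauber update behind this dynamical analysis has a standard form: spins are visited one at a time and flipped stochastically according to their local field. A minimal sketch follows; the ferromagnetic toy coupling and the temperature are illustrative assumptions, not the diluted-pattern couplings of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def glauber_sweep(s, J, beta, rng):
    """One sequential Glauber sweep: spins are visited in random order and
    spin i is set to +1 with probability (1 + tanh(beta * h_i)) / 2,
    where h_i = sum_j J_ij s_j is the local field."""
    for i in rng.permutation(len(s)):
        h = J[i] @ s
        s[i] = 1 if rng.random() < 0.5 * (1.0 + np.tanh(beta * h)) else -1
    return s

# Illustrative toy: a mean-field ferromagnet (J_ij = 1/N) at low noise.
N = 100
J = np.full((N, N), 1.0 / N)
np.fill_diagonal(J, 0.0)
s = rng.choice([-1, 1], size=N, p=[0.2, 0.8])  # slightly biased start
for _ in range(200):
    glauber_sweep(s, J, beta=4.0, rng=rng)
m = s.mean()  # the magnetisation, the simplest order parameter
```

With pattern-diluted couplings in place of the toy $J$, the monitored order parameters become the Mattis overlaps with each pattern rather than a single magnetisation.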


Multitasking associative networks

November 22, 2011

89% Match
Elena Agliari, Adriano Barra, Andrea Galluzzi, ... , Francesco Moauro
Disordered Systems and Neural Networks

We introduce a bipartite, diluted and frustrated network as a sparse restricted Boltzmann machine and show its thermodynamic equivalence to an associative working memory able to retrieve multiple patterns in parallel without falling into the spurious states typical of classical neural networks. We focus on systems processing in parallel a finite (up to logarithmic growth in the volume) number of patterns, mirroring the low-level storage of standard Amit-Gutfreund-Sompolinsky...


Interpolating between boolean and extremely high noisy patterns through Minimal Dense Associative Memories

December 2, 2019

89% Match
Francesco Alemanno, Martino Centonze, Alberto Fachechi
Disordered Systems and Neural Networks
Machine Learning

Recently, Hopfield and Krotov introduced the concept of {\em dense associative memories} [DAM] (close to spin-glasses with $P$-wise interactions in disordered statistical mechanics jargon): they proved a number of remarkable features that these networks share and suggested their use to (partially) explain the success of the new generation of Artificial Intelligence. Thanks to a remarkable ante-litteram analysis by Baldi \& Venkatesh, among these properties, it is known these ne...
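The $P$-wise interactions of dense associative memories amount to replacing the quadratic Hopfield energy with a higher power of the pattern overlaps. A sketch of this energy, assuming the common polynomial choice $F(x) = x^p/p$ (normalisations and the exact $F$ vary across the literature):

```python
import numpy as np

rng = np.random.default_rng(2)

def dam_energy(s, xi, p=4):
    """Dense associative memory energy E(s) = -sum_mu F(xi^mu . s),
    with the polynomial choice F(x) = x**p / p (an assumption here)."""
    return -np.sum((xi @ s).astype(float) ** p) / p

N, P = 50, 5
xi = rng.choice([-1, 1], size=(P, N))  # P binary patterns of N spins

# Stored patterns sit in far deeper minima than random configurations,
# because the pattern's own overlap N is raised to the power p:
E_pattern = dam_energy(xi[0], xi)
E_random = dam_energy(rng.choice([-1, 1], size=N), xi)
```

The steep energy gap produced by the power $p$ is what suppresses cross-talk and boosts storage capacity relative to the standard ($p=2$) Hopfield model.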


Parallel retrieval of correlated patterns

May 22, 2012

88% Match
Elena Agliari, Adriano Barra, ... , Andrea Galluzzi
Disordered Systems and Neural Networks

In this work, we first revise some extensions of the standard Hopfield model in the low-storage limit, namely the correlated-attractor case and the multitasking case recently introduced by the authors. The former is based on a modification of the Hebbian prescription that induces a coupling between consecutive patterns, tuned by a parameter $a$. In the latter, dilution is introduced in the pattern entries, in such a way that a fraction $d$ of them is...
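The modified Hebbian prescription for correlated attractors can be sketched as follows. The specific coupling form below is an assumption (conventions for the $a$-term vary across the literature): each pattern is Hebbian-coupled to its cyclic neighbours with strength $a$.

```python
import numpy as np

rng = np.random.default_rng(5)

# Assumed correlated-attractor prescription (conventions vary):
#   J_ij = (1/N) sum_mu [ xi_i^mu xi_j^mu
#            + a * (xi_i^{mu+1} xi_j^mu + xi_i^mu xi_j^{mu+1}) ],
# with cyclic pattern index, so consecutive patterns attract each other.
N, P, a = 100, 4, 0.3
xi = rng.choice([-1, 1], size=(P, N))
xi_next = np.roll(xi, -1, axis=0)              # pattern mu -> mu+1 (cyclic)
J = (xi.T @ xi + a * (xi_next.T @ xi + xi.T @ xi_next)) / N
np.fill_diagonal(J, 0.0)

# Starting on pattern 0, zero-temperature parallel dynamics keeps a large
# overlap with it while also developing overlaps with its neighbours.
s = xi[0].copy()
for _ in range(20):
    s = np.where(J @ s >= 0, 1, -1)
m = xi @ s / N   # Mattis overlaps with all P patterns
```

For small $a$ the retrieved state stays close to a single pattern; increasing $a$ mixes in the neighbouring patterns, which is the correlated-attractor regime the abstract refers to.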


Dense Hebbian neural networks: a replica symmetric picture of supervised learning

November 25, 2022

88% Match
Elena Agliari, Linda Albanese, Francesco Alemanno, Andrea Alessandrelli, Adriano Barra, Fosca Giannotti, ... , Dino Pedreschi
Disordered Systems and Neural Networks
Machine Learning

We consider dense associative neural networks trained by a teacher (i.e., with supervision) and investigate their computational capabilities analytically, via the statistical mechanics of spin glasses, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset, network storage and noise, that is valid in the limit of large netw...


Multitasking network with fast noise

April 16, 2013

87% Match
Elena Agliari, Adriano Barra, ... , Marco Isopi
Disordered Systems and Neural Networks

We consider the multitasking associative network in the low-storage limit and we study its phase diagram with respect to the noise level $T$ and the degree $d$ of dilution in pattern entries. We find that the system is characterized by a rich variety of stable states, among which pure states, parallel retrieval states, hierarchically organized states and symmetric mixtures (remarkably, both even and odd), whose complexity increases as the number of patterns $P$ grows. The ana...


Neural Networks retrieving Boolean patterns in a sea of Gaussian ones

March 15, 2017

87% Match
Elena Agliari, Adriano Barra, ... , Daniele Tantari
Disordered Systems and Neural Networks
Mathematical Physics

Restricted Boltzmann Machines are key tools in Machine Learning and are described by the energy function of bipartite spin-glasses. From a statistical-mechanical perspective, they share the same Gibbs measure as Hopfield networks for associative memory. In this equivalence, weights in the former play the role of patterns in the latter. As Boltzmann machines usually require real-valued weights to be trained with gradient-descent-like methods, while Hopfield networks typically store binary pat...
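The weight/pattern dictionary mentioned here can be checked mechanically: for a bipartite energy $-\sum_{i\mu} W_{i\mu} s_i z_\mu$ with standard Gaussian hidden units $z_\mu$, integrating the hidden layer out leaves a Hopfield energy with couplings $J = WW^{T}$, so the columns of $W$ act as stored patterns. A sketch of this standard identity, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(3)

N, P = 30, 4
# RBM weight columns play the role of P stored patterns of N spins.
W = rng.choice([-1, 1], size=(N, P)) / np.sqrt(N)
J = W @ W.T   # effective Hopfield couplings after integrating out z

def hopfield_energy(s, J):
    """Hopfield energy E(s) = -(1/2) sum_ij J_ij s_i s_j."""
    return -0.5 * s @ J @ s

# The same energy written directly in terms of the pattern overlaps:
# -(1/2) sum_mu (W[:, mu] . s)^2, identical to the J-form above.
s = rng.choice([-1, 1], size=N)
direct = -0.5 * np.sum((s @ W) ** 2)
```

The paper's question of binary versus real-valued weights lives inside this dictionary: the `W` above is binary by construction, whereas a trained RBM would generally have real-valued columns.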


Parallel Learning by Multitasking Neural Networks

August 8, 2023

87% Match
Elena Agliari, Andrea Alessandrelli, ... , Federico Ricci-Tersenghi
Disordered Systems and Neural Networks
Biological Physics
Machine Learning

A modern challenge of Artificial Intelligence is learning multiple patterns at once (i.e. parallel learning). While this cannot be accomplished by standard Hebbian associative neural networks, in this paper we show how the Multitasking Hebbian Network (a variation on the theme of the Hopfield model, working on sparse data sets) is naturally able to perform this complex task. We focus on systems processing in parallel a finite (up to logarithmic growth in the size of the network) a...


On the number of limit cycles in diluted neural networks

January 1, 2020

87% Match
Sungmin Hwang, Enrico Lanza, Giorgio Parisi, Jacopo Rocchi, ... , Francesco Zamponi
Disordered Systems and Neural Networks

We consider the storage properties of temporal patterns, i.e. cycles of finite length, in neural networks represented by (generally asymmetric) spin glasses defined on random graphs. Inspired by the observation that dynamics on sparse systems have more basins of attraction than the dynamics of densely connected ones, we consider the attractors of a greedy dynamics in sparse topologies as a proxy for the stored memories. We enumerate them using numerical simulation...
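The enumeration of cyclic attractors described here can be illustrated with a minimal deterministic dynamics on a sparse asymmetric random network. This is only a sketch: the paper's greedy dynamics and graph ensemble may differ in detail, and the sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sparse, generally asymmetric random couplings: each directed link is
# present with probability mean_degree / N and carries a Gaussian weight.
N, mean_degree = 16, 3
mask = rng.random((N, N)) < mean_degree / N
J = np.where(mask, rng.standard_normal((N, N)), 0.0)
np.fill_diagonal(J, 0.0)

def cycle_length(s, J, max_steps=20000):
    """Iterate the parallel sign dynamics s(t+1) = sign(J s(t)) until a
    state repeats, and return the period of the limit cycle reached."""
    seen = {}
    for t in range(max_steps):
        key = s.tobytes()
        if key in seen:
            return t - seen[key]
        seen[key] = t
        s = np.where(J @ s >= 0, 1, -1).astype(np.int8)
    return None  # no repeat within max_steps (does not occur at this size)

# Period of the attractor reached from each of 20 random initial states.
lengths = [cycle_length(rng.choice(np.int8([-1, 1]), size=N), J)
           for _ in range(20)]
```

Fixed points show up as cycles of length 1; the asymmetry of $J$ is what allows longer periodic attractors, the "temporal patterns" of the abstract.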

Elena Agliari, Alessia Annibale, Adriano Barra, ... , Daniele Tantari
Disordered Systems and Neural Networks
Cell Behavior

Associative network models featuring multi-tasking properties have been introduced recently and studied in the low load regime, where the number $P$ of simultaneously retrievable patterns scales with the number $N$ of nodes as $P\sim \log N$. In addition to their relevance in artificial intelligence, these models are increasingly important in immunology, where stored patterns represent strategies to fight pathogens and nodes represent lymphocyte clones. They allow us to under...