ID: 2304.14964

The Exponential Capacity of Dense Associative Memories

April 28, 2023

Carlo Lucibello, Marc Mézard
Condensed Matter
Computer Science
Mathematics
Disordered Systems and Neural Networks
Information Theory

Recent generalizations of the Hopfield model of associative memory are able to store a number $P$ of random patterns that grows exponentially with the number $N$ of neurons, $P=\exp(\alpha N)$. Besides the huge storage capacity, another interesting feature of these networks is their connection to the attention mechanism at the core of the Transformer architectures widely applied in deep learning. In this work, we study a generic family of pattern ensembles using a statistical mechanics analysis which gives exact asymptotic thresholds $\alpha_1$ for the retrieval of a typical pattern, lower bounds on the maximum load $\alpha_c$ for which all patterns can be retrieved, as well as the sizes of the attraction basins. We discuss in detail the cases of Gaussian and spherical patterns, and show that they display rich and qualitatively different phase diagrams.
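
The retrieval rule behind this exponential scaling can be illustrated with the softmax (attention-like) update used in dense associative memories. The following is a minimal sketch, not the paper's exact analysis; the load `alpha`, inverse temperature `beta`, and sizes are illustrative choices.

```python
# Minimal sketch (not the paper's analysis): softmax retrieval dynamics for a
# dense associative memory storing P = exp(alpha * N) Gaussian patterns.
import numpy as np

rng = np.random.default_rng(0)
N = 64                        # number of neurons
alpha = 0.08                  # load: P = exp(alpha * N) patterns
P = int(np.exp(alpha * N))    # ~167 patterns for N = 64
beta = 2.0                    # inverse temperature of the softmax

Xi = rng.standard_normal((P, N))   # Gaussian patterns, one per row

def retrieve(x, steps=5):
    """Attention-like update: x <- Xi^T softmax(beta * Xi x)."""
    for _ in range(steps):
        logits = beta * (Xi @ x)
        logits -= logits.max()          # numerical stability
        w = np.exp(logits)
        x = Xi.T @ (w / w.sum())        # convex combination of stored patterns
    return x

probe = Xi[0] + 0.5 * rng.standard_normal(N)   # corrupted version of pattern 0
out = retrieve(probe)
print("normalized overlap with pattern 0:", (Xi[0] @ out) / N)   # ~1 on success
```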

Similar papers 1

Interpolating between boolean and extremely high noisy patterns through Minimal Dense Associative Memories

December 2, 2019

89% Match
Francesco Alemanno, Martino Centonze, Alberto Fachechi
Disordered Systems and Neural Networks
Machine Learning

Recently, Hopfield and Krotov introduced the concept of {\em dense associative memories} [DAM] (close to spin glasses with $P$-wise interactions in disordered statistical mechanics jargon): they proved a number of remarkable features these networks share and suggested their use to (partially) explain the success of the new generation of Artificial Intelligence. Thanks to a remarkable ante-litteram analysis by Baldi \& Venkatesh, among these properties, it is known these ne...
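
As a concrete illustration of the $P$-wise interactions mentioned above, here is a minimal sketch of a dense associative memory with the polynomial energy $E(\sigma) = -\sum_\mu (\xi^\mu \cdot \sigma)^n$ (the Krotov-Hopfield choice); the parameters and the simple greedy dynamics are illustrative.

```python
# Sketch of a dense associative memory with n-wise interactions; n = 2
# recovers the classic Hopfield model. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N, P, n = 100, 200, 3
Xi = rng.choice([-1, 1], size=(P, N))       # binary patterns

def energy(sigma):
    return -np.sum((Xi @ sigma).astype(float) ** n)

def sweep(sigma):
    """One pass of zero-temperature asynchronous dynamics."""
    for i in rng.permutation(N):
        trial = sigma.copy()
        trial[i] = -sigma[i]
        if energy(trial) < energy(sigma):   # flip only if it lowers the energy
            sigma = trial
    return sigma

probe = Xi[0].copy()
probe[rng.choice(N, size=10, replace=False)] *= -1   # corrupt 10% of the bits
rec = sweep(probe)
print("bits matching pattern 0:", int(np.sum(rec == Xi[0])), "/", N)
```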

Storage and Learning phase transitions in the Random-Features Hopfield Model

March 29, 2023

89% Match
Matteo Negri, Clarissa Lauditi, Gabriele Perugini, ... , Enrico Malatesta
Disordered Systems and Neural Networks
Machine Learning

The Hopfield model is a paradigmatic model of neural networks that has been analyzed for many decades in the statistical physics, neuroscience, and machine learning communities. Inspired by the manifold hypothesis in machine learning, we propose and investigate a generalization of the standard setting that we name Random-Features Hopfield Model. Here $P$ binary patterns of length $N$ are generated by applying to Gaussian vectors sampled in a latent space of dimension $D$ a ra...
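
The truncated description above suggests patterns obtained by pushing low-dimensional Gaussian latent vectors through a random projection and a non-linearity. A minimal sketch of that construction, assuming a sign non-linearity (an illustrative choice, not necessarily the paper's exact setup):

```python
# Sketch of random-feature pattern generation: binary patterns living on a
# D-dimensional manifold inside N-dimensional space. The sign non-linearity
# is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(2)
N, D, P = 100, 20, 50

F = rng.standard_normal((N, D))    # random projection (feature matrix)
Z = rng.standard_normal((D, P))    # Gaussian latent vectors, one per column
Xi = np.sign(F @ Z).T              # P binary patterns of length N

# Hebbian couplings built from the structured patterns
J = (Xi.T @ Xi) / N
np.fill_diagonal(J, 0.0)
```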

Effect of spatial correlations on Hopfield Neural Network and Dense Associative Memories

July 11, 2022

89% Match
Giordano De Marzo, Giulio Iannelli
Statistical Mechanics
Disordered Systems and Neural Networks

The Hopfield model is one of the few neural networks for which analytical results can be obtained. However, most of them are derived under the assumption of random uncorrelated patterns, while in real-life applications the data to be stored show non-trivial correlations. In the present paper we study how the retrieval capability of the Hopfield network at zero temperature is affected by spatial correlations in the data we feed to it. In particular, we use as patterns to be stored the...
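
One simple way to endow binary patterns with spatial correlations is a one-dimensional Markov chain with flip probability `q`; the sketch below uses that ensemble purely for illustration (the paper's exact pattern ensemble is cut off above) and runs zero-temperature retrieval on it.

```python
# Illustrative sketch: Hebbian storage of spatially correlated binary patterns
# (Markov-chain ensemble, an assumption) and zero-temperature retrieval.
import numpy as np

rng = np.random.default_rng(3)
N, P, q = 400, 10, 0.2          # q < 0.5 gives positive neighbour correlations

def correlated_pattern():
    s = np.empty(N, dtype=int)
    s[0] = rng.choice([-1, 1])
    for i in range(1, N):
        s[i] = s[i - 1] if rng.random() > q else -s[i - 1]
    return s

Xi = np.array([correlated_pattern() for _ in range(P)])
J = (Xi.T @ Xi) / N             # Hebbian couplings
np.fill_diagonal(J, 0.0)

x = Xi[0] * np.where(rng.random(N) < 0.1, -1, 1)   # 10% corrupted probe
for _ in range(20):
    x = np.sign(J @ x)          # zero-temperature parallel dynamics
print("overlap with pattern 0:", (x @ Xi[0]) / N)
```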

Uniform Memory Retrieval with Larger Capacity for Modern Hopfield Models

April 5, 2024

89% Match
Dennis Wu, Jerry Yao-Chieh Hu, ... , Han Liu
Machine Learning
Artificial Intelligence

We propose a two-stage memory retrieval dynamics for modern Hopfield models, termed $\mathtt{U\text{-}Hop}$, with enhanced memory capacity. Our key contribution is a learnable feature map $\Phi$ which transforms the Hopfield energy function into a kernel space. This transformation ensures convergence between the local minima of energy and the fixed points of retrieval dynamics within the kernel space. Consequently, the kernel norm induced by $\Phi$ serves as a novel similarit...
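
To make the role of the feature map concrete, the sketch below replaces the usual dot-product similarity with a kernel similarity $\langle\Phi(x), \Phi(\xi^\mu)\rangle$. It uses fixed random ReLU features as a stand-in for the learnable map $\Phi$; it is not the paper's two-stage $\mathtt{U\text{-}Hop}$ procedure.

```python
# Sketch: retrieval with a kernelized similarity. The random ReLU feature map
# below is a placeholder for a learned Phi; it is not the U-Hop algorithm.
import numpy as np

rng = np.random.default_rng(4)
N, P, M, beta = 32, 64, 128, 4.0      # M = dimension of the kernel space

Xi = rng.standard_normal((P, N))      # stored memories, one per row
W = rng.standard_normal((M, N)) / np.sqrt(N)

def phi(x):
    return np.maximum(W @ x, 0.0)     # feature map into the kernel space

PhiXi = np.maximum(Xi @ W.T, 0.0)     # memories mapped once and cached

def retrieve(x, steps=3):
    for _ in range(steps):
        logits = beta * (PhiXi @ phi(x))   # kernel similarity to each memory
        logits -= logits.max()
        w = np.exp(logits)
        x = Xi.T @ (w / w.sum())           # update in the original space
    return x
```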

Enhanced storage capacity with errors in scale-free Hopfield neural networks: an analytical study

August 3, 2016

89% Match
Do-Hyun Kim, Jinha Park, B. Kahng
Disordered Systems and Neural Networks

The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of $O(N)$, where $N$ is the system size. Beyond the threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural networks has been further developed toward realistic neural networks using analog neur...
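
For reference, the behaviour described above is easy to reproduce numerically: with Hebbian couplings and a load below the classical critical value $\alpha_c \approx 0.138$, a corrupted pattern is restored almost perfectly. A minimal sketch with illustrative parameters:

```python
# Sketch: classic Hopfield retrieval below the O(N) storage capacity.
import numpy as np

rng = np.random.default_rng(5)
N = 500
P = int(0.05 * N)                    # load alpha = 0.05, below alpha_c ~ 0.138

Xi = rng.choice([-1, 1], size=(P, N))
J = (Xi.T @ Xi) / N                  # Hebbian learning rule
np.fill_diagonal(J, 0.0)

x = Xi[0] * np.where(rng.random(N) < 0.05, -1, 1)   # 5% corrupted probe
for _ in range(10):
    x = np.sign(J @ x)               # zero-temperature parallel dynamics
print("retrieval overlap:", (x @ Xi[0]) / N)        # close to 1 below capacity
```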

Dense Hopfield Networks in the Teacher-Student Setting

January 8, 2024

89% Match
Robin Thériault, Daniele Tantari
Disordered Systems and Neural Networks
Machine Learning
Mathematical Physics

Dense Hopfield networks are known for their feature-to-prototype transition and adversarial robustness. However, previous theoretical studies have mostly been concerned with their storage capacity. We bridge this gap by studying the phase diagram of p-body Hopfield networks in the teacher-student setting of an unsupervised learning problem, uncovering ferromagnetic phases reminiscent of the prototype and feature learning regimes. On the Nishimori line, we find the critical si...

Hopfield Networks is All You Need

July 16, 2020

89% Match
Hubert Ramsauer, Bernhard Schäfl, Johannes Lehner, Philipp Seidl, Michael Widrich, Thomas Adler, Lukas Gruber, Markus Holzleitner, Milena Pavlović, Geir Kjetil Sandve, Victor Greiff, David Kreil, Michael Kopp, Günter Klambauer, ... , Hochreiter Sepp
Neural and Evolutionary Computing
Computation and Language
Machine Learning

We introduce a modern Hopfield network with continuous states and a corresponding update rule. The new Hopfield network can store exponentially (with the dimension of the associative space) many patterns, retrieves the pattern with one update, and has exponentially small retrieval errors. It has three types of energy minima (fixed points of the update): (1) global fixed point averaging over all patterns, (2) metastable states averaging over a subset of patterns, and (3) fixed...
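
The energy and update rule in question have a compact closed form: $E(\xi) = -\mathrm{lse}(\beta, X^T\xi) + \frac{1}{2}\xi^T\xi$ (up to constants), with update $\xi^{\mathrm{new}} = X\,\mathrm{softmax}(\beta X^T\xi)$, which is exactly the attention operation. A minimal numerical sketch with illustrative sizes:

```python
# Sketch of the continuous modern Hopfield network: log-sum-exp energy and
# the one-step softmax update (the attention operation). Sizes illustrative.
import numpy as np

rng = np.random.default_rng(6)
d, P, beta = 32, 16, 4.0
X = rng.standard_normal((d, P))       # continuous stored patterns as columns

def energy(xi):
    a = beta * (X.T @ xi)
    lse = (a.max() + np.log(np.exp(a - a.max()).sum())) / beta
    return -lse + 0.5 * xi @ xi       # constants omitted

def update(xi):
    a = beta * (X.T @ xi)
    w = np.exp(a - a.max())
    return X @ (w / w.sum())          # xi_new = X softmax(beta X^T xi)

xi0 = X[:, 0] + 0.1 * rng.standard_normal(d)   # slightly corrupted pattern 0
xi1 = update(xi0)                              # a single update
cos = (xi1 @ X[:, 0]) / (np.linalg.norm(xi1) * np.linalg.norm(X[:, 0]))
print("energy decreased:", energy(xi1) <= energy(xi0))
print("cosine to pattern 0 after one update:", cos)
```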

Optimal storage capacity of quantum Hopfield neural networks

October 14, 2022

89% Match
Lukas Bödeker, Eliana Fiorelli, Markus Müller
Disordered Systems and Neural Networks

Quantum neural networks form one pillar of the emergent field of quantum machine learning. Here, quantum generalisations of classical networks realizing associative memories - capable of retrieving patterns, or memories, from corrupted initial states - have been proposed. It is a challenging open problem to analyze quantum associative memories with an extensive number of patterns, and to determine the maximal number of patterns the quantum networks can reliably store, i.e. th...

Phase Diagram and Storage Capacity of Sequence Processing Neural Networks

May 6, 1998

88% Match
A. During, A. C. C. Coolen, D. Sherrington
Disordered Systems and Neural Networks

We solve the dynamics of Hopfield-type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to a violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence overlap and the correlation and response functions, in the thermodynamic limit. We calculate the ti...
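
The asymmetric interaction matrix for sequence storage is the cyclic Hebbian rule $J_{ij} = \frac{1}{N}\sum_\mu \xi_i^{\mu+1}\xi_j^{\mu}$, under which parallel dynamics steps through the stored sequence. A minimal sketch, with the load chosen well below saturation for a clean demonstration:

```python
# Sketch: sequence storage with asymmetric Hebbian couplings; parallel
# dynamics walks through the stored sequence xi^1 -> xi^2 -> ...
import numpy as np

rng = np.random.default_rng(7)
N, P = 400, 8                                 # load far below saturation
Xi = rng.choice([-1, 1], size=(P, N))         # the stored sequence

# J_ij = (1/N) sum_mu xi^{mu+1}_i xi^{mu}_j  (cyclic in mu)
J = (np.roll(Xi, -1, axis=0).T @ Xi) / N

x = Xi[0].copy()
for t in range(1, 5):
    x = np.sign(J @ x)                        # one parallel update per step
    print(f"step {t}: overlap with xi^{t % P} =", (x @ Xi[t % P]) / N)
```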

Robust exponential memory in Hopfield networks

November 17, 2014

88% Match
Christopher Hillar, Ngoc M. Tran
Adaptation and Self-Organizing Systems
Mathematical Physics
Neurons and Cognition

The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems and store memories as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant me...
