ID: 2304.14964

The Exponential Capacity of Dense Associative Memories

April 28, 2023


Similar papers 2

Large Associative Memory Problem in Neurobiology and Machine Learning

August 16, 2020

88% Match
Dmitry Krotov, John Hopfield
Neurons and Cognition
Disordered Systems and Neural Networks
Computation and Language
Machine Learning

Dense Associative Memories or modern Hopfield networks permit storage and reliable retrieval of an exponentially large (in the dimension of feature space) number of memories. At the same time, their naive implementation is non-biological, since it seemingly requires the existence of many-body synaptic junctions between the neurons. We show that these models are effective descriptions of a more microscopic (written in terms of biological degrees of freedom) theory that has add...
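For orientation, the dense associative memory (modern Hopfield network) models referred to throughout this list are conventionally written with an energy of the form $E(\sigma) = -\sum_{\mu=1}^{K} F\!\left(\xi^{\mu}\cdot\sigma\right)$, where $\sigma\in\{-1,+1\}^{N}$ is the state of the $N$ neurons, $\xi^{1},\dots,\xi^{K}$ are the stored patterns, and $F$ is a steeply growing interaction function (a high-degree polynomial or an exponential). The many-body synaptic junctions mentioned above correspond to the higher-order terms obtained by expanding this energy in products of the $\sigma_i$; this is the standard form from the dense-associative-memory literature, given here only as context rather than as a quotation from the paper.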


Retrieval Properties of Hopfield and Correlated Attractors in an Associative Memory Model

March 15, 2004

88% Match
T. Uezu, A. Hirano, M. Okada
Disordered Systems and Neural Networks
Statistical Mechanics

We examine a previously introduced attractor neural network model that explains the persistent activities of neurons in the anterior ventral temporal cortex of the brain. In this model, the coexistence of several attractors including correlated attractors was reported in the cases of finite and infinite loading. In this paper, by means of a statistical mechanical method, we study the statics and dynamics of the model in both finite and extensive loading, mainly focusing on the...


Simplicial Hopfield networks

May 9, 2023

88% Match
Thomas F Burns, Tomoki Fukai
Neural and Evolutionary Computing
Artificial Intelligence
Neurons and Cognition

Hopfield networks are artificial neural networks which store memory patterns on the states of their neurons by choosing recurrent connection weights and update rules such that the energy landscape of the network forms attractors around the memories. How many stable, sufficiently-attracting memory patterns can we store in such a network using $N$ neurons? The answer depends on the choice of weights and update rule. Inspired by setwise connectivity in biology, we extend Hopfiel...
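As background for the capacity question posed above, the classical pairwise Hopfield network can be sketched in a few lines of NumPy. The snippet below (illustrative function names, not code from the paper) shows Hebbian storage and asynchronous sign-update retrieval, deliberately staying far below the roughly $0.14N$ pairwise capacity; the setwise (simplicial) connectivity the paper introduces is not implemented here.

    # Minimal sketch of a classical pairwise Hopfield network: Hebbian weights,
    # asynchronous sign updates. Illustrative only; the simplicial extension in
    # the paper replaces pairwise weights with setwise (higher-order) connections.
    import numpy as np

    def hebbian_weights(patterns):
        # patterns: (K, N) array of +/-1 memories; standard Hebbian outer-product rule
        K, N = patterns.shape
        W = patterns.T @ patterns / N
        np.fill_diagonal(W, 0.0)  # no self-connections
        return W

    def recall(W, state, n_sweeps=10):
        # repeated asynchronous passes over all neurons
        state = state.copy()
        N = len(state)
        for _ in range(n_sweeps):
            for i in np.random.permutation(N):
                state[i] = 1 if W[i] @ state >= 0 else -1
        return state

    rng = np.random.default_rng(0)
    N, K = 200, 10  # 10 memories in 200 neurons, well below pairwise capacity
    patterns = rng.choice([-1, 1], size=(K, N))
    W = hebbian_weights(patterns)
    noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])  # flip ~10% of bits
    print(np.mean(recall(W, noisy) == patterns[0]))  # fraction of bits recovered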


Retrieval Phase Diagrams of Non-monotonic Hopfield Networks

April 11, 1996

88% Match
Jun-ichi Inoue (Department of Physics, Tokyo Institute of Technology and RIKEN)
Disordered Systems and Neural Networks

We investigate the retrieval phase diagrams of an asynchronous fully-connected attractor network with non-monotonic transfer function by means of a mean-field approximation. We find for the noiseless zero-temperature case that this non-monotonic Hopfield network can store more patterns than a network with monotonic transfer function investigated by Amit et al. Properties of retrieval phase diagrams of non-monotonic networks agree with the results obtained by Nishimori and Opr...
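For context, a representative non-monotonic transfer function used in this line of work (it may differ in detail from the one analysed in the paper) is $f(h)=\operatorname{sgn}(h)$ for $|h|\le\theta$ and $f(h)=-\operatorname{sgn}(h)$ for $|h|>\theta$, where $h_i=\sum_j J_{ij}\sigma_j$ is the local field; the monotonic network of Amit et al. is recovered in the limit $\theta\to\infty$.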


Dense Hebbian neural networks: a replica symmetric picture of supervised learning

November 25, 2022

88% Match
Elena Agliari, Linda Albanese, Francesco Alemanno, Andrea Alessandrelli, Adriano Barra, Fosca Giannotti, ... , Dino Pedreschi
Disordered Systems and Neural Networks
Machine Learning

We consider dense associative neural networks trained by a teacher (i.e., with supervision) and investigate their computational capabilities analytically, via the statistical mechanics of spin glasses, and numerically, via Monte Carlo simulations. In particular, we obtain a phase diagram summarizing their performance as a function of control parameters such as the quality and quantity of the training dataset, network storage and noise, that is valid in the limit of large netw...


On a model of associative memory with huge storage capacity

February 7, 2017

88% Match
Mete Demircigil, Judith Heusel, Matthias Löwe, ... , Franck Vermet
Probability

In [7] Krotov and Hopfield suggest a generalized version of the well-known Hopfield model of associative memory. In their version they consider a polynomial interaction function and claim that this increases the storage capacity of the model. We prove this claim and take the "limit" as the degree of the polynomial becomes infinite, i.e. an exponential interaction function. With this interaction we prove that the model has an exponential storage capacity in the number of neurons, ...
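Concretely, the exponential interaction referred to here is usually written as the energy $E(\sigma)=-\sum_{\mu=1}^{M}\exp\!\left(\xi^{\mu}\cdot\sigma\right)$ over states $\sigma\in\{-1,1\}^{n}$, and the result of Demircigil et al. is that the number of stored patterns can be taken as large as $M=\exp(\alpha n)$ for a positive constant $\alpha$ (made explicit in the paper) while each pattern is still retrievable from a corrupted version, i.e. a storage capacity exponential in the number of neurons.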


Capacity of the Hebbian-Hopfield network associative memory

March 4, 2024

88% Match
Mihailo Stojnic
stat.ML
cond-mat.dis-nn
cs.IT
cs.LG
math.IT
math.PR

In [Hop82], Hopfield introduced a Hebbian learning rule based neural network model and suggested how it can efficiently operate as an associative memory. Studying random binary patterns, he also uncovered that, if a small fraction of errors is tolerated in the retrieval of the stored patterns, the capacity of the network (maximal number of memorized patterns, $m$) scales linearly with each pattern's size, $n$. Moreover, he famously predicted $\alpha_c=\lim_{n\rightarrow\...
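For reference, the Hebbian rule in question sets the couplings to $J_{ij}=\frac{1}{n}\sum_{\mu=1}^{m}\xi_i^{\mu}\xi_j^{\mu}$ (with $J_{ii}=0$) for $m$ binary patterns $\xi^{\mu}\in\{-1,1\}^{n}$, and the linear-scaling capacity $\alpha_c=\lim_{n\rightarrow\infty} m/n$ was later evaluated by the replica analysis of Amit, Gutfreund and Sompolinsky to be $\alpha_c\approx 0.138$, close to Hopfield's original numerical estimate.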


A Non-Binary Associative Memory with Exponential Pattern Retrieval Capacity and Iterative Learning: Extended Results

February 5, 2013

88% Match
Amir Hesam Salavati, K. Raj Kumar, Amin Shokrollahi
Neural and Evolutionary Computing

We consider the problem of neural association for a network of non-binary neurons. Here, the task is to first memorize a set of patterns using a network of neurons whose states assume values from a finite number of integer levels. Later, the same network should be able to recall previously memorized patterns from their noisy versions. Prior work in this area considers storing a finite number of purely random patterns, and has shown that the pattern retrieval capacities (maxim...


Beyond Scaling Laws: Understanding Transformer Performance with Associative Memory

May 14, 2024

88% Match
Xueyan Niu, Bo Bai, ... , Wei Han
Machine Learning

Increasing the size of a Transformer model does not always lead to enhanced performance. This phenomenon cannot be explained by the empirical scaling laws. Furthermore, improved generalization ability occurs as the model memorizes the training samples. We present a theoretical framework that sheds light on the memorization process and performance dynamics of transformer-based language models. We model the behavior of Transformers with associative memories using Hopfield netwo...
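The link between Transformers and associative memory invoked here is usually made through the modern continuous Hopfield update, whose one-step retrieval rule coincides with softmax attention over the stored patterns. The NumPy sketch below (illustrative names, not the paper's own code) shows that update.

    # One retrieval step of a modern (continuous) Hopfield network: the update
    # xi <- X^T softmax(beta * X xi) is the same operation as softmax attention
    # over the stored patterns X. Illustrative sketch, not code from the paper.
    import numpy as np

    def softmax(z):
        # numerically stable softmax
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def hopfield_retrieve(X, query, beta=8.0, steps=3):
        # X: (K, d) stored patterns; query: (d,) noisy state to clean up
        xi = query.copy()
        for _ in range(steps):
            xi = X.T @ softmax(beta * (X @ xi))  # attention-style weighted read-out
        return xi

    rng = np.random.default_rng(1)
    X = rng.normal(size=(16, 64))             # 16 stored patterns in 64 dimensions
    query = X[3] + 0.3 * rng.normal(size=64)  # noisy version of pattern 3
    out = hopfield_retrieve(X, query)
    print(int(np.argmax(X @ out)))            # expected to recover index 3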


On Computational Limits of Modern Hopfield Models: A Fine-Grained Complexity Analysis

February 7, 2024

88% Match
Jerry Yao-Chieh Hu, Thomas Lin, ... , Han Liu
Machine Learning
Artificial Intelligence
Machine Learning

We investigate the computational limits of the memory retrieval dynamics of modern Hopfield models using fine-grained complexity analysis. Our key contribution is the characterization of a phase transition in the efficiency of all possible modern Hopfield models based on the norm of patterns. Specifically, we establish an upper bound criterion for the norm of input query patterns and memory patterns. Only below this criterion, sub-quadratic (efficient) variants of...
