ID: 2304.14964

The Exponential Capacity of Dense Associative Memories

April 28, 2023


Similar papers 3

The Retrieval Phase of the Hopfield Model: A Rigorous Analysis of the Overlap Distribution

July 25, 1995

88% Match
Anton Bovier (WIAS-Berlin), Véronique Gayrard (CPT-Marseille)
Condensed Matter

Standard large deviation estimates or the use of the Hubbard-Stratonovich transformation reduce the analysis of the distribution of the overlap parameters essentially to that of an explicitly known random function $\Phi_{N,\beta}$ on $\mathbb{R}^M$. In this article we present a rather careful study of the structure of the minima of this random function related to the retrieval of the stored patterns. We denote by $m^*(\beta)$ the modulus of the spontaneous magnetization in the Curie-Weiss ...
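The Curie-Weiss magnetization $m^*(\beta)$ referenced in this abstract is the positive solution of the mean-field fixed-point equation $m = \tanh(\beta m)$, which exists for $\beta > 1$. A minimal sketch of computing it by fixed-point iteration (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def m_star(beta, iters=200):
    """Positive fixed point of m = tanh(beta * m); returns ~0 for beta <= 1."""
    m = 1.0  # start from full magnetization so iteration finds the positive branch
    for _ in range(iters):
        m = np.tanh(beta * m)
    return m
```

For example, `m_star(2.0)` converges to roughly 0.9575, the standard Curie-Weiss value at $\beta = 2$, while for $\beta \le 1$ the iteration decays to zero.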


Associative networks with diluted patterns: dynamical analysis at low and medium load

May 10, 2014

88% Match
Silvia Bartolucci, Alessia Annibale
Disordered Systems and Neural Networks

In this work we solve the dynamics of pattern-diluted associative networks, evolving via sequential Glauber update. We derive dynamical equations for the order parameters that quantify the simultaneous pattern recall of the system, and analyse the nature and stability of the stationary solutions by means of linear stability analysis as well as Monte Carlo simulations. We investigate the parallel retrieval capabilities of the system in different regions of the phase space, in...
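Sequential Glauber dynamics, the update rule this abstract refers to, flips one spin at a time with a probability set by its local field and the inverse temperature. A minimal sketch on a standard Hopfield network with Hebbian couplings (parameter values are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P, beta = 200, 5, 4.0
xi = rng.choice([-1, 1], size=(P, N))  # P random stored patterns
J = (xi.T @ xi) / N                    # Hebbian couplings J_ij
np.fill_diagonal(J, 0.0)               # no self-interaction

s = xi[0].copy()
s[: N // 10] *= -1                     # corrupt 10% of pattern 0

for _ in range(20 * N):                # sequential Glauber updates
    i = rng.integers(N)
    h = J[i] @ s                       # local field on spin i
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    s[i] = 1 if rng.random() < p_up else -1

overlap = (xi[0] @ s) / N              # recall order parameter m
```

At low temperature and low load the overlap order parameter relaxes close to 1, i.e. the corrupted cue is driven back to the stored pattern.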


High-Capacity Quantum Associative Memories

May 30, 2015

87% Match
M. Cristina Diamantini, Carlo A. Trugenberger
Quantum Physics

We review our models of quantum associative memories that represent the "quantization" of fully coupled neural networks like the Hopfield model. The idea is to replace the classical irreversible attractor dynamics driven by an Ising model with pattern-dependent weights by the reversible rotation of an input quantum state onto an output quantum state consisting of a linear superposition with probability amplitudes peaked on the stored pattern closest to the input in Hamming di...


Hopfield model with planted patterns: a teacher-student self-supervised learning model

April 26, 2023

87% Match
Francesco Alemanno, Luca Camanzi, ... , Daniele Tantari
Disordered Systems and Neural Networks
Machine Learning
Mathematical Physics

While Hopfield networks are known as paradigmatic models for memory storage and retrieval, modern artificial intelligence systems rest mainly on the machine learning paradigm. We show that it is possible to formulate a teacher-student self-supervised learning problem with Boltzmann machines in terms of a suitable generalization of the Hopfield model with structured patterns, where the spin variables are the machine weights and patterns correspond to the training set's exampl...


Exploring the Temperature-Dependent Phase Transition in Modern Hopfield Networks

November 30, 2023

87% Match
Felix Koulischer, Cédric Goemaere, Tom van der Meersch, ... , Thomas Demeester
Machine Learning
Disordered Systems and Neural Networks

The recent discovery of a connection between Transformers and Modern Hopfield Networks (MHNs) has reignited the study of neural networks from a physical energy-based perspective. This paper focuses on the pivotal effect of the inverse temperature hyperparameter $\beta$ on the distribution of energy minima of the MHN. To achieve this, the distribution of energy minima is tracked in a simplified MHN in which equidistant normalised patterns are stored. This network demonstrates ...
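The update rule of a modern Hopfield network, whose energy landscape this paper studies as a function of $\beta$, retrieves a pattern via a softmax over similarities to the stored patterns: $\mathrm{new} = X^\top \mathrm{softmax}(\beta X \xi)$. A minimal sketch with normalised random patterns (dimensions and $\beta$ are illustrative, not values from the paper):

```python
import numpy as np

def mhn_update(X, state, beta):
    """One modern-Hopfield update: X^T softmax(beta * X @ state)."""
    a = beta * (X @ state)
    a -= a.max()                     # subtract max for numerical stability
    p = np.exp(a) / np.exp(a).sum()  # softmax attention over stored patterns
    return X.T @ p

rng = np.random.default_rng(1)
d, n = 64, 8
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # normalised stored patterns

query = X[0] + 0.1 * rng.normal(size=d)        # noisy cue for pattern 0
out = mhn_update(X, query, beta=50.0)
```

At large $\beta$ the softmax is nearly one-hot and a single update snaps the noisy query onto the closest stored pattern; as $\beta$ decreases, minima merge into metastable mixtures, which is the phase-transition behaviour the abstract describes.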


A Hopfield-like model with complementary encodings of memories

February 9, 2023

87% Match
Louis Kang, Taro Toyoizumi
Neurons and Cognition
Disordered Systems and Neural Networks

We present a Hopfield-like autoassociative network for memories representing examples of concepts. Each memory is encoded by two activity patterns with complementary properties. The first is dense and correlated across examples within concepts, and the second is sparse and exhibits no correlation among examples. The network stores each memory as a linear combination of its encodings. During retrieval, the network recovers sparse or dense patterns with a high or low activity t...


Bridging Associative Memory and Probabilistic Modeling

February 15, 2024

87% Match
Rylan Schaeffer, Nika Zahedi, Mikail Khona, Dhruv Pai, Sang Truong, Yilun Du, Mitchell Ostrow, Sarthak Chandra, Andres Carranza, Ila Rani Fiete, ... , Sanmi Koyejo
Machine Learning

Associative memory and probabilistic modeling are two fundamental topics in artificial intelligence. The first studies recurrent neural networks designed to denoise, complete and retrieve data, whereas the second studies learning and sampling from probability distributions. Based on the observation that associative memory's energy functions can be seen as probabilistic modeling's negative log likelihoods, we build a bridge between the two that enables useful flow of ideas in ...


Nonparametric Modern Hopfield Models

April 5, 2024

87% Match
Jerry Yao-Chieh Hu, Bo-Yu Chen, Dennis Wu, ... , Han Liu
Machine Learning
Artificial Intelligence
Machine Learning
Neural and Evolutionary Computing

We present a nonparametric construction for deep learning compatible modern Hopfield models and utilize this framework to debut an efficient variant. Our key contribution stems from interpreting the memory storage and retrieval processes in modern Hopfield models as a nonparametric regression problem subject to a set of query-memory pairs. Crucially, our framework not only recovers the known results from the original dense modern Hopfield model but also fills the void in the ...


Neural Distributed Autoassociative Memories: A Survey

September 4, 2017

87% Match
V. I. Gritsenko, D. A. Rachkovskij, A. A. Frolov, R. Gayler, ... , E. Osipov
Neural and Evolutionary Computing

Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors) where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear time search (in the number of stored items) for approximate nearest neighbors among vectors of high dimension. The purpose of this paper is to review models of autoassociative, distributed memory that can b...


Vector-Neuron Models of Associative Memory

December 24, 2004

87% Match
B. V. Kryzhanovsky, L. B. Litinskii, A. L. Mikaelian
Disordered Systems and Neural Networks

We consider two models of Hopfield-like associative memory with $q$-valued neurons: the Potts-glass neural network (PGNN) and the parametrical neural network (PNN). In these models neurons can be in more than two different states. The models have record storage capacity and noise immunity, significantly exceeding the Hopfield model. We present a uniform formalism allowing us to describe both the PNN and the PGNN. These networks' inherent mechanisms, responsible for ...
