Similar papers
July 25, 1995
Standard large deviation estimates or the use of the Hubbard-Stratonovich transformation reduce the analysis of the distribution of the overlap parameters essentially to that of an explicitly known random function $\Phi_{N,\beta}$ on $\mathbb{R}^M$. In this article we present a rather careful study of the structure of the minima of this random function related to the retrieval of the stored patterns. We denote by $m^*(\beta)$ the modulus of the spontaneous magnetization in the Curie-Weiss ...
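A standard fact, for context: $m^*(\beta)$ is the largest solution of the Curie-Weiss mean-field self-consistency equation $$m^*(\beta) = \tanh\bigl(\beta\, m^*(\beta)\bigr),$$ which is nonzero exactly when $\beta > 1$.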
May 10, 2014
In this work we solve the dynamics of pattern-diluted associative networks, evolving via sequential Glauber update. We derive dynamical equations for the order parameters that quantify the simultaneous pattern recall of the system, and analyse the nature and stability of the stationary solutions by means of linear stability analysis as well as Monte Carlo simulations. We investigate the parallel retrieval capabilities of the system in different regions of the phase space, in...
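A minimal sketch of sequential Glauber dynamics on a Hopfield-type network with Hebbian couplings (an illustration of the update rule only, not the authors' pattern-diluted model; all names and parameter values are assumptions):

    import numpy as np

    def glauber_sweep(s, J, beta, rng):
        # One sequential Glauber sweep: spins are updated one at a time,
        # each set to +1 with probability 1 / (1 + exp(-2*beta*h_i)).
        for i in rng.permutation(len(s)):
            h = J[i] @ s                      # local field on spin i (diag of J is zero)
            p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
            s[i] = 1 if rng.random() < p_up else -1
        return s

    rng = np.random.default_rng(0)
    N, P = 200, 5
    xi = rng.choice([-1, 1], size=(P, N))     # P random patterns
    J = (xi.T @ xi) / N                       # Hebbian couplings
    np.fill_diagonal(J, 0.0)

    s = xi[0].copy()
    s[: N // 4] *= -1                         # corrupt a quarter of the bits
    for _ in range(20):
        s = glauber_sweep(s, J, beta=4.0, rng=rng)
    print("overlap with pattern 0:", s @ xi[0] / N)

Each sweep updates spins one at a time, so later updates within a sweep already see earlier ones; this is what distinguishes sequential from parallel Glauber dynamics.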
May 30, 2015
We review our models of quantum associative memories that represent the "quantization" of fully coupled neural networks like the Hopfield model. The idea is to replace the classical irreversible attractor dynamics driven by an Ising model with pattern-dependent weights by the reversible rotation of an input quantum state onto an output quantum state consisting of a linear superposition with probability amplitudes peaked on the stored pattern closest to the input in Hamming di...
April 26, 2023
While Hopfield networks are known as paradigmatic models for memory storage and retrieval, modern artificial intelligence systems rest mainly on the machine learning paradigm. We show that it is possible to formulate a teacher-student self-supervised learning problem with Boltzmann machines in terms of a suitable generalization of the Hopfield model with structured patterns, where the spin variables are the machine weights and patterns correspond to the training set's exampl...
November 30, 2023
The recent discovery of a connection between Transformers and Modern Hopfield Networks (MHNs) has reignited the study of neural networks from a physical energy-based perspective. This paper focuses on the pivotal effect of the inverse temperature hyperparameter $\beta$ on the distribution of energy minima of the MHN. To achieve this, the distribution of energy minima is tracked in a simplified MHN in which equidistant normalised patterns are stored. This network demonstrates ...
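For background, a minimal sketch of the standard dense modern Hopfield retrieval step $\xi_{\mathrm{new}} = X\,\mathrm{softmax}(\beta X^{\top}\xi)$ of Ramsauer et al., which makes the role of $\beta$ visible (the equidistant-pattern construction above is not reproduced; sizes and names here are assumptions):

    import numpy as np

    def softmax(z):
        e = np.exp(z - z.max())               # numerically stabilised softmax
        return e / e.sum()

    def mhn_update(query, X, beta):
        # One dense-MHN retrieval step: X @ softmax(beta * X^T query),
        # where the columns of X are the stored (normalised) patterns.
        return X @ softmax(beta * (X.T @ query))

    rng = np.random.default_rng(1)
    d, M = 64, 10
    X = rng.standard_normal((d, M))
    X /= np.linalg.norm(X, axis=0)            # normalise each stored pattern

    query = X[:, 0] + 0.3 * rng.standard_normal(d)
    for beta in (0.5, 4.0, 32.0):
        out = mhn_update(query, X, beta)
        print(beta, np.argmax(X.T @ out))     # larger beta -> sharper retrieval

At small $\beta$ the update returns a near-uniform mixture of the stored patterns (a metastable average); at large $\beta$ the softmax sharpens and a single pattern is retrieved.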
February 9, 2023
We present a Hopfield-like autoassociative network for memories representing examples of concepts. Each memory is encoded by two activity patterns with complementary properties. The first is dense and correlated across examples within concepts, and the second is sparse and exhibits no correlation among examples. The network stores each memory as a linear combination of its encodings. During retrieval, the network recovers sparse or dense patterns with a high or low activity t...
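In symbols, with hypothetical mixing coefficients $a$ and $b$ (not given in the snippet above), "a linear combination of its encodings" would read $$\eta^{\mu} = a\,\xi^{\mu}_{\mathrm{dense}} + b\,\xi^{\mu}_{\mathrm{sparse}},$$ one stored trace $\eta^{\mu}$ per memory $\mu$.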
February 15, 2024
Associative memory and probabilistic modeling are two fundamental topics in artificial intelligence. The first studies recurrent neural networks designed to denoise, complete and retrieve data, whereas the second studies learning and sampling from probability distributions. Based on the observation that associative memory's energy functions can be seen as probabilistic modeling's negative log likelihoods, we build a bridge between the two that enables useful flow of ideas in ...
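The correspondence invoked here is the standard Boltzmann identification: if an energy function $E(x)$ is read as a negative log likelihood, then $$p(x) = \frac{e^{-E(x)}}{Z}, \qquad Z = \sum_{x} e^{-E(x)}, \qquad -\log p(x) = E(x) + \log Z,$$ so descending the energy is ascending the likelihood, up to the constant $\log Z$.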
April 5, 2024
We present a nonparametric construction for deep learning compatible modern Hopfield models and utilize this framework to debut an efficient variant. Our key contribution stems from interpreting the memory storage and retrieval processes in modern Hopfield models as a nonparametric regression problem subject to a set of query-memory pairs. Crucially, our framework not only recovers the known results from the original dense modern Hopfield model but also fills the void in the ...
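One standard way to read "retrieval as nonparametric regression over query-memory pairs" is Nadaraya-Watson estimation (a generic illustration; the paper's exact construction may differ): $$\hat{y}(q) = \sum_{\mu=1}^{M} \frac{k(q, x_{\mu})}{\sum_{\nu=1}^{M} k(q, x_{\nu})}\, y_{\mu},$$ and with the exponential kernel $k(q,x) = e^{\beta q^{\top} x}$ this is exactly the softmax update of the dense modern Hopfield model.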
September 4, 2017
Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors) where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear time search (in the number of stored items) for approximate nearest neighbors among vectors of high dimension. The purpose of this paper is to review models of autoassociative, distributed memory that can b...
December 24, 2004
We consider two models of Hopfield-like associative memory with $q$-valued neurons: the Potts-glass neural network (PGNN) and the parametrical neural network (PNN). In these models neurons can be in more than two different states. Both models have record storage capacity and noise immunity, and significantly exceed the Hopfield model. We present a uniform formalism that allows us to describe both the PNN and the PGNN. These networks' inherent mechanisms, responsible for ...
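For orientation, one common Potts-Hebbian energy for $q$-state neurons $\sigma_i$ storing patterns $\xi^{\mu}_i \in \{1,\dots,q\}$ (a standard convention; normalisations vary across papers, and the uniform formalism above may differ): $$H = -\frac{1}{2N}\sum_{i \neq j}\sum_{\mu=1}^{p} u(\sigma_i, \xi^{\mu}_i)\, u(\sigma_j, \xi^{\mu}_j), \qquad u(a,b) = q\,\delta_{a,b} - 1,$$ which reduces to the Hopfield Hamiltonian at $q = 2$ under the $\pm 1$ encoding.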