ID: 2102.03385

Global minimization via classical tunneling assisted by collective force field formation

February 5, 2021

Similar papers (page 3)

Network-theory based modeling of avalanche dynamics in percolative tunnelling networks

April 29, 2024

82% Match
Vivek Dey, Steffen Kampman, Rafael Gutierrez, ... , Pavan Nukala
Disordered Systems and Neural Networks

Brain-like self-assembled networks can infer and analyze information from unorganized, noisy signals with minimal power consumption. These networks are characterized by spatiotemporal avalanches and their crackling behavior, and physical models of these networks are expected to help predict and understand their computational capabilities. Here, we use a network-theory-based approach to provide a physical model for percolative tunnelling networks, found in the Ag-hBN system, consisting of nodes (...


Toward bio-inspired information processing with networks of nano-scale switching elements

November 25, 2013

82% Match
Zoran Konkoli, Göran Wendin
Emerging Technologies

Unconventional computing explores multi-scale platforms connecting molecular-scale devices into networks for the development of scalable neuromorphic architectures, often based on new materials and components with new functionalities. We review some work investigating the functionalities of locally connected networks of different types of switching elements as computational substrates. In particular, we discuss reservoir computing with networks of nonlinear nanoscale components ...
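Reservoir computing, mentioned in the abstract above, trains only a linear readout on top of a fixed nonlinear dynamical "reservoir". Below is a minimal echo state network sketch (a generic software reservoir, not the nanoscale networks the paper reviews); the reservoir size, scaling, and prediction task are illustrative choices of my own.

```python
# Minimal echo state network: fixed random recurrent reservoir, trained linear readout.
import numpy as np

rng = np.random.default_rng(3)
n_res, T = 200, 2000
u = np.sin(0.2 * np.arange(T + 1)) + 0.5 * np.sin(0.311 * np.arange(T + 1))

W_in = rng.uniform(-0.5, 0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # scale to spectral radius 0.9

x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])                 # nonlinear reservoir update
    states[t] = x

# ridge-regression readout trained to predict u[t+1] from the state at time t
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ u[1:T + 1])
pred = states @ W_out
print("readout RMSE:", np.sqrt(np.mean((pred - u[1:T + 1]) ** 2)))
```

Only `W_out` is learned; the reservoir weights stay fixed, which is the feature that makes physical substrates such as switching-element networks attractive for this scheme.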


Network Dynamics Governed by Lyapunov Functions: From Memory to Classification

April 17, 2020

82% Match
Merav Stern, Eric Shea-Brown
Neurons and Cognition
Disordered Systems and Neural Networks

In 1982 John Hopfield published a neural network model for memory retrieval, a model that became a cornerstone of theoretical neuroscience. A key ingredient of the Hopfield model was the use of network dynamics governed by a Lyapunov function. In a recent paper, Krotov and Hopfield showed how a Lyapunov function governs a biologically plausible learning rule for the neural networks' connectivity. By doing so, they bring an intriguing approach to classification tasks, ...
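For context, here is a minimal sketch of the classical Hopfield picture referenced above: asynchronous sign updates never increase the energy E(s) = -1/2 s^T W s, so the energy acts as a Lyapunov function. The Hebbian storage rule and the sizes are illustrative; this is not the Krotov-Hopfield learning rule discussed in the paper.

```python
# Classical Hopfield network: Hebbian storage, energy-decreasing asynchronous updates.
import numpy as np

rng = np.random.default_rng(0)

def store_patterns(patterns):
    """Hebbian weights for +/-1 patterns; zero diagonal keeps updates energy-decreasing."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    return -0.5 * s @ W @ s

def recall(W, s, sweeps=10):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):          # asynchronous updates
            s[i] = 1 if W[i] @ s >= 0 else -1      # each flip cannot raise the energy
    return s

patterns = rng.choice([-1, 1], size=(3, 100))      # three random memories
W = store_patterns(patterns)
noisy = patterns[0] * rng.choice([1, -1], size=100, p=[0.9, 0.1])   # corrupt ~10% of bits
print("energy before:", energy(W, noisy), "after:", energy(W, recall(W, noisy)))
```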


Physics-informed graph neural networks enhance scalability of variational nonequilibrium optimal control

April 12, 2022

82% Match
Jiawei Yan, Grant M. Rotskoff
Statistical Mechanics

When a physical system is driven away from equilibrium, the statistical distribution of its dynamical trajectories informs many of its physical properties. Characterizing the nature of the distribution of dynamical observables, such as a current or entropy production rate, has become a central problem in nonequilibrium statistical mechanics. Asymptotically, for a broad class of observables, the distribution of a given observable satisfies a large deviation principle when the ...
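As a concrete instance of the large-deviation statement above, for a time-additive current in a small Markov chain the scaled cumulant generating function (SCGF) is the logarithm of the largest eigenvalue of a tilted transition matrix. The biased three-state ring below is a toy example of my own, not the nonequilibrium control problems treated in the paper.

```python
# SCGF of the net clockwise current on a biased three-state ring via a tilted matrix.
import numpy as np

p, q = 0.5, 0.2                         # clockwise / counterclockwise jump probabilities
n = 3
P = np.zeros((n, n))
for i in range(n):
    P[i, (i + 1) % n] = p
    P[i, (i - 1) % n] = q
    P[i, i] = 1 - p - q

def scgf(s):
    """SCGF: log of the Perron eigenvalue of the tilted matrix P * exp(s * increment)."""
    tilt = np.ones((n, n))
    for i in range(n):
        tilt[i, (i + 1) % n] = np.exp(s)     # clockwise jumps count +1
        tilt[i, (i - 1) % n] = np.exp(-s)    # counterclockwise jumps count -1
    return np.log(np.max(np.real(np.linalg.eigvals(P * tilt))))

h = 1e-5
print("mean current from SCGF slope:", (scgf(h) - scgf(-h)) / (2 * h))  # ~ p - q = 0.3
```

The derivative of the SCGF at zero recovers the steady-state mean current; higher derivatives give its fluctuations, which is the kind of object the variational methods in the paper aim to compute at scale.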


An effective theory of collective deep learning

October 19, 2023

82% Match
Lluís Arola-Fernández, Lucas Lacasa
Physics and Society
Disordered Systems and Neural Networks
Artificial Intelligence
Machine Learning
Adaptation and Self-Organizing Systems

Unraveling the emergence of collective learning in systems of coupled artificial neural networks points to broader implications for machine learning, neuroscience, and society. Here we introduce a minimal model that condenses several recent decentralized algorithms by considering a competition between two terms: the local learning dynamics in the parameters of each neural network unit, and a diffusive coupling among units that tends to homogenize the parameters of the ensemble ...
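A minimal sketch of the competition described above, with illustrative choices of my own (quadratic local losses and a ring coupling graph, not the paper's model): each unit follows its local gradient while a diffusive Laplacian term pulls its parameters toward those of its neighbours.

```python
# Local gradient descent plus diffusive (Laplacian) coupling among units.
import numpy as np

rng = np.random.default_rng(1)
N, d = 8, 4                          # number of units and parameter dimension
targets = rng.normal(size=(N, d))    # each unit's local optimum
theta = rng.normal(size=(N, d))      # unit parameters

# ring-graph Laplacian for the diffusive coupling
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[i, (i - 1) % N] = 1.0
L = np.diag(A.sum(1)) - A

eta, D = 0.1, 0.5                    # learning rate and coupling strength
for _ in range(500):
    grad_local = theta - targets     # gradient of 0.5 * ||theta_i - target_i||^2
    theta -= eta * (grad_local + D * (L @ theta))

print("spread of parameters:", np.std(theta, axis=0))        # small => homogenized
print("mean parameters:", theta.mean(0), "mean target:", targets.mean(0))
```

At the fixed point the ensemble mean of the parameters matches the mean of the local targets, while the coupling strength D sets how strongly individual units are pulled away from their own optima.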


How synchronized human networks escape local minima

August 13, 2023

82% Match
Elad Schniderman, Yahav Avraham, Shir Shahal, Hamootal Duadi, ... , Moti Fridman
Physics and Society

Finding the global minimum in complex networks while avoiding local minima is challenging in many types of networks. We study the dynamics of complex human networks and observe that humans avoid local minima by different methods than other networks do. Humans can change the coupling strength between them or change their tempo. This leads to dynamics different from those of other networks and makes human networks more robust and more resilient against perturbations. We observed hi...


Efficiency of quantum versus classical annealing in non-convex learning problems

June 26, 2017

82% Match
Carlo Baldassi, Riccardo Zecchina
Disordered Systems and Neural Networks
Machine Learning

Quantum annealers aim to solve non-convex optimization problems by exploiting cooperative tunneling effects to escape local minima. The underlying idea is to design a classical energy function whose ground states are the sought optimal solutions of the original optimization problem and to add a controllable transverse quantum field that generates tunneling processes. A key challenge is to identify classes of non-convex optimization problems for which quantum annealing re...
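To make the construction concrete, here is a tiny exact-diagonalization sketch of the generic recipe described above (a random Ising instance of my own, far smaller than any realistic learning problem): the classical energy lives in the sigma-z couplings, a transverse field Gamma induces tunneling, and as Gamma goes to zero the ground energy of H(Gamma) approaches the classical ground-state energy.

```python
# Transverse-field Ising Hamiltonian H(Gamma) = H_classical - Gamma * sum_i sigma^x_i.
import numpy as np
from functools import reduce

sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

def op(single, site, n):
    """Embed a single-spin operator at `site` in an n-spin Hilbert space."""
    return reduce(np.kron, [single if k == site else I2 for k in range(n)])

n = 6
rng = np.random.default_rng(2)
J = np.triu(rng.normal(size=(n, n)), 1)              # random pairwise couplings

H_classical = sum(-J[i, j] * op(sz, i, n) @ op(sz, j, n)
                  for i in range(n) for j in range(i + 1, n))
H_transverse = sum(-op(sx, i, n) for i in range(n))

classical_ground = np.min(np.diag(H_classical))      # H_classical is diagonal
for gamma in (2.0, 1.0, 0.2, 0.0):
    E0 = np.linalg.eigvalsh(H_classical + gamma * H_transverse)[0]
    print(f"Gamma={gamma:3.1f}  ground energy={E0:+.3f}  classical={classical_ground:+.3f}")
```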


Phases of memristive circuits via an interacting disorder approach

August 31, 2020

82% Match
Francesco Caravelli, Forrest C. Sheldon
Statistical Mechanics
Disordered Systems and Neural Networks

We study the phase diagram of memristive circuit models in the replica-symmetric case using a novel Lyapunov function for the dynamics of these devices. Effectively, the model we propose is an Ising model with interacting quenched disorder, which we study to first order in a control parameter. Notwithstanding these limitations, we find a complex phase diagram and a glass-ferromagnetic transition in parameter space which generalizes earlier mean-field theory results fo...


Dynamical stability and chaos in artificial neural network trajectories along training

April 8, 2024

82% Match
Kaloyan Danovski, Miguel C. Soriano, Lucas Lacasa
Machine Learning
Disordered Systems and Neural Networks
Chaotic Dynamics
Data Analysis, Statistics and Probability

The process of training an artificial neural network involves iteratively adapting its parameters so as to minimize the error of the network's prediction when confronted with a learning task. This iterative change can be naturally interpreted as a trajectory in network space (a time series of networks), and thus the training algorithm (e.g. gradient descent optimization of a suitable loss function) can be interpreted as a dynamical system in graph space. In order to illustrate ...
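The "training as a dynamical system" viewpoint can be probed with a crude numerical experiment of my own (not the paper's analysis): run gradient descent from two initial weight vectors that differ by 1e-8 and watch how the separation between the two trajectories evolves, in the spirit of a finite-time Lyapunov estimate.

```python
# Track the divergence of two nearby gradient-descent trajectories.
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)

def gd_trajectory(w0, lr=0.05, steps=300):
    w, traj = w0.copy(), []
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)      # gradient of the mean squared error
        w -= lr * grad
        traj.append(w.copy())
    return np.array(traj)

w0 = rng.normal(size=5)
traj_a = gd_trajectory(w0)
traj_b = gd_trajectory(w0 + 1e-8 * rng.normal(size=5))   # perturbed initialization

sep = np.linalg.norm(traj_a - traj_b, axis=1)
# slope of the log-separation acts as a Lyapunov-like exponent of the training map
slope = np.polyfit(np.arange(len(sep)), np.log(sep), 1)[0]
print("mean log-separation slope per step:", slope)      # negative => contracting dynamics
```

For this convex toy problem the slope is negative (trajectories contract); for deep networks the paper studies when and where the training map behaves chaotically instead.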


Self-organization and solution of shortest-path optimization problems with memristive networks

April 5, 2013

82% Match
Yuriy V. Pershin, Massimiliano Di Ventra
Emerging Technologies
Disordered Systems and Neural Networks
Computational Physics

We show that memristive networks, namely networks of resistors with memory, can efficiently solve shortest-path optimization problems. Indeed, the presence of memory (time non-locality) promotes self-organization of the network into the shortest possible path(s). We introduce a network entropy function to characterize the self-organized evolution, show the solution of the shortest-path problem, and demonstrate the healing property of the solution path. Finally, we provide an alg...
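In the same spirit, though not the authors' memristive circuit model, here is a toy adaptive-conductance (Physarum-style) relaxation: edge conductances are reinforced in proportion to the current they carry while unused edges decay, so the network settles onto the shortest source-sink path of the small graph below.

```python
# Adaptive-conductance relaxation onto the shortest path of a small weighted graph.
import numpy as np

# edge list (u, v, length); the shortest 0 -> 3 path is 0-1-3 with total length 2
edges = [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 3.0), (1, 2, 1.0)]
n, source, sink = 4, 0, 3
D = np.ones(len(edges))                        # adaptive edge conductance variables

for _ in range(200):
    # Kirchhoff step: solve L p = b with unit current injected at the source
    L = np.zeros((n, n))
    for k, (u, v, w) in enumerate(edges):
        g = D[k] / w
        L[u, u] += g; L[v, v] += g
        L[u, v] -= g; L[v, u] -= g
    b = np.zeros(n)
    b[source] = 1.0
    L[sink, :] = 0.0; L[sink, sink] = 1.0      # ground the sink (p[sink] = 0)
    p = np.linalg.solve(L, b)
    Q = np.array([(D[k] / w) * (p[u] - p[v]) for k, (u, v, w) in enumerate(edges)])
    # reinforce edges carrying current, let the rest decay (small floor for stability)
    D = np.maximum(D + 0.2 * (np.abs(Q) - D), 1e-6)

for (u, v, w), d in zip(edges, D):
    print(f"edge {u}-{v} (length {w}): conductance {d:.3f}")
# the shortest-path edges 0-1 and 1-3 should retain conductance near 1, the others decay
```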
