Similar papers
April 29, 2024
Brain-like self-assembled networks can infer and analyze information from unorganized, noisy signals with minimal power consumption. These networks are characterized by spatiotemporal avalanches and their crackling behavior, and physical models of these networks are expected to predict and explain their computational capabilities. Here, we use a network theory-based approach to provide a physical model for percolative tunnelling networks, found in the Ag-hBN system, consisting of nodes (...
November 25, 2013
Unconventional computing explores multi-scale platforms connecting molecular-scale devices into networks for the development of scalable neuromorphic architectures, often based on new materials and components with new functionalities. We review some work investigating the functionalities of locally connected networks of different types of switching elements as computational substrates. In particular, we discuss reservoir computing with networks of nonlinear nanoscale componen...
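As a rough illustration of the reservoir-computing approach reviewed here, the following is a minimal echo state network sketch in Python. The random recurrent reservoir stands in for the physical network of nonlinear switching elements; all sizes, the spectral radius, and the toy task are illustrative assumptions, not values from the reviewed work.

```python
import numpy as np

# Minimal echo state network: a fixed random "reservoir" of nonlinear
# units plays the role of the physical network; only the linear
# readout is trained (here by ridge regression).
rng = np.random.default_rng(0)
N_RES, N_IN = 200, 1                      # reservoir and input sizes (illustrative)
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.normal(0, 1, (N_RES, N_RES))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def run_reservoir(u_seq):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(N_RES)
    states = []
    for u in u_seq:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the next value of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N_RES), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

Only the readout weights are trained, which is what makes self-assembled physical substrates attractive as reservoirs: the recurrent part can remain an uncontrolled dynamical system.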
April 17, 2020
In 1982 John Hopfield published a neural network model for memory retrieval, a model that became a cornerstone in theoretical neuroscience. A key ingredient of the Hopfield model was the use of a network dynamics that is governed by a Lyapunov function. In a recent paper, Krotov and Hopfield showed how a Lyapunov function governs a biologically plausible learning rule for the neural networks' connectivity. By doing so, they bring an intriguing approach to classification tasks, ...
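For reference, a minimal Python sketch of the classical 1982 model (not the Krotov-Hopfield variant the paper builds on), showing the Lyapunov property mentioned above: under asynchronous updates the energy E(s) = -s^T W s / 2 never increases, so the dynamics settle into stored patterns.

```python
import numpy as np

# Classical Hopfield network: Hebbian storage, asynchronous updates.
rng = np.random.default_rng(1)
N, P = 100, 5
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)                  # no self-coupling

def energy(s):
    return -0.5 * s @ W @ s               # Lyapunov function of the dynamics

def recall(s, sweeps=10):
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):      # asynchronous single-spin updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern, then retrieve it.
probe = patterns[0].copy()
probe[rng.choice(N, 15, replace=False)] *= -1
out = recall(probe)
print("energy before/after:", energy(probe), energy(out))
print("overlap with stored pattern:", out @ patterns[0] / N)
```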
April 12, 2022
When a physical system is driven away from equilibrium, the statistical distribution of its dynamical trajectories informs many of its physical properties. Characterizing the nature of the distribution of dynamical observables, such as a current or entropy production rate, has become a central problem in nonequilibrium statistical mechanics. Asymptotically, for a broad class of observables, the distribution of a given observable satisfies a large deviation principle when the ...
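For concreteness, the large deviation principle invoked here can be stated in standard form (generic definitions, not results specific to this paper). For a time-integrated observable $A_T = \int_0^T a(x_t)\,dt$, such as a current or the entropy production,

$$ P\!\left(\frac{A_T}{T} \approx a\right) \asymp e^{-T I(a)}, \qquad T \to \infty, $$

and when the rate function $I(a)$ is convex it is the Legendre-Fenchel transform of the scaled cumulant generating function:

$$ \lambda(k) = \lim_{T\to\infty} \frac{1}{T} \ln \mathbb{E}\!\left[e^{k A_T}\right], \qquad I(a) = \sup_k \left\{ k a - \lambda(k) \right\}. $$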
October 19, 2023
Unraveling the emergence of collective learning in systems of coupled artificial neural networks points to broader implications for machine learning, neuroscience, and society. Here we introduce a minimal model that condenses several recent decentralized algorithms by considering a competition between two terms: the local learning dynamics in the parameters of each neural network unit, and a diffusive coupling among units that tends to homogenize the parameters of the ensembl...
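A minimal sketch of the described competition, assuming gradient-descent local dynamics, quadratic per-unit losses, and linear diffusive coupling on a ring; all of these specifics are illustrative assumptions, not the paper's model.

```python
import numpy as np

# Competition between local learning and diffusion:
#   dtheta_i/dt = -grad L_i(theta_i) + D * sum_{j ~ i} (theta_j - theta_i)
# Each unit descends its own loss while diffusion homogenizes parameters.
rng = np.random.default_rng(2)
n_units, dim = 20, 3
targets = rng.normal(0, 1, (n_units, dim))    # each unit's local optimum
theta = rng.normal(0, 3, (n_units, dim))
D, dt = 0.5, 0.01

for step in range(5000):
    local_grad = theta - targets              # grad of 0.5 * ||theta - target||^2
    laplacian = np.roll(theta, 1, 0) + np.roll(theta, -1, 0) - 2 * theta  # ring
    theta += dt * (-local_grad + D * laplacian)

print("spread around ensemble mean:",
      np.linalg.norm(theta - theta.mean(0), axis=1).mean())
```

Sweeping D from zero to large values interpolates between fully independent learners and a fully homogenized ensemble, which is the competition the abstract describes.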
August 13, 2023
Finding the global minimum while avoiding local minima is challenging in many types of networks. We study the dynamics of complex human networks and observe that humans use different methods to avoid local minima than other networks do. Humans can change the coupling strength between them or change their tempo. This leads to dynamics different from those of other networks and makes human networks more robust and more resilient against perturbations. We observed hi...
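The mechanism described, escaping a poorly synchronized state by changing coupling strength or tempo, can be caricatured with a Kuramoto-type toy model; this generic sketch is an assumption for illustration, not the paper's experimental setup.

```python
import numpy as np

# Kuramoto model: phases theta_i with natural frequencies omega_i (the
# players' tempi) coupled with strength K. Increasing K, as human
# players can do, pulls the ensemble out of unsynchronized states.
rng = np.random.default_rng(3)
n = 10
omega = rng.normal(0, 0.2, n)

def order_parameter(K, steps=5000, dt=0.01):
    theta = rng.uniform(0, 2 * np.pi, n)
    for _ in range(steps):
        coupling = np.sin(theta[None, :] - theta[:, None]).mean(axis=1)
        theta += dt * (omega + K * coupling)
    return abs(np.exp(1j * theta).mean())     # r = 1 means full synchrony

for K in (0.1, 0.5, 2.0):
    print(f"K = {K}: r = {order_parameter(K):.2f}")
```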
June 26, 2017
Quantum annealers aim at solving non-convex optimization problems by exploiting cooperative tunneling effects to escape local minima. The underlying idea consists in designing a classical energy function whose ground states are the sought optimal solutions of the original optimization problem and adding a controllable quantum transverse field to generate tunneling processes. A key challenge is to identify classes of non-convex optimization problems for which quantum annealing re...
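In standard notation (generic to quantum annealing, not specific to this paper), the construction reads

$$ H(t) = -\Gamma(t) \sum_{i} \sigma_i^x + \Lambda(t)\, H_{\mathrm{cl}}\!\left(\sigma_1^z, \dots, \sigma_N^z\right), $$

where $H_{\mathrm{cl}}$ encodes the optimization problem through its ground states and $\Gamma(t)$ is the controllable transverse field generating the tunneling processes; the anneal interpolates from a transverse-field-dominated start ($\Gamma \gg \Lambda$) to the purely classical problem ($\Gamma \to 0$).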
August 31, 2020
We study the phase diagram of memristive circuit models in the replica-symmetric case using a novel Lyapunov function for the dynamics of these devices. Effectively, the model we propose is an Ising model with interacting quenched disorder, which we study to first order in a control parameter. Notwithstanding these limitations, we find a complex phase diagram and a glass-ferromagnetic transition in the parameter space which generalizes earlier mean-field theory results fo...
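Schematically, and only as a generic reminder of the model class named here (the paper's Lyapunov function and couplings are more structured than this), an Ising model with quenched disorder takes the form

$$ H(s) = -\sum_{i<j} J_{ij}\, s_i s_j - h \sum_i s_i, \qquad s_i \in \{-1, +1\}, $$

with the couplings $J_{ij}$ drawn once from a random distribution and then held fixed (quenched). The replica-symmetric analysis computes the disorder-averaged free energy from $\overline{\ln Z} = \lim_{n \to 0} (\overline{Z^n} - 1)/n$, and glass and ferromagnetic phases are separated as the mean of the $J_{ij}$ is tuned against their variance.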
April 8, 2024
The process of training an artificial neural network involves iteratively adapting its parameters so as to minimize the error of the network's prediction when confronted with a learning task. This iterative change can be naturally interpreted as a trajectory in network space -- a time series of networks -- and thus the training algorithm (e.g. gradient descent optimization of a suitable loss function) can be interpreted as a dynamical system in graph space. In order to illus...
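A minimal sketch of this interpretation, recording the weight matrix of a tiny linear network at every gradient step so that training becomes an explicit time series of weighted graphs; the architecture and task are illustrative assumptions.

```python
import numpy as np

# Training as a trajectory in network space: each gradient step yields
# a new weight matrix, i.e. a new (weighted) graph.
rng = np.random.default_rng(4)
X = rng.normal(0, 1, (64, 5))
Y = X @ rng.normal(0, 1, (5, 2))            # targets from a hidden linear map

W = rng.normal(0, 1, (5, 2))
trajectory = []
for step in range(200):
    grad = X.T @ (X @ W - Y) / len(X)       # gradient of the mean squared error
    W -= 0.05 * grad
    trajectory.append(W.copy())             # one "network" per time step

# The trajectory is a path in graph space; e.g. measure successive step sizes.
steps = [np.linalg.norm(b - a) for a, b in zip(trajectory, trajectory[1:])]
print("first/last step size:", steps[0], steps[-1])
```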
April 5, 2013
We show that memristive networks, namely networks of resistors with memory, can efficiently solve shortest-path optimization problems. Indeed, the presence of memory (time non-locality) promotes self-organization of the network into the shortest possible path(s). We introduce a network entropy function to characterize the self-organized evolution, show the solution of the shortest-path problem, and demonstrate the healing property of the solution path. Finally, we provide an alg...
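The self-organization mechanism can be caricatured by a current-reinforcement sketch: solve Kirchhoff's laws on a resistor network, then let conductance grow on edges that carry current and decay elsewhere, so that conduction concentrates on the shortest source-drain path. This toy model only shares the spirit of the paper's memristive dynamics; the update rule and parameters below are assumptions.

```python
import numpy as np

# Current-reinforced resistor network converging to the shortest path.
edges = [(0, 1), (1, 2), (2, 5), (0, 3), (3, 4), (4, 5), (0, 5)]  # (0, 5) is direct
n, SRC, SNK = 6, 0, 5
g = np.ones(len(edges))                    # edge conductances

def solve_potentials(g):
    """Kirchhoff's laws with unit current in at SRC and out at grounded SNK."""
    L = np.zeros((n, n))
    for (i, j), gij in zip(edges, g):
        L[i, i] += gij; L[j, j] += gij
        L[i, j] -= gij; L[j, i] -= gij
    b = np.zeros(n)
    b[SRC] = 1.0                           # unit current injection
    L[SNK, :] = 0.0; L[SNK, SNK] = 1.0; b[SNK] = 0.0   # ground the sink
    return np.linalg.solve(L, b)

for _ in range(200):
    V = solve_potentials(g)
    currents = np.array([ge * abs(V[i] - V[j]) for (i, j), ge in zip(edges, g)])
    g = np.clip(g + 0.1 * (currents - 0.3 * g), 1e-3, None)   # reinforce / decay

for (i, j), ge in zip(edges, g):
    print(f"edge {i}-{j}: conductance {ge:.2f}")
```

With these parameters the direct edge (0, 5) ends up carrying essentially all the current while the two longer detours decay toward the floor conductance.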