ID: 2102.03385

Global minimization via classical tunneling assisted by collective force field formation

February 5, 2021


Similar papers 2

From complex to simple: hierarchical free-energy landscape renormalized in deep neural networks

October 22, 2019

83% Match
Hajime Yoshino
Disordered Systems and Neural Networks
Statistical Mechanics
Machine Learning

We develop a statistical mechanical approach based on the replica method to study the design space of deep and wide neural networks constrained to meet a large number of training data. Specifically, we analyze the configuration space of the synaptic weights and neurons in the hidden layers in a simple feed-forward perceptron network for two scenarios: a setting with random inputs/outputs and a teacher-student setting. By increasing the strength of constraints, i.e., increasing...


Memristive Networks: from Graph Theory to Statistical Physics

August 21, 2019

83% Match
Ana Zegarac, Francesco Caravelli
Disordered Systems and Neural Networks
Statistical Mechanics
Adaptation and Self-Organizing Systems

We provide an introduction to a very specific toy model of memristive networks, for which an exact differential equation for the internal memory, which contains the Kirchhoff laws, is known. In particular, we highlight how the circuit topology enters the dynamics via an analysis of the underlying directed graph. We also emphasize the connection between the asymptotic states of memristors and the Ising model, and the relation to the dynamics and statics of disordered systems.
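For readers unfamiliar with memristive network dynamics, here is a generic toy simulation (a sketch using HP-style linear dopant drift with a decay term and illustrative parameters; it is not the exact network equation discussed in the paper):

```python
# Generic sketch of a tiny memristive circuit (illustrative assumptions, not
# the paper's exact equation): two memristors in series driven by a voltage
# source. Each device interpolates between R_on and R_off through an internal
# memory variable w in [0, 1], driven by the current and subject to decay.
import numpy as np

R_on, R_off = 100.0, 16000.0   # device resistance limits (ohms)
beta, alpha = 1e4, 1.0         # drive and decay rates for the memory variable
V = 1.0                        # applied voltage
dt = 1e-4                      # integration time step

w = np.array([0.5, 0.5])       # internal memory of the two devices
for _ in range(10000):
    R = R_on * w + R_off * (1.0 - w)     # memristance of each device
    i = V / R.sum()                      # Kirchhoff: same current through the series pair
    w = np.clip(w + dt * (beta * i - alpha * w), 0.0, 1.0)

print("final memory state:", w)
```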


Emergence of Winner-takes-all Connectivity Paths in Random Nanowire Networks

April 26, 2018

83% Match
Hugh G. Manning, Fabio Niosi, Claudia Gomes da Rocha, Allen T. Bellew, Colin O'Callaghan, Subhajit Biswas, Patrick Flowers, Ben J. Wiley, Justin D. Holmes, ..., John J. Boland
Materials Science
Mesoscale and Nanoscale Physics

Nanowire networks are promising memristive architectures for neuromorphic applications due to their connectivity and neurosynaptic-like behaviours. Here, we demonstrate a self-similar scaling of the conductance of networks and the junctions that comprise them. We show this behavior is an emergent property of any junction-dominated network. A particular class of junctions naturally leads to the emergence of conductance plateaus and a "winner-takes-all" conducting path that spa...


Step Size Matters in Deep Learning

May 22, 2018

83% Match
Kamil Nar, S. Shankar Sastry
Machine Learning
Optimization and Control

Training a neural network with the gradient descent algorithm gives rise to a discrete-time nonlinear dynamical system. Consequently, behaviors that are typically observed in these systems emerge during training, such as convergence to an orbit but not to a fixed point or dependence of convergence on the initialization. The step size of the algorithm plays a critical role in these behaviors: it determines the subset of the local optima that the algorithm can converge to, and it s...
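As a minimal sketch of this step-size effect (a hypothetical one-dimensional loss chosen for illustration, not an example from the paper): gradient descent can only settle into a minimum whose curvature is below 2 divided by the step size, so a larger step size restricts the set of reachable local optima.

```python
# Gradient descent x_{k+1} = x_k - eta * f'(x_k) on a toy loss with a sharp
# minimum at x = -1 (curvature 10) and a flat minimum at x = +1 (curvature 0.5).
# A minimum is a stable fixed point of the iteration only if eta < 2 / curvature.
def grad(x):
    return 10.0 * (x + 1.0) if x < 0 else 0.5 * (x - 1.0)

def run_gd(eta, x0=-0.9, steps=500):
    x = x0
    for _ in range(steps):
        x = x - eta * grad(x)
    return x

print(run_gd(eta=0.05))  # ~ -1.0: small step size can converge to the sharp minimum
print(run_gd(eta=0.30))  # ~ +1.0: larger step size destabilizes it (0.30 > 2/10)
```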


An Adaptive Synaptic Array using Fowler-Nordheim Dynamic Analog Memory

April 13, 2021

83% Match
Darshit Mehta, Kenji Aono, Shantanu Chakrabartty
Neural and Evolutionary Computing
Systems and Control

In this paper we present a synaptic array that uses dynamical states to implement an analog memory for energy-efficient training of machine learning (ML) systems. Each of the analog memory elements is a micro-dynamical system driven by the physics of Fowler-Nordheim (FN) quantum tunneling, while system-level learning modulates the state trajectory of the memory ensembles towards the optimal solution. We show that the extrinsic energy required for modulation can ...
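A minimal sketch of the underlying physics (assuming the textbook Fowler-Nordheim current form I(V) ∝ V² exp(−b/V) and illustrative parameters in arbitrary units; this is not the paper's device model):

```python
# A floating-gate voltage V leaking through a Fowler-Nordheim tunneling current
# I(V) ~ V^2 * exp(-b / V). The decay slows down dramatically over time
# (roughly V ~ b / log t), which is what makes such a node usable as a
# long-lived dynamical memory element.
import numpy as np

a, b = 0.05, 30.0                       # illustrative parameters (scaled units)
V, dt = 8.0, 1.0
checkpoints = {10, 100, 1000, 10000, 100000}

for step in range(1, 100001):
    V -= dt * a * V**2 * np.exp(-b / V)  # dV/dt = -I(V)/C in scaled units
    if step in checkpoints:
        print(f"t = {step:6d}   V = {V:.3f}")
```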


Complexity from Adaptive-Symmetries Breaking: Global Minima in the Statistical Mechanics of Deep Neural Networks

January 3, 2022

83% Match
Shawn W. M. Li
Machine Learning
Biological Physics
Data Analysis, Statistics and Probability

Adaptive symmetry, a concept antithetical to conservative symmetry in physics, is proposed to understand deep neural networks (DNNs). It characterizes the invariance of variance, where a biotic system explores different evolutionary pathways with equal probability in the absence of feedback signals, and complex functional structure emerges from the quantitative accumulation of adaptive-symmetry breaking in response to feedback signals. Theoretically and experimentally, we char...


Data-driven effective model shows a liquid-like deep learning

July 16, 2020

82% Match
Wenxuan Zou, Haiping Huang
Machine Learning
Disordered Systems and Neural Networks
Statistical Mechanics

The geometric structure of an optimization landscape is argued to be fundamentally important to support the success of deep neural network learning. A direct computation of the landscape beyond two layers is hard. Therefore, to capture the global view of the landscape, an interpretable model of the network-parameter (or weight) space must be established. However, such a model has been lacking so far. Furthermore, it remains unknown what the landscape looks like for deep networks of bi...


Trajectories entropy in dynamical graphs with memory

November 23, 2015

82% Match
Francesco Caravelli
Physics and Society
Disordered Systems and Neural Networks

In this paper we investigate the application of non-local graph entropy to evolving and dynamical graphs. The measure is based upon the notion of Markov diffusion on a graph and relies on the entropy of trajectories originating at a specific node. In particular, we study the model of reinforcement-decay graph dynamics, which leads to scale-free graphs. We find that the node entropy characterizes the structure of the network in the two-parameter phase space describing...
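One way to make the trajectory-entropy idea concrete (a sketch assuming the trajectories are those of a simple random walk; the paper's exact definition is not reproduced in the snippet above): the entropy of length-t walks starting at node i obeys the recursion H_t(i) = Σ_j P_ij (−log P_ij + H_{t−1}(j)), with H_0 = 0.

```python
# Per-node entropy of t-step random-walk trajectories on a graph, computed
# from the row-stochastic transition matrix P via the chain-rule recursion.
import numpy as np

def trajectory_entropy(adj, t):
    """Entropy of t-step random-walk trajectories starting at each node."""
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / deg                                    # transition matrix of the walk
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    step = -(P * logP).sum(axis=1)                   # one-step entropy per node
    H = np.zeros(P.shape[0])
    for _ in range(t):
        H = step + P @ H                             # H_t(i) = step(i) + sum_j P_ij H_{t-1}(j)
    return H

# Example: a small star graph with a hub (node 0) connected to three leaves.
A = np.array([[0, 1, 1, 1],
              [1, 0, 0, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 0]], dtype=float)
print(trajectory_entropy(A, t=5))
```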


Unreasonable Effectiveness of Learning Neural Networks: From Accessible States and Robust Ensembles to Basic Algorithmic Schemes

May 20, 2016

82% Match
Carlo Baldassi, Christian Borgs, Jennifer Chayes, Alessandro Ingrosso, Carlo Lucibello, ..., Riccardo Zecchina
Machine Learning
Disordered Systems and Neural Networks

In artificial neural networks, learning from data is a computationally demanding task in which a large number of connection weights are iteratively tuned through stochastic-gradient-based heuristic processes over a cost function. It is not well understood how learning occurs in these systems, in particular how they avoid getting trapped in configurations with poor computational performance. Here we study the difficult case of networks with discrete weights, where the optimiza...


Mean Field Theory of Dynamical Systems Driven by External Signals

October 31, 2012

82% Match
Marc Massar, Serge Massar
Chaotic Dynamics
Disordered Systems and Neural Networks
Artificial Intelligence

Dynamical systems driven by strong external signals are ubiquitous in nature and engineering. Here we study "echo state networks", networks of a large number of randomly connected nodes, which represent a simple model of a neural network, and have important applications in machine learning. We develop a mean field theory of echo state networks. The dynamics of the network is captured by an evolution law, similar to a logistic map, for a single collective variable. When the ...
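A rough sketch of the setting (a standard echo state network update with a tanh nonlinearity and a random coupling matrix; the specific mean-field evolution law derived in the paper is not reproduced here), tracking one collective variable, the mean squared activity:

```python
# Echo state network driven by an external signal: x_{t+1} = tanh(W x_t + w_in * u_t)
# with a large random coupling matrix W. A mean-field description tracks a single
# collective variable such as q_t = <x_t^2>, whose update behaves like a 1-D map.
import numpy as np

rng = np.random.default_rng(0)
N = 500                                  # number of reservoir nodes
g = 1.2                                  # coupling gain (controls the dynamical regime)
W = g * rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
w_in = rng.normal(0.0, 1.0, size=N)

x = np.zeros(N)
q = []
for t in range(200):
    u = np.sin(0.1 * t)                  # an arbitrary external driving signal
    x = np.tanh(W @ x + w_in * u)
    q.append(np.mean(x**2))              # collective variable: mean squared activity

print(q[-5:])
```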
