Similar papers 2
March 29, 2021
We present an approach to cosmology in which the Universe learns its own physical laws. It does so by exploring a landscape of possible laws, which we express as a certain class of matrix models. We discover maps that put each of these matrix models in correspondence with both a gauge/gravity theory and a mathematical model of a learning machine, such as a deep recurrent, cyclic neural network. This establishes a correspondence between each solution of the physical theory and...
March 25, 2024
Effective learning in neuronal networks requires the adaptation of individual synapses given their relative contribution to solving a task. However, physical neuronal systems -- whether biological or artificial -- are constrained by spatio-temporal locality. How such networks can perform efficient credit assignment remains, to a large extent, an open question. In Machine Learning, the answer is almost universally given by the error backpropagation algorithm, through both spa...
August 29, 2014
The demand to obtain answers to highly complex problems within strong-field gravity has been met with significant progress in the numerical solution of Einstein's equations -- along with some spectacular results -- in various setups. We review techniques for solving Einstein's equations in generic spacetimes, focusing on fully nonlinear evolutions but also on how to benchmark those results with perturbative approaches. The results address problems in high-energy physics, hologr...
February 9, 2023
In this paper, we introduce a numerical method based on Artificial Neural Networks (ANNs) for the analysis of black hole solutions to the Einstein-axion-dilaton system in a high-dimensional parabolic class. Leveraging a profile root-finding technique based on General Relativity, we describe an ANN solver to directly tackle the system of ordinary differential equations. Through our extensive numerical analysis, we demonstrate, for the first time, that there is no self-similar c...
February 11, 2021
In this pilot study, we investigate the use of a deep learning (DL) model to temporally evolve the dynamics of gas accreting onto a black hole in the form of a radiatively inefficient accretion flow (RIAF). We have trained a machine to forecast such a spatiotemporally chaotic system -- i.e. black hole weather forecasting -- using a convolutional neural network (CNN) and a training dataset which consists of numerical solutions of the hydrodynamical equations, for a range of in...
October 7, 2023
Training neural networks sequentially in time to approximate solution fields of time-dependent partial differential equations can be beneficial for preserving causality and other physics properties; however, the sequential-in-time training is numerically challenging because training errors quickly accumulate and amplify over time. This work introduces Neural Galerkin schemes that update randomized sparse subsets of network parameters at each time step. The randomization avoid...
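The randomized-sparse-update idea in this abstract can be illustrated on a toy problem. The sketch below is not the authors' Neural Galerkin scheme (which solves a local least-squares problem for the parameter dynamics); it only shows the core mechanic of updating a random subset of parameters at each step, on a simple quadratic loss with illustrative names throughout.

```python
import random

def loss(theta, xs, ys):
    """Mean squared error of a linear model theta . x against targets ys."""
    return sum((sum(t * x for t, x in zip(theta, xv)) - y) ** 2
               for xv, y in zip(xs, ys)) / len(ys)

def grad(theta, xs, ys):
    """Full gradient of the loss with respect to theta."""
    g = [0.0] * len(theta)
    for xv, y in zip(xs, ys):
        err = sum(t * x for t, x in zip(theta, xv)) - y
        for j, x in enumerate(xv):
            g[j] += 2.0 * err * x / len(ys)
    return g

def sparse_step(theta, xs, ys, k, lr=0.1):
    """Update only k randomly chosen parameters -- the sparse-subset idea."""
    subset = random.sample(range(len(theta)), k)
    g = grad(theta, xs, ys)
    for j in subset:
        theta[j] -= lr * g[j]
    return theta

random.seed(0)
xs = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
ys = [1.0, -2.0, 0.5]
theta = [0.0, 0.0, 0.0]
history = [loss(theta, xs, ys)]
for _ in range(200):
    theta = sparse_step(theta, xs, ys, k=1)
    history.append(loss(theta, xs, ys))
```

Even though each step touches a single coordinate, the loss still decreases over the rollout, which is the property the randomization is meant to preserve while keeping per-step work sparse.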
November 28, 2017
We introduce physics-informed neural networks -- neural networks that are trained to solve supervised learning tasks while respecting any given law of physics described by general nonlinear partial differential equations. In this two-part treatise, we present our developments in the context of solving two main classes of problems: data-driven solution and data-driven discovery of partial differential equations. Depending on the nature and arrangement of the available data, we...
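The composite loss at the heart of physics-informed neural networks can be sketched for the scalar ODE u'(t) = -u(t), u(0) = 1. Real PINNs differentiate the network output by automatic differentiation; the sketch below substitutes a central finite difference purely to expose the structure of the loss (a physics residual term plus a data/boundary term), with all function names illustrative.

```python
import math

def pinn_loss(u, ts, h=1e-4):
    """Physics residual + initial-condition penalty for u'(t) = -u(t), u(0) = 1."""
    # Physics term: the residual u'(t) + u(t) should vanish for the true solution.
    physics = sum(((u(t + h) - u(t - h)) / (2 * h) + u(t)) ** 2
                  for t in ts) / len(ts)
    # Data term: enforce the initial condition u(0) = 1.
    data = (u(0.0) - 1.0) ** 2
    return physics + data

ts = [0.1 * i for i in range(1, 21)]        # collocation points in (0, 2]
exact = lambda t: math.exp(-t)              # true solution of the ODE
wrong = lambda t: 1.0 - t                   # a poor candidate function
```

Evaluating `pinn_loss(exact, ts)` gives a near-zero value, while `pinn_loss(wrong, ts)` is much larger; during training, the candidate `u` would be a neural network whose parameters are driven toward the near-zero regime.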
December 17, 2015
We review several of the most widely used techniques for training recurrent neural networks to approximate dynamical systems, then describe a novel algorithm for this task. The algorithm is based on an earlier theoretical result that guarantees the quality of the network approximation. We show that a feedforward neural network can be trained on the vector field representation of a given dynamical system using backpropagation, then recast, using matrix manipulations, as a recu...
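The recasting step this abstract describes -- a feedforward approximation of the vector field composed into a recurrent update -- can be sketched as follows. Here the "trained" feedforward map is stood in for by the exact field of the scalar system dx/dt = -x, so the example only shows the recurrence structure, not the training or the matrix manipulations the paper uses.

```python
import math

def f(x):
    # Stand-in for a feedforward network trained to approximate F(x) = -x.
    return -x

def recurrent_step(x, h=0.01):
    # One step of the induced recurrent network: x <- x + h * f(x),
    # i.e. a forward-Euler update driven by the learned vector field.
    return x + h * f(x)

x = 1.0
for _ in range(100):    # integrate from t = 0 to t = 1 with step h = 0.01
    x = recurrent_step(x)
```

After 100 steps the recurrent state tracks the exact solution e^{-1} of the dynamical system to within the Euler discretization error.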
March 18, 2021
In the last decade, deep learning has become a major component of artificial intelligence. The workhorse of deep learning is the optimization of loss functions by stochastic gradient descent (SGD). Traditionally in deep learning, neural networks are differentiable mathematical functions, and the loss gradients required for SGD are computed with the backpropagation algorithm. However, the computer architectures on which these neural networks are implemented and trained suffer ...
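The SGD-plus-backpropagation workhorse described above reduces, for a one-weight model, to a few lines. The sketch below fits y = w * x with squared loss; the gradient is the chain rule applied by hand (the one-layer case of backpropagation), where real frameworks automate this across many layers. The dataset and learning rate are illustrative.

```python
import random

random.seed(1)
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]  # true weight is 3

w, lr = 0.0, 0.05
for step in range(500):
    x, y = random.choice(data)      # "stochastic": one sample per update
    y_hat = w * x                   # forward pass
    dL_dw = 2.0 * (y_hat - y) * x   # backward pass: chain rule on (y_hat - y)^2
    w -= lr * dL_dw                 # SGD update
```

The weight converges to the true value 3; the non-differentiable and noisy hardware mentioned in the abstract is precisely what breaks the exact `dL_dw` computation in this loop.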
April 17, 2024
In this paper, we apply a novel approach based on physics-informed neural networks to the computation of quasinormal modes of black hole solutions in modified gravity. In particular, we focus on the case of Einstein-scalar-Gauss-Bonnet theory, with several choices of the coupling function between the scalar field and the Gauss-Bonnet invariant. This type of calculation introduces a number of challenges with respect to the case of General Relativity, mainly due to the extra co...