January 5, 2022
We take a novel perspective on the long-established classification problems in general relativity by adopting fruitful techniques from machine learning and modern data science. In particular, we model Petrov's classification of spacetimes and show that a feed-forward neural network can achieve a high degree of success. We also show how data-visualization techniques with dimensionality reduction can help analyze the underlying patterns in the structure of the different types of spacetimes.
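The abstract names the technique but not the implementation. As a minimal sketch of what a feed-forward classifier for Petrov types could look like, the following trains a one-hidden-layer network with plain gradient descent; the synthetic "invariant" features and the six-class labels (standing in for types I, II, III, D, N, O) are illustrative assumptions, not the paper's actual data or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes, n_features, n_per_class = 6, 4, 50

# Synthetic stand-in data: one Gaussian blob of "invariant" vectors per class.
X = np.vstack([rng.normal(loc=3 * k, scale=1.0, size=(n_per_class, n_features))
               for k in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize features

# One hidden layer (tanh) followed by a softmax output layer.
W1 = rng.normal(scale=0.1, size=(n_features, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, n_classes)); b2 = np.zeros(n_classes)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, p / p.sum(axis=1, keepdims=True)

# Full-batch gradient descent on the mean cross-entropy loss.
onehot = np.eye(n_classes)[y]
lr = 0.5
for _ in range(1000):
    h, p = forward(X)
    g = (p - onehot) / len(X)           # dL/dlogits
    gh = (g @ W2.T) * (1 - h**2)        # backprop through tanh
    W2 -= lr * h.T @ g; b2 -= lr * g.sum(axis=0)
    W1 -= lr * X.T @ gh; b1 -= lr * gh.sum(axis=0)

_, p = forward(X)
accuracy = (p.argmax(axis=1) == y).mean()
```

On well-separated synthetic blobs like these the training accuracy should be high; real curvature-invariant data would of course require a held-out test split and feature engineering.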
Similar papers 1
October 30, 2022
Because machine learning has proven effective in physics, it has received increasing attention in the literature. The converse notion, applying physics in machine learning, has received much less attention. This work is a hybrid of physics and machine learning, where concepts from physics are used in machine learning. We propose the supervised Gravitational Dimensionality Reduction (GDR) algorithm, where the data points of every class are moved to each other ...
December 15, 2005
We briefly overview the Petrov classification in four dimensions and its generalization to higher dimensions.
March 22, 2023
We survey some recent applications of machine learning to problems in geometry and theoretical physics. Pure mathematical data has been compiled over the last few decades by the community, and experiments in supervised, semi-supervised and unsupervised machine learning have found surprising success. We thus advocate the programme of machine learning mathematical structures and formulating conjectures via pattern recognition, in other words, using artificial intelligence to hel...
April 21, 2022
We review some recent applications of machine learning to algebraic geometry and physics. Since problems in algebraic geometry can typically be reformulated as mappings between tensors, this makes them particularly amenable to supervised learning. Additionally, unsupervised methods can provide insight into the structure of such geometrical data. At the heart of this programme is the question of how geometry can be machine learned, and indeed how AI helps one to do mathematics...
Traditional approaches to the study of the dynamics of spacetime curvature in a very real sense hide the intricacies of the nonlinear regime. Whether it be huge formulae, or mountains of numerical data, standard methods of presentation make little use of our remarkable skill, as humans, at pattern recognition. Here we introduce a new approach to the visualization of spacetime curvature. We examine the flows associated with the gradient fields of invariants derived from the sp...
March 5, 2024
The landscape of low-energy effective field theories stemming from string theory is too vast for a systematic exploration. However, the meadows of the string landscape may be fertile ground for the application of machine learning techniques. Employing neural network learning may allow for inferring novel, undiscovered properties that consistent theories in the landscape should possess, or checking conjectural statements about alleged characteristics thereof. The aim of this w...
June 8, 2017
We propose a paradigm to deep-learn the ever-expanding databases which have emerged in mathematical physics and particle phenomenology, as diverse as the statistics of string vacua or combinatorial and algebraic geometry. As concrete examples, we establish multi-layer neural networks as both classifiers and predictors and train them with a host of available data ranging from Calabi-Yau manifolds and vector bundles, to quiver representations for gauge theories. We find that ev...
April 7, 2010
We provide an introduction to selected recent advances in the mathematical understanding of Einstein's theory of gravitation.
December 2, 2021
A review of selected topics in mathematical general relativity
We examine a subset of spatially homogeneous and anisotropic solutions to Einstein's field equations, the Bianchi Type A models, and show that they can be written as a continuous-time recurrent neural network (CTRNN). This reformulation of Einstein's equations allows one to write potentially complicated nonlinear equations as a simpler dynamical system consisting of linear combinations of the neural network weights and logistic sigmoid activation functions. The CTRNN itself is...
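The generic CTRNN form this abstract refers to is tau_i dy_i/dt = -y_i + sum_j W_ij sigma(y_j) + b_i, with sigma the logistic sigmoid. The sketch below integrates that system with a simple Euler step; the weights, biases, and three-dimensional state are arbitrary illustrations, not the actual Bianchi Type A mapping derived in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ctrnn_step(y, W, b, tau, dt=0.01):
    # tau_i * dy_i/dt = -y_i + sum_j W_ij * sigma(y_j) + b_i
    dydt = (-y + W @ sigmoid(y) + b) / tau
    return y + dt * dydt

rng = np.random.default_rng(1)
n = 3                            # three state variables (illustrative only)
W = rng.normal(size=(n, n))      # hypothetical weight matrix
b = rng.normal(size=n)           # hypothetical biases
tau = np.ones(n)                 # time constants

y = np.zeros(n)
trajectory = [y.copy()]
for _ in range(1000):
    y = ctrnn_step(y, W, b, tau)
    trajectory.append(y.copy())
trajectory = np.array(trajectory)
```

Because the sigmoid is bounded, the state stays bounded for any weights; the interesting content of the paper's construction lies in choosing W, b, and tau so that the CTRNN trajectories reproduce the Bianchi dynamics.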