March 30, 2020
We study machine learning of phenomenologically relevant properties of string compactifications, which arise in the context of heterotic line bundle models. Both supervised and unsupervised learning are considered. We find that, for a fixed compactification manifold, relatively small neural networks are capable of distinguishing consistent line bundle models with the correct gauge group and the correct chiral asymmetry from random models without these properties. The same distinction can also be achieved in the context of unsupervised learning, using an auto-encoder. Learning non-topological properties, specifically the number of Higgs multiplets, turns out to be more difficult, but is possible using sizeable networks and feature-enhanced data sets.
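The supervised setup described above — a small network separating consistent models from random ones — can be illustrated with a minimal, hedged sketch. The data here is entirely synthetic (integer "charge vectors" with an invented sign-of-sum consistency condition standing in for the actual topological data of line bundle models), and the classifier is the smallest possible network, a single sigmoid neuron trained by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for line bundle data: integer "charge vectors".
# Label 1 iff the entries sum to a nonnegative value -- a crude, invented
# proxy for a consistency condition; the real datasets are built from
# topological data of the bundles and are not reproduced here.
X = rng.integers(-3, 4, size=(2000, 5)).astype(float)
y = (X.sum(axis=1) >= 0).astype(float)

# Smallest possible "network": one sigmoid neuron, trained by full-batch
# gradient descent on the binary cross-entropy loss.
w, b, lr = np.zeros(5), 0.0, 0.1
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y                      # d(BCE)/d(logit), per sample
    w -= lr * X.T @ grad / len(y)
    b -= lr * grad.mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
accuracy = float(((p > 0.5) == y).mean())
```

Since the toy condition is linearly separable, even this one-neuron classifier reaches near-perfect training accuracy; the abstract's point is that modestly larger networks suffice for the genuinely nontrivial consistency conditions.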
Similar papers 1
March 5, 2024
The landscape of low-energy effective field theories stemming from string theory is too vast for a systematic exploration. However, the meadows of the string landscape may be fertile ground for the application of machine learning techniques. Employing neural network learning may allow for inferring novel, undiscovered properties that consistent theories in the landscape should possess, or checking conjectural statements about alleged characteristics thereof. The aim of this w...
April 17, 2022
The goal of identifying the Standard Model of particle physics and its extensions within string theory has been one of the principal driving forces in string phenomenology. Recently, the incorporation of artificial intelligence in string theory and certain theoretical advancements have brought to light unexpected solutions to mathematical hurdles that have so far hindered progress in this direction. In this review we focus on model building efforts in the context of the $E_8\...
June 8, 2017
We propose a paradigm to deep-learn the ever-expanding databases which have emerged in mathematical physics and particle phenomenology, as diverse as the statistics of string vacua or combinatorial and algebraic geometry. As concrete examples, we establish multi-layer neural networks as both classifiers and predictors and train them with a host of available data ranging from Calabi-Yau manifolds and vector bundles, to quiver representations for gauge theories. We find that ev...
December 1, 2022
Artificial neural networks have become important tools for improving the search for admissible string compactifications and for characterizing them. In this paper we construct the heterotic orbiencoder, a general deep autoencoder to study heterotic orbifold models arising from various Abelian orbifold geometries. Our neural network can be easily trained to successfully encode the large parameter space of many orbifold geometries simultaneously, independently of the statistical dissimilariti...
March 26, 2020
We apply deep-learning techniques to the string landscape, in particular, $SO(32)$ heterotic string theory on simply-connected Calabi-Yau threefolds with line bundles. It turns out that three-generation models cluster in particular islands specified by deep autoencoder networks and k-means++ clustering. In particular, we explore the mutual relations between model parameters and the cluster with the densest three-generation models (called the "3-generation island"). We find that the 3-genera...
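The clustering step mentioned in this abstract — locating "islands" of models in an autoencoder's latent space via k-means++ — can be sketched as follows. The latent points below are synthetic two-dimensional blobs standing in for the paper's actual latent coordinates, and the clustering is a plain numpy implementation of k-means++ seeding followed by Lloyd iterations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for autoencoder latent coordinates: two well-separated
# "islands" of models in a 2D latent space (the paper's latent spaces and
# model data are not reproduced here).
island_a = rng.normal(loc=(0.0, 0.0), scale=0.3, size=(100, 2))
island_b = rng.normal(loc=(4.0, 4.0), scale=0.3, size=(100, 2))
Z = np.vstack([island_a, island_b])

def kmeans(Z, k, iters=20):
    # k-means++ seeding: each new center is drawn with probability
    # proportional to its squared distance from the nearest chosen center.
    centers = [Z[rng.integers(len(Z))]]
    for _ in range(k - 1):
        d2 = np.min([((Z - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(Z[rng.choice(len(Z), p=d2 / d2.sum())])
    centers = np.array(centers)
    for _ in range(iters):  # Lloyd iterations: assign, then re-center
        labels = np.argmin(((Z[:, None] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([Z[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(Z, k=2)
```

On separated blobs like these, the two recovered clusters coincide with the two islands; the research task is then to characterize which model parameters populate the phenomenologically fertile one.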
July 3, 2017
We utilize machine learning to study the string landscape. Deep data dives and conjecture generation are proposed as useful frameworks for utilizing machine learning in the landscape, and examples of each are presented. A decision tree accurately predicts the number of weak Fano toric threefolds arising from reflexive polytopes, each of which determines a smooth F-theory compactification, and linear regression generates a previously proven conjecture for the gauge group rank ...
February 2, 2024
We present a numerical computation, based on neural network techniques, of the physical Yukawa couplings in a heterotic string theory compactification on a smooth Calabi-Yau threefold with non-standard embedding. The model belongs to a large class of heterotic line bundle models that have previously been identified and whose low-energy spectrum precisely matches that of the MSSM plus fields uncharged under the Standard Model group. The relevant quantities for the calculation,...
November 14, 2018
We use deep autoencoder neural networks to draw a chart of the heterotic $\mathbb{Z}_6$-II orbifold landscape. Even though the autoencoder is trained without knowing the phenomenological properties of the $\mathbb{Z}_6$-II orbifold models, we are able to identify fertile islands in this chart where phenomenologically promising models cluster. Then, we apply a decision tree to our chart in order to extract the defining properties of the fertile islands. Based on this informati...
January 14, 2019
Systematic classification of $\mathbb{Z}_2\times\mathbb{Z}_2$ orbifold compactifications of the heterotic string was pursued using its free fermion formulation. The method entails random generation of string vacua and analysis of their entire spectra, and led to the discovery of spinor-vector duality and three-generation exophobic string vacua. The classification was performed for string vacua with unbroken SO(10) GUT symmetry, and progressively extended to models in which the SO(10) symmetry is broken ...
August 16, 2021
We use reinforcement learning as a means of constructing string compactifications with prescribed properties. Specifically, we study heterotic SO(10) GUT models on Calabi-Yau three-folds with monad bundles, in search of phenomenologically promising examples. Due to the vast number of bundles and the sparseness of viable choices, methods based on systematic scanning are not suitable for this class of models. By focusing on two specific manifolds with Picard numbers two and thr...