December 8, 2020
We use machine learning to approximate Calabi-Yau and SU(3)-structure metrics, including for the first time their complex structure moduli dependence. Our new methods furthermore improve existing numerical approximations in terms of accuracy and speed. Knowing these metrics has numerous applications, ranging from the computation of key quantities in the effective field theory of string compactifications, such as the canonical normalizations of Yukawa couplings and the massive string spectrum, which plays a crucial role in swampland conjectures, to mirror symmetry and the SYZ conjecture. In the case of SU(3) structure, our machine learning approach allows us to engineer metrics with certain torsion properties. Our methods are demonstrated for Calabi-Yau and SU(3)-structure manifolds based on a one-parameter family of quintic hypersurfaces in $\mathbb{P}^4$.
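The common thread in these approaches is to parametrize a Kähler potential $K$ and read off the metric as its complex Hessian, $g_{i\bar{\jmath}} = \partial_i \partial_{\bar{\jmath}} K$. A minimal illustrative sketch, not taken from any of the papers listed here: the toy potential below (a Fubini-Study-like log term plus a hypothetical small correction standing in for a trained model) is differentiated numerically in real coordinates to produce a candidate metric.

```python
# Hypothetical sketch: obtain a metric as the Hessian of a scalar
# "Kaehler potential", computed here by central finite differences.
import numpy as np

def K(x):
    # Toy potential on 2n real coordinates: Fubini-Study-like log term
    # plus a small correction standing in for a learned model.
    return np.log(1.0 + x @ x) + 0.01 * (x @ x) ** 2

def hessian(f, x, eps=1e-5):
    # Second-order central differences for the full Hessian of f at x.
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = eps
            e_j = np.zeros(n); e_j[j] = eps
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps**2)
    return H

x = np.array([0.3, -0.1, 0.2, 0.4])  # 2n real coordinates, complex dim n = 2
g = hessian(K, x)
print(np.allclose(g, g.T, atol=1e-6))  # a metric must be symmetric
```

In the actual papers the potential is a neural network and the Hessian is taken by automatic differentiation, with the Monge-Ampère equation (or a related functional) supplying the training loss; this sketch only shows the potential-to-metric step.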
Similar papers
October 18, 2019
We apply machine learning to the problem of finding numerical Calabi-Yau metrics. Building on Donaldson's algorithm for calculating balanced metrics on K\"ahler manifolds, we combine conventional curve fitting and machine-learning techniques to numerically approximate Ricci-flat metrics. We show that machine learning is able to predict the Calabi-Yau metric and quantities associated with it, such as its determinant, having seen only a small sample of training data. Using this...
December 28, 2023
Calabi-Yau (CY) manifolds play a ubiquitous role in string theory. As a supersymmetry-preserving choice for the 6 extra compact dimensions of superstring compactifications, these spaces provide an arena in which to explore the rich interplay between physics and geometry. These lectures will focus on compact CY manifolds and the long standing problem of determining their Ricci flat metrics. Despite powerful existence theorems, no analytic expressions for these metrics are know...
December 20, 2021
We apply machine learning to the problem of finding numerical Calabi-Yau metrics. We extend previous work on learning approximate Ricci-flat metrics calculated using Donaldson's algorithm to the much more accurate "optimal" metrics of Headrick and Nassar. We show that machine learning is able to predict the K\"ahler potential of a Calabi-Yau metric having seen only a small sample of training data.
December 7, 2018
We present a pedagogical introduction to the recent advances in the computational geometry, physical implications, and data science of Calabi-Yau manifolds. Aimed at the beginning research student and using Calabi-Yau spaces as an exciting play-ground, we intend to teach some mathematics to the budding physicist, some physics to the budding mathematician, and some machine-learning to both. Based on various lecture series, colloquia and seminars given by the author in the past...
November 2, 2021
We present a new machine learning library for computing metrics of string compactification spaces. We benchmark the performance on Monte-Carlo sampled integrals against previous numerical approximations and find that our neural networks are more sample- and computation-efficient. We are the first to provide the possibility to compute these metrics for arbitrary, user-specified shape and size parameters of the compact space and observe a linear relation between optimization of...
December 9, 2020
We propose machine learning inspired methods for computing numerical Calabi-Yau (Ricci flat K\"ahler) metrics, and implement them using Tensorflow/Keras. We compare them with previous work, and find that they are far more accurate for manifolds with little or no symmetry. We also discuss issues such as overparameterization and choice of optimization methods.
July 30, 2020
We revisit the question of predicting both Hodge numbers $h^{1,1}$ and $h^{2,1}$ of complete intersection Calabi-Yau (CICY) 3-folds using machine learning (ML), considering both the old and new datasets built respectively by Candelas-Dale-Lutken-Schimmrigk / Green-H\"ubsch-Lutken and by Anderson-Gao-Gray-Lee. In real-world applications, implementing an ML system rarely reduces to feeding raw data to the algorithm. Instead, the typical workflow starts with an exploratory dat...
December 12, 2021
We revisit the classic database of weighted-P4s which admit Calabi-Yau 3-fold hypersurfaces equipped with a diverse set of tools from the machine-learning toolbox. Unsupervised techniques identify an unanticipated almost linear dependence of the topological data on the weights. This then allows us to identify a previously unnoticed clustering in the Calabi-Yau data. Supervised techniques are successful in predicting the topological parameters of the hypersurface from its weig...
January 5, 2020
In these lecture notes, we survey the landscape of Calabi-Yau threefolds, and the use of machine learning to explore it. We begin with the compact portion of the landscape, focusing in particular on complete intersection Calabi-Yau varieties (CICYs) and elliptic fibrations. Non-compact Calabi-Yau manifolds are manifest in Type II superstring theories; they arise as representation varieties of quivers, which are used to describe gauge theories in the familiar four bulk dimensions. Final...
December 31, 2020
Ricci flat metrics for Calabi-Yau threefolds are not known analytically. In this work, we employ techniques from machine learning to deduce numerical Ricci-flat metrics for the Fermat quintic, for the Dwork quintic, and for the Tian-Yau manifold. This investigation employs a single neural network architecture that is capable of approximating Ricci flat K\"ahler metrics for several Calabi-Yau manifolds of dimensions two and three. We show that measures that assess the Ricci flatness ...