Similar papers
December 7, 2018
We present a pedagogical introduction to the recent advances in the computational geometry, physical implications, and data science of Calabi-Yau manifolds. Aimed at the beginning research student and using Calabi-Yau spaces as an exciting playground, we intend to teach some mathematics to the budding physicist, some physics to the budding mathematician, and some machine learning to both. Based on various lecture series, colloquia and seminars given by the author in the past...
December 8, 2020
We use machine learning to approximate Calabi-Yau and SU(3)-structure metrics, including for the first time complex structure moduli dependence. Our new methods furthermore improve existing numerical approximations in terms of accuracy and speed. Knowing these metrics has numerous applications, ranging from computations of crucial aspects of the effective field theory of string compactifications such as the canonical normalizations for Yukawa couplings, and the massive string...
December 23, 2019
We discuss the extent to which numerical techniques for computing approximations to Ricci-flat metrics can be used to investigate hierarchies of curvature scales on Calabi-Yau manifolds. Control of such hierarchies is integral to the validity of curvature expansions in string effective theories. Nevertheless, for seemingly generic points in moduli space it can be difficult to analytically determine if there might be a highly curved region localized somewhere on the Calabi-Yau...
October 24, 2021
We study the use of machine learning for finding numerical Hermitian Yang-Mills connections on line bundles over Calabi-Yau manifolds. Defining an appropriate loss function and focusing on the examples of an elliptic curve, a K3 surface and a quintic threefold, we show that neural networks can be trained to give a close approximation to Hermitian Yang-Mills connections.
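On an elliptic curve (a flat torus), the hermitian Yang-Mills condition for a line-bundle metric reduces to a Poisson equation for the metric potential. A minimal sketch of the loss-minimization idea, with a periodic grid and plain gradient descent standing in for the paper's neural networks, and an arbitrary source term `f` standing in for the background curvature data:

```python
import numpy as np

# Toy stand-in: the hermitian Yang-Mills condition for a line-bundle
# metric h = h0 * exp(-phi) on a flat torus reduces to a Poisson
# equation  Laplacian(phi) = f.  We minimise the variational energy
#   E[phi] = mean( |grad phi|^2 / 2 + f * phi ),
# whose stationary point solves the PDE, by gradient descent on a grid.

n = 32
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.cos(X) * np.cos(Y)          # zero-mean source term (solvable on the torus)
h2 = (x[1] - x[0]) ** 2            # grid spacing squared

def laplacian(p):
    """Periodic 5-point finite-difference Laplacian."""
    return (np.roll(p, 1, 0) + np.roll(p, -1, 0)
            + np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p) / h2

phi = np.zeros((n, n))
lr = 0.005
for _ in range(3000):
    grad = f - laplacian(phi)      # functional gradient of the energy
    phi -= lr * grad

residual = float(np.sqrt(np.mean((laplacian(phi) - f) ** 2)))
print(residual)
```

The energy form is chosen over a direct squared-residual loss because its Hessian is the (first-order) Laplacian, which keeps plain gradient descent stable at a usable step size.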
May 9, 2021
A very popular model in machine learning is the feedforward neural network (FFN). The FFN can approximate general functions and mitigate the curse of dimensionality. Here we introduce FFNs which represent sections of holomorphic line bundles on complex manifolds, and ask some questions about their approximating power. We also explain formal similarities between the standard approach to supervised learning and the problem of finding numerical Ricci flat K\"ahler metrics, which...
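As a toy version of the supervised-learning setup sketched here (not the holomorphic-section construction itself), the following fits a one-hidden-layer feedforward network to samples of a smooth function by full-batch gradient descent:

```python
import numpy as np

# Minimal FFN sketch: fit y = sin(pi * x) on [-1, 1] with a
# 1 -> 32 -> 1 tanh network trained by full-batch gradient descent
# on the mean-squared error. All choices here (width, init scales,
# learning rate) are illustrative, not taken from the paper.

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)[:, None]        # sample inputs
y = np.sin(np.pi * x)                       # target function values

W1 = rng.normal(0.0, 2.0, (1, 32))          # input -> hidden weights
b1 = rng.normal(0.0, 1.0, 32)
W2 = rng.normal(0.0, 0.5, (32, 1))          # hidden -> output weights
b2 = np.zeros(1)

mse0 = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))

lr = 0.1
for _ in range(5000):
    h = np.tanh(x @ W1 + b1)                # forward pass
    err = (h @ W2 + b2) - y
    # backward pass (gradients of the mean-squared error)
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)        # tanh' = 1 - tanh^2
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse0, mse)
```

The same skeleton — a parametrized ansatz, a pointwise loss, and gradient descent — is what carries over to the Ricci-flat-metric problem the abstract alludes to, with the loss replaced by a geometric functional.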
November 22, 2022
We present the first version of CYJAX, a package for machine learning Calabi-Yau metrics using JAX. It is meant to be accessible both as a top-level tool and as a library of modular functions. CYJAX is currently centered around the algebraic ansatz for the K\"ahler potential which automatically satisfies K\"ahlerity and compatibility on patch overlaps. As of now, this implementation is limited to varieties defined by a single defining equation on one complex projective space....
August 4, 2021
We continue earlier efforts in computing the dimensions of tangent space cohomologies of Calabi-Yau manifolds using deep learning. In this paper, we consider the dataset of all Calabi-Yau four-folds constructed as complete intersections in products of projective spaces. Employing neural networks inspired by state-of-the-art computer vision architectures, we improve earlier benchmarks and demonstrate that all four non-trivial Hodge numbers can be learned at the same time using...
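The multi-output idea — one model predicting all four Hodge numbers from a configuration matrix — can be illustrated with a single joint fit. The sketch below uses synthetic stand-in data and a linear least-squares model, not the CICY four-fold dataset or the paper's network architectures:

```python
import numpy as np

# Hypothetical sketch of multi-output learning: one fit predicts four
# targets at once from flattened "configuration matrices". The data are
# synthetic (random integer matrices and hidden integer-weight scores
# standing in for the four Hodge numbers), purely for illustration.

rng = np.random.default_rng(1)
n, rows, cols = 500, 4, 6
X = rng.integers(0, 5, size=(n, rows * cols)).astype(float)

true_w = rng.integers(-2, 3, size=(rows * cols, 4)).astype(float)
Y = X @ true_w                     # stand-ins for (h11, h21, h31, h22)

# a single least-squares solve yields all four output columns together
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
err = float(np.max(np.abs(X @ W - Y)))
print(err)
```

Because the synthetic targets are exactly linear in the inputs, the joint fit recovers them essentially to machine precision; the real dataset, of course, requires the nonlinear models the abstract describes.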
May 29, 2024
We numerically study whether there exist nowhere vanishing harmonic $1$-forms on the real locus of some carefully constructed examples of Calabi-Yau manifolds, which would then give rise to potentially new examples of $G_2$-manifolds and an explicit description of their metrics. We do this in two steps: first, we use a neural network to compute an approximate Calabi-Yau metric on each manifold. Second, we use another neural network to compute an approximately harmonic $1$-for...
March 10, 2015
Yau proved an existence theorem for Ricci-flat K\"ahler metrics in the 1970s, but we still have no closed-form expressions for them. Nevertheless, there are several ways to get approximate expressions, both numerical and analytical. We survey some of this work and explain how it can be used to obtain physical predictions from superstring theory.
November 30, 2022
Neural networks with PDEs embedded in their loss functions (physics-informed neural networks) are employed as function approximators to find solutions to the Ricci flow (a curvature-based evolution) of Riemannian metrics. A general method is developed and applied to the real torus. The validity of the solution is verified by comparing the time evolution of the scalar curvature with that found using a standard PDE solver, which decreases to a constant value of 0 on the whole man...
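The comparison baseline mentioned here can be made concrete: on a 2-torus with conformally flat metric g = exp(2u)(dx^2 + dy^2), Ricci flow reduces to the scalar PDE du/dt = exp(-2u) * Laplacian(u), with scalar curvature R = -2 exp(-2u) * Laplacian(u). A finite-difference stand-in for the "standard PDE solver" (the physics-informed network itself is not reproduced here):

```python
import numpy as np

# Explicit finite-difference evolution of Ricci flow on a flat 2-torus
# in the conformal gauge g = exp(2u) * (dx^2 + dy^2):
#   du/dt = exp(-2u) * Laplacian(u),   R = -2 * exp(-2u) * Laplacian(u).
# The flow should flatten the metric, driving R toward 0 everywhere.

n = 32
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
h2 = (x[1] - x[0]) ** 2

def lap(u):
    """Periodic 5-point finite-difference Laplacian."""
    return (np.roll(u, 1, 0) + np.roll(u, -1, 0)
            + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u) / h2

u = 0.3 * np.cos(X) * np.cos(Y)            # initial conformal factor
R0 = float(np.max(np.abs(-2 * np.exp(-2 * u) * lap(u))))

dt = 0.002                                  # below the explicit-scheme stability limit
for _ in range(2000):
    u += dt * np.exp(-2 * u) * lap(u)      # explicit Euler step of the flow

R1 = float(np.max(np.abs(-2 * np.exp(-2 * u) * lap(u))))
print(R0, R1)
```

Monitoring max |R| along the evolution is the kind of curvature-history comparison the abstract describes; here it decays by several orders of magnitude as u relaxes to a constant.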