April 18, 2019
We use the latest machine-learning techniques to study whether elliptically fibred manifolds can be distinguished within the landscape of Calabi-Yau manifolds. Using the dataset of complete intersections in products of projective spaces (CICY3 and CICY4, totalling about a million manifolds) as a concrete playground, we find that a relatively simple feed-forward multi-layer neural network can very efficiently distinguish the elliptic fibrations, much more so than using the...
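A minimal sketch of such a binary classifier, using plain numpy: the input is assumed to be a flattened, zero-padded CICY configuration matrix (the feature size, layer widths, and initialisation below are illustrative, not the architecture of the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """Two-layer feed-forward network with ReLU hidden units and a
    sigmoid output, read as the probability of being elliptically fibred."""
    h = np.maximum(0.0, x @ W1 + b1)      # hidden layer (ReLU)
    logits = h @ W2 + b2                  # output layer
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid probability

# Hypothetical input size for a flattened, padded configuration matrix.
n_features, n_hidden = 180, 64
W1 = rng.normal(0, 0.1, (n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, 1))
b2 = np.zeros(1)

x = rng.normal(size=(5, n_features))      # a batch of 5 manifolds
p = mlp_forward(x, W1, b1, W2, b2)
print(p.shape)  # (5, 1): one fibration probability per manifold
```

In practice the weights would be trained on labelled CICY data with a cross-entropy loss; the forward pass above only fixes the shape of the model.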
May 29, 2024
We numerically study whether there exist nowhere vanishing harmonic $1$-forms on the real locus of some carefully constructed examples of Calabi-Yau manifolds, which would then give rise to potentially new examples of $G_2$-manifolds and an explicit description of their metrics. We do this in two steps: first, we use a neural network to compute an approximate Calabi-Yau metric on each manifold. Second, we use another neural network to compute an approximately harmonic $1$-for...
June 15, 2005
We develop numerical algorithms for solving the Einstein equation on Calabi-Yau manifolds at arbitrary values of their complex structure and Kahler parameters. We show that Kahler geometry can be exploited for significant gains in computational efficiency. As a proof of principle, we apply our methods to a one-parameter family of K3 surfaces constructed as blow-ups of the T^4/Z_2 orbifold with many discrete symmetries. High-resolution metrics may be obtained on a time scale o...
November 30, 2022
Neural networks with PDEs embedded in their loss functions (physics-informed neural networks) are employed as function approximators to find solutions to the Ricci flow (a curvature-based evolution) of Riemannian metrics. A general method is developed and applied to the real torus. The validity of the solution is verified by comparing the time evolution of scalar curvature with that found using a standard PDE solver, which decreases to a constant value of 0 on the whole man...
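A minimal sketch of such a physics-informed loss, under simplifying assumptions: for a conformally flat metric the 2D Ricci flow reduces to a scalar PDE, here written in one spatial variable as u_t = exp(-2u) u_xx, with derivatives of a tiny numpy approximator estimated by finite differences (the network shape and collocation scheme are illustrative).

```python
import numpy as np

def u_net(params, t, x):
    """Tiny fully connected approximator u(t, x) with tanh activations."""
    W1, b1, W2, b2 = params
    inp = np.stack([t, x], axis=-1)
    h = np.tanh(inp @ W1 + b1)
    return (h @ W2 + b2).squeeze(-1)

def ricci_flow_residual(params, t, x, eps=1e-3):
    """Residual of u_t = exp(-2u) * u_xx, via central finite differences."""
    u = u_net(params, t, x)
    u_t = (u_net(params, t + eps, x) - u_net(params, t - eps, x)) / (2 * eps)
    u_xx = (u_net(params, t, x + eps) - 2 * u
            + u_net(params, t, x - eps)) / eps**2
    return u_t - np.exp(-2.0 * u) * u_xx

rng = np.random.default_rng(1)
params = (rng.normal(0, 0.5, (2, 16)), np.zeros(16),
          rng.normal(0, 0.5, (16, 1)), np.zeros(1))

# Collocation points in (t, x); the physics-informed loss is the mean
# squared PDE residual, to be minimised over the network parameters.
t = rng.uniform(0, 1, 256)
x = rng.uniform(0, 2 * np.pi, 256)
loss = np.mean(ricci_flow_residual(params, t, x) ** 2)
print(loss >= 0.0)
```

A full implementation would add the initial-metric and periodicity conditions as further loss terms and minimise the total with a gradient-based optimiser.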
July 27, 2020
We introduce a neural network inspired by Google's Inception model to compute the Hodge number $h^{1,1}$ of complete intersection Calabi-Yau (CICY) 3-folds. This architecture greatly improves prediction accuracy over existing results, already reaching 97% accuracy with just 30% of the data used for training. Moreover, accuracy climbs to 99% when 80% of the data is used for training. This demonstrates that neural networks are a valuable resource to study geometric aspects in b...
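The defining feature of an Inception-style block is several convolutions with different kernel sizes applied in parallel to the same input, with their outputs stacked as channels. A minimal numpy sketch on a padded CICY configuration matrix (the matrix size and the odd square kernels are illustrative choices, not those of the paper):

```python
import numpy as np

def conv2d_same(x, k):
    """'Same'-padded 2D cross-correlation of a single-channel map x
    with an odd-sized kernel k."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    H, W = x.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def inception_block(x, kernels):
    """Inception-style block: parallel convolutions on the same input,
    ReLU-activated, stacked along a channel axis."""
    return np.stack([np.maximum(0.0, conv2d_same(x, k)) for k in kernels])

rng = np.random.default_rng(2)
config = rng.integers(0, 5, size=(12, 15)).astype(float)  # padded matrix
kernels = [rng.normal(size=(1, 1)),
           rng.normal(size=(3, 3)),
           rng.normal(size=(5, 5))]
features = inception_block(config, kernels)
print(features.shape)  # (3, 12, 15): one channel per parallel branch
```

In a trained model these blocks would be stacked and followed by a dense head regressing $h^{1,1}$; the sketch only shows the parallel-branch structure.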
January 5, 2020
In these lecture notes, we survey the landscape of Calabi-Yau threefolds, and the use of machine learning to explore it. We begin with the compact portion of the landscape, focusing in particular on complete intersection Calabi-Yau varieties (CICYs) and elliptic fibrations. Non-compact Calabi-Yau manifolds appear in Type II superstring theories; they arise as representation varieties of quivers and are used to describe gauge theories in the familiar four-dimensional bulk. Final...
September 21, 2022
Generalized complete intersection Calabi-Yau manifolds (gCICYs) are a recently established construction of Calabi-Yau manifolds. However, generating new gCICYs using standard algebraic methods is very laborious. Due to this complexity, the number of gCICYs and their classification remain unknown. In this paper, we try to make progress in this direction using neural networks. The results show that our trained models can achieve high precision on the existing ...
June 11, 2017
We employ machine learning techniques to investigate the volume minimum of Sasaki-Einstein base manifolds of non-compact toric Calabi-Yau 3-folds. We find that the minimum volume can be approximated via a second order multiple linear regression on standard topological quantities obtained from the corresponding toric diagram. The approximation improves further after invoking a convolutional neural network with the full toric diagram of the Calabi-Yau 3-folds as the input. We a...
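A second-order multiple linear regression is an ordinary least-squares fit on degree-2 polynomial features of the inputs. A minimal sketch with synthetic data (the three input quantities stand in for topological invariants read off a toric diagram; the names, sizes, and target are illustrative, not the paper's dataset):

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_features(X):
    """Degree-2 feature map: constant, linear terms, and all products
    of pairs of input columns (including squares)."""
    n, d = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(d)]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(d), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(3)
# Hypothetical per-diagram inputs, e.g. counts of vertices, edges,
# and interior points (illustrative only).
X = rng.uniform(1, 10, size=(200, 3))
# Synthetic target that happens to be exactly quadratic in the inputs.
y = 0.5 + 0.3 * X[:, 0] + 0.1 * X[:, 1] * X[:, 2]

Phi = quadratic_features(X)
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ coef
print(np.allclose(pred, y))  # exact fit: the target lies in the span
```

With real minimum-volume data the fit would of course only be approximate; the abstract's point is that even this simple model captures much of the structure before a convolutional network refines it.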
July 28, 2015
This is a survey article on recent progress on the metric behaviour of Ricci-flat K\"{a}hler-Einstein metrics along degenerations of Calabi-Yau manifolds.
March 4, 2013
We describe four algorithms for neural network training, each adapted to different scalability constraints. These algorithms are mathematically principled and invariant under a number of transformations in data and network representation, so that performance is independent of these choices. These algorithms are obtained from the setting of differential geometry, and are based on either the natural gradient using the Fisher information matrix, or on Hessian methods, scaled down in a sp...
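The natural gradient preconditions the ordinary loss gradient by the inverse Fisher information matrix, making the update invariant under reparametrisation. A minimal sketch for logistic regression with an empirical (damped) Fisher estimate — a toy instance of the idea, not one of the paper's four algorithms:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nll(w, X, y):
    """Average negative log-likelihood of logistic regression."""
    p = sigmoid(X @ w)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def natural_gradient_step(w, X, y, lr=0.1, damping=1e-3):
    """One natural-gradient step: precondition the loss gradient by the
    inverse empirical Fisher matrix (damped for conditioning)."""
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / len(y)             # average loss gradient
    scores = X * (p - y)[:, None]             # per-sample score vectors
    F = scores.T @ scores / len(y) + damping * np.eye(len(w))
    return w - lr * np.linalg.solve(F, grad)  # F^{-1}-preconditioned step

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = (rng.uniform(size=500) < sigmoid(X @ w_true)).astype(float)

w = np.zeros(3)
loss_before = nll(w, X, y)
for _ in range(50):
    w = natural_gradient_step(w, X, y)
loss_after = nll(w, X, y)
print(loss_after < loss_before)  # the fit improves
```

The damping term plays the role of the scaled-down curvature correction the abstract alludes to: it keeps the Fisher matrix invertible when the per-sample scores do not span parameter space.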