November 17, 2022
Finding Ricci-flat (Calabi-Yau) metrics is a long-standing problem in geometry with deep implications for string theory and phenomenology. A new attack on this problem uses neural networks to engineer approximations to the Calabi-Yau metric within a given Kähler class. In this paper we investigate numerical Ricci-flat metrics over smooth and singular K3 surfaces and Calabi-Yau threefolds. Using these Ricci-flat metric approximations for the Cefalú family of quartic twofol...
February 28, 2025
In this manuscript, we demonstrate, using several regression techniques, that one can machine-learn the other independent Hodge numbers of complete intersection Calabi-Yau four-folds and five-folds in terms of $h^{1,1}$ and $h^{2,1}$. We then combine the Hodge numbers $h^{1,1}$ and $h^{2,1}$ from the complete intersections of Calabi-Yau three-folds, four-folds, and five-folds into a single dataset. Next, we implement various classification algorithms on this datas...
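As an illustration of the kind of regression involved, the sketch below fits a polynomial model mapping $(h^{1,1}, h^{2,1})$ to a target Hodge number via least squares. All data here are synthetic stand-ins with an invented relation, not actual CICY Hodge numbers.

```python
import numpy as np

# Hypothetical illustration: fit a polynomial model predicting a "target"
# Hodge number from (h11, h21), in the spirit of the regression techniques
# described above. The data are synthetic stand-ins, not real CICY data.
rng = np.random.default_rng(0)
h11 = rng.integers(1, 20, size=200).astype(float)
h21 = rng.integers(1, 20, size=200).astype(float)
# Synthetic target defined by an invented polynomial relation.
target = 3.0 + 2.0 * h11 - 1.5 * h21 + 0.1 * h11 * h21

# Design matrix of polynomial features: [1, h11, h21, h11*h21].
X = np.column_stack([np.ones_like(h11), h11, h21, h11 * h21])
coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)

pred = X @ coeffs
mae = np.mean(np.abs(pred - target))
print(f"mean absolute error: {mae:.2e}")
```

Since the synthetic relation is itself polynomial, least squares recovers it essentially exactly; on real Hodge-number data the fit quality would depend on the chosen feature set.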
June 2, 2019
We analyse line bundle cohomologies on all favourable codimension-two complete intersection Calabi-Yau (CICY) manifolds of Picard number two. Our results provide further evidence that the cohomology dimensions of such line bundles are given by analytic expressions which change between regions in the line bundle charge space. This agrees with recent observations on CY line bundles presented in Refs. [1,2]. In many cases, the expressions for bundle cohomology dimensions are po...
April 18, 2019
We use the latest techniques in machine learning to study whether one can distinguish elliptically fibred manifolds within the landscape of Calabi-Yau manifolds. Using the dataset of complete intersections in products of projective spaces (CICY3 and CICY4, totalling about a million manifolds) as a concrete playground, we find that a relatively simple feed-forward multi-layer neural network can very efficiently distinguish the elliptic fibrations, much more so than using the...
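A stripped-down sketch of such a binary classifier: the features below are synthetic stand-ins rather than real CICY configuration matrices, and a single logistic layer trained by gradient descent stands in for the multi-layer network.

```python
import numpy as np

# Toy binary classification in the spirit of the fibration detector above.
# Features and labels are synthetic; the "elliptically fibred" label is
# generated by a known hyperplane so the problem is learnable by design.
rng = np.random.default_rng(1)
n = 400
X = rng.normal(size=(n, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):  # batch gradient descent on the logistic loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= lr * (X.T @ (p - y)) / n
    b -= lr * np.mean(p - y)

acc = np.mean(((X @ w + b) > 0) == (y > 0.5))
print(f"training accuracy: {acc:.3f}")
```

On the real dataset the inputs would be flattened configuration matrices and the single layer would be replaced by several hidden layers, but the training loop has the same shape.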
June 8, 2017
We propose a paradigm to deep-learn the ever-expanding databases which have emerged in mathematical physics and particle phenomenology, as diverse as the statistics of string vacua or combinatorial and algebraic geometry. As concrete examples, we establish multi-layer neural networks as both classifiers and predictors and train them with a host of available data ranging from Calabi-Yau manifolds and vector bundles, to quiver representations for gauge theories. We find that ev...
May 27, 2024
In this work, we report the results of applying deep learning based on hybrid convolutional-recurrent and purely recurrent neural network architectures to the dataset of almost one million complete intersection Calabi-Yau four-folds (CICY4) to machine-learn their four Hodge numbers $h^{1,1}, h^{2,1}, h^{3,1}, h^{2,2}$. In particular, we experimented with twelve different neural network models, nine of which are convolutional-recurrent (CNN-RNN) hybrids with the R...
March 22, 2023
We survey some recent applications of machine learning to problems in geometry and theoretical physics. The community has compiled pure mathematical data over the last few decades, and experiments in supervised, semi-supervised and unsupervised machine learning have found surprising success. We thus advocate the programme of machine learning mathematical structures, and formulating conjectures via pattern recognition, in other words using artificial intelligence to hel...
We describe how simple machine learning methods successfully predict geometric properties from Hilbert series (HS). Regressors predict embedding weights in projective space to ${\sim}1$ mean absolute error, whilst classifiers predict dimension and Gorenstein index to $>90\%$ accuracy with ${\sim}0.5\%$ standard error. Binary random forest classifiers distinguish whether the underlying HS describes a complete intersection with accuracies exceeding $95\%$. Neura...
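A from-scratch sketch of a random-forest-style binary classifier built from depth-one decision stumps, in the spirit of the complete-intersection test above. The features are synthetic stand-ins for Hilbert series data, and this toy implementation (bootstrap samples plus random feature subsets, majority vote) is an illustration, not the authors' pipeline.

```python
import numpy as np

# Synthetic binary task: only feature 0 is informative, mimicking a
# dataset where one derived HS quantity carries the class signal.
rng = np.random.default_rng(2)
n, d = 300, 5
X = rng.normal(size=(n, d))
y = (X[:, 0] > 0).astype(int)

def fit_stump(Xb, yb, feats):
    # Best single-feature threshold (and sign) by training accuracy,
    # searched over the randomly chosen feature subset.
    best = None  # (accuracy, feature, threshold, sign)
    for f in feats:
        for t in np.unique(Xb[:, f]):
            for s in (1, -1):
                acc = np.mean(((s * (Xb[:, f] - t)) > 0).astype(int) == yb)
                if best is None or acc > best[0]:
                    best = (acc, f, t, s)
    return best[1], best[2], best[3]

stumps = []
for _ in range(25):  # one bootstrap sample + feature subset per tree
    idx = rng.integers(0, n, size=n)
    feats = rng.choice(d, size=3, replace=False)
    stumps.append(fit_stump(X[idx], y[idx], feats))

# Majority vote over the ensemble.
votes = np.mean(
    [((s * (X[:, f] - t)) > 0).astype(int) for f, t, s in stumps], axis=0
)
acc = np.mean((votes > 0.5).astype(int) == y)
print(f"forest training accuracy: {acc:.3f}")
```

A production version would use deeper trees and a library implementation; the stump ensemble is kept here to make bagging and voting explicit.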
November 30, 2021
We use deep neural networks to machine learn correlations between knot invariants in various dimensions. The three-dimensional invariant of interest is the Jones polynomial $J(q)$, and the four-dimensional invariants are the Khovanov polynomial $\text{Kh}(q,t)$, smooth slice genus $g$, and Rasmussen's $s$-invariant. We find that a two-layer feed-forward neural network can predict $s$ from $\text{Kh}(q,-q^{-4})$ with greater than $99\%$ accuracy. A theoretical explanation for ...
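A minimal two-layer feed-forward network of the kind described above, trained by gradient descent. The inputs and targets are synthetic stand-ins; the actual Khovanov-polynomial evaluations and s-invariant labels are not reproduced here.

```python
import numpy as np

# Tiny two-layer (one hidden layer) regression network in numpy.
# Synthetic target plays the role of the invariant being predicted.
rng = np.random.default_rng(3)
n, d, h = 200, 8, 16
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = 0.1 * (X @ true_w)  # invented linear target, scaled into tanh range

W1 = rng.normal(size=(d, h)) * 0.1
b1 = np.zeros(h)
W2 = rng.normal(size=(h, 1)) * 0.1
b2 = np.zeros(1)

lr = 0.05
losses = []
for _ in range(300):
    hidden = np.tanh(X @ W1 + b1)           # forward pass
    pred = (hidden @ W2 + b2).ravel()
    err = pred - y
    losses.append(np.mean(err ** 2))
    # Backpropagation of the (half-)squared-error gradient.
    gW2 = hidden.T @ err[:, None] / n
    dh = (err[:, None] @ W2.T) * (1 - hidden ** 2)
    gW1 = X.T @ dh / n
    W2 -= lr * gW2; b2 -= lr * np.mean(err)
    W1 -= lr * gW1; b1 -= lr * np.mean(dh, axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The real experiment feeds coefficients of $\text{Kh}(q,-q^{-4})$ into such a network and rounds the output to the nearest even integer to read off $s$; this sketch only shows the architecture and training loop.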
November 28, 2023
Calabi-Yau four-folds may be constructed as hypersurfaces in weighted projective spaces of complex dimension 5 defined via weight systems of 6 weights. In this work, neural networks were implemented to learn the Calabi-Yau Hodge numbers from the weight systems, where gradient saliency and symbolic regression then inspired a truncation of the Landau-Ginzburg model formula for the Hodge numbers of a Calabi-Yau of any dimension constructed in this way. The approximation always prov...