January 21, 2022
We review recent efforts to machine learn relations between knot invariants. Because these knot invariants have meaning in physics, we explore aspects of Chern-Simons theory and higher dimensional gauge theories. The goal of this work is to translate numerical experiments with Big Data into new analytic results.
January 5, 2020
In these lecture notes, we survey the landscape of Calabi-Yau threefolds and the use of machine learning to explore it. We begin with the compact portion of the landscape, focusing in particular on complete intersection Calabi-Yau varieties (CICYs) and elliptic fibrations. Non-compact Calabi-Yau manifolds are manifest in Type II superstring theories: they arise as representation varieties of quivers and are used to describe gauge theories in the familiar four bulk dimensions. Final...
March 7, 2019
Supervised machine learning can be used to predict properties of string geometries with previously unknown features. Using the complete intersection Calabi-Yau (CICY) threefold dataset as a theoretical laboratory for this investigation, we use low $h^{1,1}$ geometries for training and validate on geometries with large $h^{1,1}$. Neural networks and Support Vector Machines successfully predict trends in the number of K\"ahler parameters of CICY threefolds. The numerical accura...
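The train-small, test-large protocol can be sketched on synthetic data. Everything below is a stand-in: the features, the linear target, and the least-squares regressor (the paper itself uses CICY configuration matrices with neural networks and Support Vector Machines).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for the CICY dataset: feature vectors (think of a
# flattened, zero-padded configuration matrix) and a target y playing the
# role of h^{1,1}.  The target is linear in the features by construction,
# so the trend can in principle be learned from a restricted range.
n, d = 2000, 12
X = rng.integers(0, 5, size=(n, d)).astype(float)
w_true = rng.normal(size=d)
y = X @ w_true + 0.05 * rng.normal(size=n)

# Split by target size rather than randomly: train on small y, validate on
# large y, mimicking "train at low h^{1,1}, test at high h^{1,1}".
order = np.argsort(y)
train, test = order[: n // 2], order[n // 2:]

# Plain least squares stands in for the paper's neural networks and SVMs.
A = np.hstack([X[train], np.ones((len(train), 1))])
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)

pred = np.hstack([X[test], np.ones((len(test), 1))]) @ coef
err = np.abs(pred - y[test]).mean()
print(err)  # close to the 0.05 noise level: the trend extrapolates
```

The point of the split is that the validation set lies entirely outside the range of targets seen in training, so a good score reflects genuine extrapolation rather than interpolation.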
April 21, 2022
We review some recent applications of machine learning to algebraic geometry and physics. Since problems in algebraic geometry can typically be reformulated as mappings between tensors, this makes them particularly amenable to supervised learning. Additionally, unsupervised methods can provide insight into the structure of such geometrical data. At the heart of this programme is the question of how geometry can be machine learned, and indeed how AI helps one to do mathematics...
November 2, 2022
We automate the process of machine learning correlations between knot invariants. For nearly 200,000 distinct sets of input knot invariants together with an output invariant, we attempt to learn the output invariant by training a neural network on the input invariants. Correlation between invariants is measured by the accuracy of the neural network prediction, and bipartite or tripartite correlations are sequentially filtered from the input invariant sets so that experiments ...
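The filtering loop can be sketched as follows on a toy invariant table; the invariant names, the planted relation, and the use of a linear fit's $R^2$ in place of neural-network accuracy are all assumptions of this illustration.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)

# Hypothetical invariant table: rows are knots, columns are invariants.
# "v3" is secretly a linear function of "sig" and "det"; "genus" is noise.
n = 5000
inv = {
    "sig":   rng.integers(-6, 7, n).astype(float),
    "det":   rng.integers(1, 50, n).astype(float),
    "genus": rng.integers(0, 5, n).astype(float),
}
inv["v3"] = 2.0 * inv["sig"] - 0.5 * inv["det"]

def score(inputs, output):
    """R^2 of a linear fit, standing in for neural-network accuracy."""
    A = np.column_stack([inv[k] for k in inputs] + [np.ones(n)])
    y = inv[output]
    resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
    return 1.0 - resid.var() / y.var()

# Sweep small input sets and rank them: input sets that predict the output
# well flag a candidate relation, mimicking the sequential filtering of
# bipartite and tripartite correlations.
for pair in combinations(["sig", "det", "genus"], 2):
    print(pair, round(score(pair, "v3"), 3))
```

Only the pair containing both `sig` and `det` scores near 1, so it survives the filter as a candidate relation while the others are discarded.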
October 18, 2019
We apply machine learning to the problem of finding numerical Calabi-Yau metrics. Building on Donaldson's algorithm for calculating balanced metrics on K\"ahler manifolds, we combine conventional curve fitting and machine-learning techniques to numerically approximate Ricci-flat metrics. We show that machine learning is able to predict the Calabi-Yau metric and quantities associated with it, such as its determinant, having seen only a small sample of training data. Using this...
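Donaldson's T-operator iteration, which this approach builds on, can be sketched in a toy setting: sections of $O(k)$ on $P^1$, where the balanced metric is known in closed form. The restriction to diagonal section metrics and the Monte Carlo integration are simplifications for illustration.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
k = 3                     # degree of O(k) on P^1; N = k+1 sections s_a = z^a
N = k + 1
npts = 200_000

# Monte Carlo points on P^1 drawn from the Fubini-Study measure:
# with u ~ Uniform(0,1), t = u/(1-u) has density 1/(1+t)^2 in t = |z|^2.
u = rng.random(npts)
t = u / (1.0 - u)

mono = t[:, None] ** np.arange(N)   # |s_a|^2 = |z|^{2a}; phases drop out
                                    # for a diagonal metric on sections

def T(h_diag):
    """Donaldson T-operator, restricted to diagonal metrics h^{aa}."""
    D = mono @ h_diag                              # sum_a h^{aa} |z|^{2a}
    return N * (mono / D[:, None]).mean(axis=0)

h = np.ones(N)                      # initial guess for the section metric
for _ in range(30):
    h = 1.0 / T(h)                  # iterate h -> T(h)^{-1}
    h *= N / h.sum()                # fix the irrelevant overall scale

# On P^1 the balanced metric is given by binomial coefficients, which
# reproduce the Fubini-Study Kahler potential (1/k) log (1+|z|^2)^k.
balanced = np.array([comb(k, a) for a in range(N)], dtype=float)
balanced *= N / balanced.sum()
print(np.max(np.abs(h - balanced)))   # small Monte Carlo error
```

The fixed point of the iteration is the balanced metric; in the paper's setting the same iteration runs on a Calabi-Yau hypersurface, where no closed form is available and machine learning supplements the numerics.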
December 8, 2020
We use machine learning to approximate Calabi-Yau and SU(3)-structure metrics, including for the first time complex structure moduli dependence. Our new methods furthermore improve existing numerical approximations in terms of accuracy and speed. Knowing these metrics has numerous applications, ranging from computations of crucial aspects of the effective field theory of string compactifications, such as the canonical normalizations for Yukawa couplings, and the massive string...
June 11, 2017
We employ machine learning techniques to investigate the volume minimum of Sasaki-Einstein base manifolds of non-compact toric Calabi-Yau 3-folds. We find that the minimum volume can be approximated via a second order multiple linear regression on standard topological quantities obtained from the corresponding toric diagram. The approximation improves further after invoking a convolutional neural network with the full toric diagram of the Calabi-Yau 3-folds as the input. We a...
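A second-order multiple linear regression of the kind described can be sketched on synthetic data; the feature names and the quadratic target are stand-ins for the actual toric quantities and minimum volumes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-ins for topological quantities read off a toric diagram
# (e.g. counts of lattice points, vertices, area); the target plays the role
# of the minimum Sasaki-Einstein volume and is quadratic by construction.
n = 3000
f = rng.integers(1, 20, size=(n, 3)).astype(float)
y = (0.3 * f[:, 0] + 0.1 * f[:, 1] * f[:, 2] - 0.02 * f[:, 0] ** 2
     + 0.05 * rng.normal(size=n))

# Second-order design matrix: 1, the features, and all pairwise products.
cols = [np.ones(n)] + [f[:, i] for i in range(3)]
for i in range(3):
    for j in range(i, 3):
        cols.append(f[:, i] * f[:, j])
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
print(np.abs(pred - y).mean())   # close to the 0.05 noise level
```

"Second order" here means the regression is linear in an augmented feature set that includes all degree-two monomials of the original quantities, which is why a plain least-squares solve suffices.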
December 12, 2021
Equipped with a diverse set of tools from the machine-learning toolbox, we revisit the classic database of weighted-P4s which admit Calabi-Yau 3-fold hypersurfaces. Unsupervised techniques identify an unanticipated almost linear dependence of the topological data on the weights. This then allows us to identify a previously unnoticed clustering in the Calabi-Yau data. Supervised techniques are successful in predicting the topological parameters of the hypersurface from its weig...
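How an unsupervised method exposes an almost linear dependence can be sketched with principal component analysis on synthetic data; the weights, the "topological" columns, and the planted linear relation are all stand-ins for the actual database.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical stand-in for the weighted-P4 database: five weights per space
# plus "topological" columns that are almost linear in the weights.
n = 1000
w = rng.integers(1, 50, size=(n, 5)).astype(float)
topo = np.column_stack([w.sum(axis=1), w @ np.array([1.0, 2, 3, 4, 5])])
topo += 0.5 * rng.normal(size=topo.shape)   # small nonlinear residue

# Principal component analysis via SVD of the centred data matrix.
X = np.column_stack([w, topo])
X = X - X.mean(axis=0)
s = np.linalg.svd(X, compute_uv=False)
explained = s**2 / (s**2).sum()

# Seven columns but only ~five significant components: the topological
# columns are (almost) linear functions of the weights.
print(explained[5:].sum())
```

The signature of an almost linear dependence is a rank deficit: the trailing principal components carry a negligible share of the variance.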
December 7, 2020
We present a simple phenomenological formula which approximates the hyperbolic volume of a knot using only a single evaluation of its Jones polynomial at a root of unity. The average error is just $2.86$% on the first $1.7$ million knots, which represents a large improvement over previous formulas of this kind. To find the approximation formula, we use layer-wise relevance propagation to reverse engineer a black box neural network which achieves a similar average error for th...
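The shape of such a fit can be sketched on toy data. The specific root of unity, the coefficient arrays, and the planted linear relation below are assumptions of this illustration, not the paper's actual formula or knot tables.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data: "Jones polynomials" as integer coefficient arrays and synthetic
# "volumes" correlated with log|J| at a root of unity.
def eval_at(coeffs, q):
    return sum(c * q**k for k, c in enumerate(coeffs))

omega = np.exp(2j * np.pi / 8)    # an 8th root of unity (illustrative choice)
knots = [rng.integers(-3, 4, size=rng.integers(4, 10)) for _ in range(500)]
x = np.array([np.log(abs(eval_at(c, omega)) + 1e-9) for c in knots])
vol = 3.0 * x + 2.0 + 0.1 * rng.normal(size=len(x))

# Fit the two-parameter ansatz V ~ a*log|J(omega)| + b, the kind of simple
# formula the paper distills from its network via relevance propagation.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, vol, rcond=None)
resid = np.abs(A @ np.array([a, b]) - vol).mean()
print(a, b, resid)
```

Each "knot" costs a single polynomial evaluation, which is what makes a formula of this shape so much cheaper than computing hyperbolic volumes directly.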