Similar papers 2
June 8, 2017
We propose a paradigm to deep-learn the ever-expanding databases which have emerged in mathematical physics and particle phenomenology, as diverse as the statistics of string vacua or combinatorial and algebraic geometry. As concrete examples, we establish multi-layer neural networks as both classifiers and predictors and train them with a host of available data ranging from Calabi-Yau manifolds and vector bundles, to quiver representations for gauge theories. We find that ev...
July 3, 2017
We utilize machine learning to study the string landscape. Deep data dives and conjecture generation are proposed as useful frameworks for utilizing machine learning in the landscape, and examples of each are presented. A decision tree accurately predicts the number of weak Fano toric threefolds arising from reflexive polytopes, each of which determines a smooth F-theory compactification, and linear regression generates a previously proven conjecture for the gauge group rank ...
July 30, 2020
We revisit the question of predicting both Hodge numbers $h^{1,1}$ and $h^{2,1}$ of complete intersection Calabi-Yau (CICY) 3-folds using machine learning (ML), considering both the old and new datasets built respectively by Candelas-Dale-Lutken-Schimmrigk / Green-H\"ubsch-Lutken and by Anderson-Gao-Gray-Lee. In real-world applications, implementing a ML system rarely reduces to feeding the raw data to the algorithm. Instead, the typical workflow starts with an exploratory dat...
March 7, 2019
Supervised machine learning can be used to predict properties of string geometries with previously unknown features. Using the complete intersection Calabi-Yau (CICY) threefold dataset as a theoretical laboratory for this investigation, we use low $h^{1,1}$ geometries for training and validate on geometries with large $h^{1,1}$. Neural networks and Support Vector Machines successfully predict trends in the number of K\"ahler parameters of CICY threefolds. The numerical accura...
April 19, 2018
We apply machine learning techniques to solve a specific classification problem in 4D F-theory. For a divisor $D$ on a given complex threefold base, we want to read out the non-Higgsable gauge group on it using local geometric information near $D$. The input features are the triple intersection numbers among divisors near $D$ and the output label is the non-Higgsable gauge group. We use decision trees to solve this problem and achieve 85%-98% out-of-sample accuracies for diff...
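The classification setup described above — integer geometric features in, a discrete label out — can be sketched with a minimal hand-rolled decision tree. Everything below is synthetic: the features stand in for triple intersection numbers and the label for a gauge-group class; the hidden rule is hypothetical, not taken from the paper.

```python
import numpy as np

def gini(labels):
    """Gini impurity of a label array."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Find the (feature, threshold) pair minimising weighted Gini impurity."""
    best = (None, None, gini(y))
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left, right = y[X[:, j] <= t], y[X[:, j] > t]
            if len(left) == 0 or len(right) == 0:
                continue
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if score < best[2]:
                best = (j, t, score)
    return best

def fit_tree(X, y, depth=3):
    """Recursively grow a decision tree, returned as nested dicts."""
    if depth == 0 or len(np.unique(y)) == 1:
        vals, counts = np.unique(y, return_counts=True)
        return vals[np.argmax(counts)]            # leaf: majority label
    j, t, _ = best_split(X, y)
    if j is None:                                 # no split improves impurity
        vals, counts = np.unique(y, return_counts=True)
        return vals[np.argmax(counts)]
    mask = X[:, j] <= t
    return {"feat": j, "thr": t,
            "lo": fit_tree(X[mask], y[mask], depth - 1),
            "hi": fit_tree(X[~mask], y[~mask], depth - 1)}

def predict(tree, x):
    while isinstance(tree, dict):
        tree = tree["lo"] if x[tree["feat"]] <= tree["thr"] else tree["hi"]
    return tree

# Toy data standing in for intersection-number features and gauge-group labels.
rng = np.random.default_rng(0)
X = rng.integers(-5, 6, size=(200, 4))
y = (X[:, 0] + 2 * X[:, 1] > 0).astype(int)       # hypothetical rule to learn
tree = fit_tree(X, y)
acc = np.mean([predict(tree, x) == t for x, t in zip(X, y)])
```

In practice one would use an off-the-shelf implementation with cross-validation; the point here is only that axis-aligned threshold tests on integer features are a natural fit for this kind of discrete geometric data.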
September 7, 2018
Different techniques from machine learning are applied to the problem of computing line bundle cohomologies of (hypersurfaces in) toric varieties. While a naive approach of training a neural network to reproduce the cohomologies fails in the general case, by inspecting the underlying functional form of the data we propose a second approach. The cohomologies depend in a piecewise polynomial way on the line bundle charges. We use unsupervised learning to separate the different ...
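The two-step strategy — first separate the phases, then fit a polynomial on each — can be illustrated on a case where the answer is known exactly: $h^0(\mathbb{P}^2, \mathcal{O}(q)) = (q+1)(q+2)/2$ for $q \ge 0$ and $0$ otherwise. The "unsupervised" separation below is just the vanishing locus; on real multi-charge data the paper's clustering step plays that role.

```python
import numpy as np

# Piecewise-polynomial cohomology data: h^0(P^2, O(q)).
q = np.arange(-6, 7)
h0 = np.where(q >= 0, (q + 1) * (q + 2) // 2, 0)

# Step 1: separate the two phases (here, trivially, by vanishing).
phase = h0 > 0

# Step 2: fit a polynomial on the non-trivial phase; the exact answer
# is h^0 = 0.5 q^2 + 1.5 q + 1.
coeffs = np.polyfit(q[phase], h0[phase], deg=2)
```

Because the data on each phase is exactly polynomial, the fit recovers the coefficients to machine precision, which is why fitting per region succeeds where a single global regressor struggles.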
April 17, 2024
Gaussian process regression, kernel support vector regression, random forest, extreme gradient boosting, and generalized linear model algorithms are applied to data of complete intersection Calabi-Yau 3-folds. It is shown that Gaussian process regression is the most suitable for learning the Hodge number $h^{2,1}$ in terms of $h^{1,1}$. The performance of this regression algorithm is such that the coefficient of determination for the validation set is $R^2 = 0.9999999995$ ...
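A minimal Gaussian process regression can be written in a few lines of NumPy: the posterior mean is a kernel-weighted solve against the training targets. The data below is a synthetic smooth trend standing in for $h^{2,1}$ as a function of $h^{1,1}$, not the actual CICY values.

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, length=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP with an RBF kernel and noise jitter."""
    K = rbf(x_train, x_train, length) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train, length)
    return K_s @ np.linalg.solve(K, y_train)

# Synthetic stand-in: a smooth trend h^{2,1}(h^{1,1}) to be interpolated.
x = np.arange(1, 11, dtype=float)
y = 100.0 - 3.0 * x + np.sin(x)
mean = gp_predict(x, y, x, length=2.0)
r2 = 1.0 - np.sum((y - mean) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

With a small noise term the posterior mean nearly interpolates the training points, which is the regime in which near-unity $R^2$ scores like the one quoted above arise.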
January 5, 2022
We take a novel perspective on long-established classification problems in general relativity by adopting fruitful techniques from machine learning and modern data science. In particular, we model Petrov's classification of spacetimes, and show that a feed-forward neural network can achieve a high degree of success. We also show how data visualization techniques with dimensionality reduction can help analyze the underlying patterns in the structure of the different types of...
April 27, 2022
We present a statistical approach for the discovery of relationships between mathematical entities that is based on linear regression and deep learning with fully connected artificial neural networks. The strategy is applied to computational knot data and empirical connections between combinatorial and hyperbolic knot invariants are revealed.
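The linear-regression half of the strategy reduces to an ordinary least-squares fit between invariants. The sketch below uses fabricated data with a hypothetical linear relation between a stand-in "hyperbolic volume" and a stand-in combinatorial invariant; no real knot data or discovered relationship is reproduced here.

```python
import numpy as np

# Fabricated invariant pairs with a planted linear relation plus noise.
rng = np.random.default_rng(1)
volume = rng.uniform(2.0, 20.0, size=100)                  # stand-in invariant
crossings = 0.7 * volume + 3.0 + rng.normal(0, 0.1, 100)   # planted relation

# Ordinary least squares with an intercept column.
A = np.column_stack([volume, np.ones_like(volume)])
(w, b), *_ = np.linalg.lstsq(A, crossings, rcond=None)
```

Recovering the planted slope and intercept from noisy samples is exactly the kind of empirical relationship-discovery step described above; the deep-learning variant replaces the linear model with a fully connected network.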
November 8, 2018
The success of modern Artificial Intelligence (AI) technologies depends critically on the ability to learn non-linear functional dependencies from large, high dimensional data sets. Despite recent high-profile successes, empirical evidence indicates that the high predictive performance is often paired with low robustness, making AI systems potentially vulnerable to adversarial attacks. In this report, we provide a simple intuitive argument suggesting that high performance and...