October 3, 2019
The critical locus of the loss function of a neural network is determined by the geometry of the functional space and by the parameterization of this space by the network's weights. We introduce a natural distinction between pure critical points, which only depend on the functional space, and spurious critical points, which arise from the parameterization. We apply this perspective to revisit and extend the literature on the loss function of linear neural networks. For this t...
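The pure/spurious distinction can be seen already in the smallest linear network. Below is an illustrative sketch (not from the paper) for the scalar network $y = w_2 w_1 x$ with squared loss and target map $x \mapsto a x$: in function space the loss depends only on the product $p = w_2 w_1$ and has a single critical point at $p = a$, while the origin $(w_1, w_2) = (0, 0)$ is an extra critical point created purely by the parameterization, a spurious saddle.

```python
import numpy as np

# Toy example of pure vs. spurious critical points for the one-dimensional
# linear network y = w2 * w1 * x fit to the target x -> a*x.
# All names and the choice a = 1.0 are illustrative assumptions.

a = 1.0

def loss(w1, w2):
    # Squared loss in function space depends only on the product w2*w1.
    return 0.5 * (w2 * w1 - a) ** 2

def grad(w1, w2):
    # Gradient with respect to the weights (the parameterization).
    r = w2 * w1 - a
    return np.array([r * w2, r * w1])

# Pure critical points: the hyperbola w1*w2 = a (global minima, loss 0).
# Spurious critical point: the origin, where the gradient vanishes even
# though the loss there is not minimal.
g_origin = grad(0.0, 0.0)
lower_nearby = loss(0.1, 0.1) < loss(0.0, 0.0)  # the origin is a saddle
```

The origin is critical only because the product map $(w_1, w_2) \mapsto w_1 w_2$ has a singular differential there; it disappears if one works directly in function space.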
September 21, 2022
Generalized Complete Intersection Calabi-Yau manifolds (gCICYs) are a recently established construction of Calabi-Yau manifolds. However, generating new gCICYs using the standard algebraic method is very laborious. Owing to this complexity, the number of gCICYs and their classification remain unknown. In this paper, we try to make some progress in this direction using neural networks. The results show that our trained models can attain high precision on the existing ...
July 19, 2024
In this work, we perform a comprehensive study of machine learning (ML) methods for characterising the quantum set of correlations. As our main goal is to assess the usefulness and effectiveness of the ML approach, we focus exclusively on the CHSH scenario: both the 4-dimensional variant, for which an analytical solution is known, and the 8-dimensional variant, for which no analytical solution is known, but numerical approaches are relatively well under...
November 8, 2018
The success of modern Artificial Intelligence (AI) technologies depends critically on the ability to learn non-linear functional dependencies from large, high-dimensional data sets. Despite recent high-profile successes, empirical evidence indicates that high predictive performance is often paired with low robustness, making AI systems potentially vulnerable to adversarial attacks. In this report, we provide a simple intuitive argument suggesting that high performance and...
November 30, 2021
We use deep neural networks to machine learn correlations between knot invariants in various dimensions. The three-dimensional invariant of interest is the Jones polynomial $J(q)$, and the four-dimensional invariants are the Khovanov polynomial $\text{Kh}(q,t)$, smooth slice genus $g$, and Rasmussen's $s$-invariant. We find that a two-layer feed-forward neural network can predict $s$ from $\text{Kh}(q,-q^{-4})$ with greater than $99\%$ accuracy. A theoretical explanation for ...
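A two-layer feed-forward network of the kind named in the abstract can be sketched in a few lines. The sketch below is a minimal NumPy implementation trained on synthetic stand-in data; the input dimension, hidden width, and the toy target are illustrative assumptions, not the actual coefficient encoding of $\text{Kh}(q,-q^{-4})$ or the real $s$-invariant data.

```python
import numpy as np

# Minimal two-layer (one hidden layer) feed-forward network with full-batch
# gradient descent on mean squared error. The data below is a synthetic
# stand-in for "coefficient vector -> integer invariant" regression.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def forward(params, X):
    W1, b1, W2, b2 = params
    return relu(X @ W1 + b1) @ W2 + b2

def mse(params, X, y):
    return float(((forward(params, X) - y) ** 2).mean())

def init(n_in, n_hidden, n_out=1):
    return [rng.normal(0, 0.3, (n_in, n_hidden)), np.zeros(n_hidden),
            rng.normal(0, 0.3, (n_hidden, n_out)), np.zeros(n_out)]

def train(params, X, y, lr=0.1, steps=3000):
    W1, b1, W2, b2 = [p.copy() for p in params]
    n = len(X)
    for _ in range(steps):
        H = relu(X @ W1 + b1)
        err = H @ W2 + b2 - y              # gradient of 0.5*MSE at the output
        dH = (err @ W2.T) * (H > 0)        # backprop through the ReLU
        W2 -= lr * H.T @ err / n;  b2 -= lr * err.mean(0)
        W1 -= lr * X.T @ dH / n;   b1 -= lr * dH.mean(0)
    return [W1, b1, W2, b2]

# Toy data: an 8-dimensional feature vector mapped to a +/-1 target,
# mimicking regression of a discrete invariant from polynomial coefficients.
X = rng.normal(size=(200, 8))
y = np.where(X[:, :2].sum(1, keepdims=True) > 0, 1.0, -1.0)

params0 = init(8, 32)
params = train(params0, X, y)
acc = float((np.sign(forward(params, X)) == y).mean())
```

Rounding (or taking the sign of) the network's real-valued output to the nearest admissible invariant value is one natural way to turn such a regression into the accuracy figure quoted in the abstract.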
December 16, 2021
Using a fully connected feedforward neural network we study topological invariants of a class of Calabi--Yau manifolds constructed as hypersurfaces in toric varieties associated with reflexive polytopes from the Kreuzer--Skarke database. In particular, we find the existence of a simple expression for the Euler number that can be learned in terms of limited data extracted from the polytope and its dual.
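The search for a "simple expression" learnable from limited polytope data can be illustrated with ordinary least squares: when the invariant really is a simple (here, linear) combination of a few features, a direct fit recovers it exactly. The features and synthetic data below are illustrative assumptions, not the Kreuzer--Skarke data used in the paper.

```python
import numpy as np

# Hedged sketch: testing whether a topological invariant admits a simple
# linear expression in a few features extracted from a polytope and its dual.
# The three toy features and the "invariant" below are synthetic.

rng = np.random.default_rng(1)

# Toy features standing in for, e.g., lattice-point counts of a polytope
# and of its dual.
features = rng.integers(1, 50, size=(100, 3)).astype(float)

# Synthetic "invariant" that is exactly a linear combination of the features.
true_coeffs = np.array([2.0, -2.0, 1.0])
target = features @ true_coeffs

# Ordinary least squares recovers the expression whenever one exists;
# a near-zero residual is the signal that a simple formula has been found.
coeffs, residuals, rank, _ = np.linalg.lstsq(features, target, rcond=None)
```

In practice one would compare the residual of such fits against the accuracy of a trained network: if a small feature set already fits exactly, the network has plausibly learned that same simple expression.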
February 26, 2018
We seek to determine a real algebraic variety from a fixed finite subset of points. Existing methods are studied and new methods are developed. Our focus lies on aspects of topology and algebraic geometry, such as dimension and defining polynomials. All algorithms are tested on a range of datasets and made available in a Julia package.
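One standard idea for recovering defining polynomials from a finite point sample is to evaluate all monomials up to a fixed degree at the points and read candidate equations off the numerical kernel of the resulting matrix. The sketch below (an illustration of that general idea, not the Julia package from the abstract) recovers $x^2 + y^2 - 1$ from samples on the unit circle.

```python
import numpy as np

# Recovering a defining polynomial of a variety from finitely many points:
# build the matrix of degree-<=2 monomials evaluated at the samples and
# take the right-singular vector of the smallest singular value.

rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 30)
pts = np.column_stack([np.cos(theta), np.sin(theta)])  # points on x^2+y^2=1

def monomials_deg2(x, y):
    # Monomial basis: 1, x, y, x^2, x*y, y^2
    return np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

M = monomials_deg2(pts[:, 0], pts[:, 1])
U, s, Vt = np.linalg.svd(M)
kernel_vec = Vt[-1]                      # numerical kernel of M
kernel_vec = kernel_vec / kernel_vec[3]  # normalize the x^2 coefficient
# kernel_vec now encodes -1 + x^2 + y^2, i.e. the circle equation.
```

The smallest singular value also serves as a noise-robust test of whether *any* polynomial of the chosen degree vanishes on the sample, which connects to the dimension and degree questions the abstract raises.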
December 10, 2019
In this paper we describe the birational geometry of Fano double spaces $V\stackrel{\sigma}{\to}{\mathbb P}^{M+1}$ of index 2 and dimension $\geqslant 8$ with at most quadratic singularities of rank $\geqslant 8$, satisfying certain additional conditions of general position: we prove that these varieties have no structures of a rationally connected fibre space over a base of dimension $\geqslant 2$, that every birational map $\chi\colon V\dashrightarrow V'$ onto the total spac...
November 17, 2011
In this work we provide effective bounds and classification results for rational $\mathbb{Q}$-factorial Fano varieties with a complexity-one torus action and Picard number one, in terms of the invariants dimension and Picard index. This complements earlier work by Hausen, Süß and the author, where the case of a free divisor class group of rank one was treated.
June 24, 2020
Parameterized systems of polynomial equations arise in many applications in science and engineering with the real solutions describing, for example, equilibria of a dynamical system, linkages satisfying design constraints, and scene reconstruction in computer vision. Since different parameter values can have a different number of real solutions, the parameter space is decomposed into regions whose boundary forms the real discriminant locus. This article views locating the rea...
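The decomposition described here is easy to see in the simplest parameterized family. As an illustrative toy example (not the article's method), take $x^2 + bx + c = 0$ with parameters $(b, c)$: the real discriminant locus is the parabola $b^2 - 4c = 0$, and labeling parameter points by their number of real solutions is exactly the classification problem the abstract describes.

```python
import numpy as np

# Toy illustration of the real discriminant locus: for x^2 + b*x + c the
# parameter plane splits into a region with 2 real roots (b^2 - 4c > 0),
# a region with 0 real roots (b^2 - 4c < 0), and the boundary parabola
# itself (1 repeated real root). Labeling a grid of parameters by root
# count exhibits the decomposition as a classification problem.

def n_real_roots(b, c, tol=1e-9):
    disc = b * b - 4 * c
    if disc > tol:
        return 2
    if disc < -tol:
        return 0
    return 1  # on (or numerically near) the discriminant locus

bs, cs = np.meshgrid(np.linspace(-2, 2, 41), np.linspace(-2, 2, 41))
labels = np.vectorize(n_real_roots)(bs, cs)
```

A classifier trained on such labeled samples implicitly learns the discriminant locus as its decision boundary, which is the viewpoint the article develops for general polynomial systems.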