ID: 2108.02221

Deep multi-task mining Calabi-Yau four-folds

August 4, 2021


Similar papers 5

The Unreasonable Effectiveness of Deep Learning in Artificial Intelligence

February 12, 2020

84% Match
Terrence J. Sejnowski
Neurons and Cognition
Artificial Intelligence
Machine Learning
Neural and Evolutionary Computing

Deep learning networks have been trained to recognize speech, caption photographs and translate text between languages at high levels of performance. Although applications of deep learning networks to real world problems have become ubiquitous, our understanding of why they are so effective is lacking. These empirical results should not be possible according to sample complexity in statistics and non-convex optimization theory. However, paradoxes in the training and effective...


Machine Learning Algebraic Geometry for Physics

April 21, 2022

83% Match
Jiakang Bao, Yang-Hui He, ... , Edward Hirst
Algebraic Geometry
Machine Learning

We review some recent applications of machine learning to algebraic geometry and physics. Since problems in algebraic geometry can typically be reformulated as mappings between tensors, this makes them particularly amenable to supervised learning. Additionally, unsupervised methods can provide insight into the structure of such geometrical data. At the heart of this programme is the question of how geometry can be machine learned, and indeed how AI helps one to do mathematics...


Learning knot invariants across dimensions

November 30, 2021

83% Match
Jessica Craven, Mark Hughes, ... , Arjun Kar
Machine Learning
Geometric Topology

We use deep neural networks to machine learn correlations between knot invariants in various dimensions. The three-dimensional invariant of interest is the Jones polynomial $J(q)$, and the four-dimensional invariants are the Khovanov polynomial $\text{Kh}(q,t)$, smooth slice genus $g$, and Rasmussen's $s$-invariant. We find that a two-layer feed-forward neural network can predict $s$ from $\text{Kh}(q,-q^{-4})$ with greater than $99\%$ accuracy. A theoretical explanation for ...
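
The two-layer feed-forward setup described in this abstract can be sketched in miniature. The data below is synthetic (inputs standing in for evaluated Khovanov-polynomial coefficients, the target standing in for the $s$-invariant), and the layer sizes, learning rate, and training loop are illustrative assumptions, not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 10 input features per example, one scalar target.
X = rng.normal(size=(256, 10))
y = (X @ rng.normal(size=10)).reshape(-1, 1)  # a learnable target

# Two-layer feed-forward network: input -> hidden (ReLU) -> output.
W1 = rng.normal(scale=0.1, size=(10, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.1, size=(32, 1));  b2 = np.zeros(1)

def forward(X):
    h = np.maximum(X @ W1 + b1, 0.0)   # hidden-layer activations
    return h, h @ W2 + b2              # network prediction

lr = 1e-2
losses = []
for _ in range(500):
    h, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Manual backpropagation through both layers (mean-squared error).
    dpred = 2 * err / len(X)
    dW2 = h.T @ dpred;            db2 = dpred.sum(0)
    dh = (dpred @ W2.T) * (h > 0)
    dW1 = X.T @ dh;               db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])  # training loss before vs. after
```

On this toy regression the loss drops steadily; the paper's reported >99% accuracy is of course a property of the real invariant data, not of this sketch.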


The loss surface of deep linear networks viewed through the algebraic geometry lens

October 17, 2018

83% Match
Dhagash Mehta, Tianran Chen, ... , Jonathan D. Hauenstein
Machine Learning
Algebraic Geometry

Using the viewpoint of modern computational algebraic geometry, we explore properties of the optimization landscapes of deep linear neural network models. After clarifying the various definitions of "flat" minima, we show that geometrically flat minima, which are merely artifacts of residual continuous symmetries of deep linear networks, can be straightforwardly removed by a generalized $L_2$ regularization. Then, we establish upper bounds on the number of i...
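
The continuous symmetry this abstract refers to is easy to see directly: in a depth-2 linear network $f(x) = W_2^\top W_1^\top x$, rescaling $(W_1, W_2) \to (a W_1, W_2/a)$ leaves the function, and hence the data loss, unchanged, while an $L_2$ penalty is not invariant. The network sizes and data below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fixed data set and a depth-2 linear network X @ W1 @ W2.
X = rng.normal(size=(50, 4))
Y = rng.normal(size=(50, 3))
W1 = rng.normal(size=(4, 5))
W2 = rng.normal(size=(5, 3))

def data_loss(W1, W2):
    return float(np.mean((X @ W1 @ W2 - Y) ** 2))

def l2(W1, W2):
    return float(np.sum(W1 ** 2) + np.sum(W2 ** 2))

# The continuous symmetry (W1, W2) -> (a*W1, W2/a) leaves the network
# function unchanged, so the data loss is flat along this direction...
a = 3.0
assert np.isclose(data_loss(W1, W2), data_loss(a * W1, W2 / a))

# ...but the L2 term is not invariant under the rescaling, so adding it
# to the loss removes these geometrically flat directions.
print(l2(W1, W2), l2(a * W1, W2 / a))
```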


Why & When Deep Learning Works: Looking Inside Deep Learnings

May 10, 2017

83% Match
Ronny Ronen
Machine Learning

The Intel Collaborative Research Institute for Computational Intelligence (ICRI-CI) has been heavily supporting Machine Learning and Deep Learning research since its founding in 2012. We have asked six leading ICRI-CI Deep Learning researchers to address the challenge of "Why & When Deep Learning works", with the goal of looking inside Deep Learning, providing insights on how deep networks function, and uncovering key observations on their expressiveness, limitations, and po...


Dive into Layers: Neural Network Capacity Bounding using Algebraic Geometry

September 3, 2021

82% Match
Ji Yang, Lu Sang, Daniel Cremers
Machine Learning
Neural and Evolutionary Computing

Empirical results suggest that the learnability of a neural network is directly related to its size. To prove this mathematically, we borrow a tool from algebraic topology, Betti numbers, to measure the topological complexity of the input data and of the neural network. By characterizing the expressive capacity of a neural network via its topological complexity, we conduct a thorough analysis and show that the network's expressive capacity is limited by the scale of its...


Classifying divisor topologies for string phenomenology

May 11, 2022

82% Match
Pramod Shukla
High Energy Physics - Theory

In this article we present a pheno-inspired classification of the divisor topologies of the favorable Calabi-Yau (CY) threefolds with $1 \leq h^{1,1}(CY) \leq 5$ arising from the four-dimensional reflexive polytopes of the Kreuzer-Skarke database. Based on some empirical observations, we conjecture that the topologies of the so-called coordinate divisors can be classified into two categories: (i) $\chi_{_h}(D) \geq 1$ with Hodge numbers given by $\{h^{0,0} = 1, \, h^{1,0} = ...


Illuminating new and known relations between knot invariants

November 2, 2022

82% Match
Jessica Craven, Mark Hughes, ... , Arjun Kar
Geometric Topology

We automate the process of machine learning correlations between knot invariants. For nearly 200,000 distinct sets of input knot invariants together with an output invariant, we attempt to learn the output invariant by training a neural network on the input invariants. Correlation between invariants is measured by the accuracy of the neural network prediction, and bipartite or tripartite correlations are sequentially filtered from the input invariant sets so that experiments ...


Hodge Numbers for CICYs with Symmetries of Order Divisible by 4

November 3, 2015

82% Match
Philip Candelas, Andrei Constantin, Challenger Mishra
Algebraic Geometry

We compute the Hodge numbers for the quotients of complete intersection Calabi-Yau three-folds by groups of orders divisible by 4. We make use of the polynomial deformation method and the counting of invariant K\"ahler classes. The quotients studied here have been obtained in the automated classification of V. Braun. Although the computer search found the freely acting groups, the Hodge numbers of the quotients were not calculated. The freely acting groups, $G$, that arise in...


Learning Group Invariant Calabi-Yau Metrics by Fundamental Domain Projections

July 9, 2024

82% Match
Yacoub Hendi, Magdalena Larfors, Moritz Walden
Mathematical Physics

We present new invariant machine learning models that approximate the Ricci-flat metric on Calabi-Yau (CY) manifolds with discrete symmetries. We accomplish this by combining the $\phi$-model of the cymetric package with non-trainable, $G$-invariant, canonicalization layers that project the $\phi$-model's input data (i.e. points sampled from the CY geometry) to the fundamental domain of a given symmetry group $G$. These $G$-invariant layers are easy to concatenate, provided o...
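
The canonicalization idea in this abstract, projecting each input point to a fixed representative of its $G$-orbit so that everything downstream is automatically $G$-invariant, can be illustrated with a toy group. Here $G$ is $\mathbb{Z}_4$ acting by cyclic coordinate shifts, and the lexicographic tie-break rule is an illustrative assumption, not the cymetric package's actual construction.

```python
import numpy as np

def orbit(x):
    """All images of x under cyclic shifts of its coordinates (Z_n action)."""
    return [np.roll(x, k) for k in range(len(x))]

def canonicalize(x):
    """Non-trainable canonicalization layer: map x to the lexicographically
    smallest point in its G-orbit (a choice of fundamental-domain representative)."""
    return min(orbit(x), key=lambda p: tuple(p))

rng = np.random.default_rng(2)
x = rng.normal(size=4)

# Invariance check: every point in the orbit canonicalizes to the same
# representative, so any model applied after this layer is G-invariant.
reps = [canonicalize(g_x) for g_x in orbit(x)]
assert all(np.array_equal(reps[0], r) for r in reps)
print(reps[0])
```

Because the layer has no trainable parameters, it can be prepended to any point-based model without changing its training procedure.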
