ID: 2012.04797

Numerical Calabi-Yau metrics from holomorphic networks

December 9, 2020


Similar papers (page 4)

Degenerations of Calabi-Yau metrics

October 7, 2010

85% Match
Valentino Tosatti
Differential Geometry
Algebraic Geometry

This is a survey of our recent work on degenerations of Ricci-flat Kahler metrics on compact Calabi-Yau manifolds with Kahler classes approaching the boundary of the Kahler cone.


Riemannian metrics for neural networks I: feedforward networks

March 4, 2013

85% Match
Yann Ollivier
Neural and Evolutionary Computing
Information Theory
Machine Learning
Differential Geometry

We describe four algorithms for neural network training, each adapted to different scalability constraints. These algorithms are mathematically principled and invariant under a number of transformations in data and network representation, so that performance does not depend on these choices. The algorithms are obtained from the setting of differential geometry, and are based either on the natural gradient using the Fisher information matrix, or on Hessian methods, scaled down in a sp...
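
The natural-gradient update mentioned above preconditions the ordinary gradient by the inverse Fisher information matrix. As a rough illustration (not the paper's algorithms, which are designed with scalability in mind), here is a damped natural-gradient step for logistic regression using an empirical Fisher matrix; all names and values are illustrative.

```python
# Minimal natural-gradient step for logistic regression (illustration only;
# the paper's algorithms are more elaborate and scale to large networks).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                    # inputs
y = (X @ rng.normal(size=5) > 0).astype(float)   # binary labels
theta = np.zeros(5)

def per_sample_grads(theta):
    p = 1.0 / (1.0 + np.exp(-X @ theta))         # predicted probabilities
    return (p - y)[:, None] * X                  # per-sample gradient of the NLL

for _ in range(100):
    G = per_sample_grads(theta)
    g = G.mean(axis=0)                           # average gradient
    F = G.T @ G / len(X)                         # empirical Fisher matrix
    # Damped natural-gradient step: precondition by the regularized inverse Fisher.
    theta -= 0.1 * np.linalg.solve(F + 1e-3 * np.eye(5), g)
```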


Calabi-Yau Spaces in the String Landscape

June 30, 2020

85% Match
Yang-Hui He
Algebraic Geometry

Calabi-Yau spaces, or Kahler spaces admitting zero Ricci curvature, have played a pivotal role in theoretical physics and pure mathematics for the last half-century. In physics, they constituted the first and natural solution to compactification of superstring theory to our 4-dimensional universe, primarily due to one of their equivalent definitions being the admittance of covariantly constant spinors. Since the mid-1980s, physicists and mathematicians have joined forces in c...


Numerical Calabi-Yau metrics

December 11, 2006

85% Match
Michael R. Douglas, Robert L. Karp, ... , René Reinbacher
High Energy Physics - Theory

We develop numerical methods for approximating Ricci-flat metrics on Calabi-Yau hypersurfaces in projective spaces. Our approach is based on finding balanced metrics, and builds on recent theoretical work by Donaldson. We illustrate our methods in detail for a one-parameter family of quintics. We also suggest several ways to extend our results.
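
The balanced-metric approach cited here rests on Donaldson's T-operator: starting from any hermitian metric on a basis of holomorphic sections, one iterates the T-map to its fixed point, the balanced metric. The following is a schematic NumPy sketch of that iteration, assuming the section values and integration weights at sample points have already been computed; it is not the authors' code.

```python
# Schematic Donaldson T-map iteration toward a balanced metric (a sketch under
# strong assumptions, not the authors' implementation). `sections[a, i]` holds
# the value s_i(x_a) of a basis of holomorphic sections at sample point x_a,
# and `weights[a]` the corresponding integration weight.
import numpy as np

def t_map(h_inv, sections, weights):
    """One application of the T-operator; h_inv is the inverse metric h^{ij}."""
    N = sections.shape[1]
    # Kernel K(x_a) = sum_{i,j} h^{ij} s_i(x_a) conj(s_j(x_a))
    kernel = np.einsum('ai,ij,aj->a', sections, h_inv, sections.conj()).real
    # T(h)_{ij} = (N / Vol) * sum_a w_a s_i(x_a) conj(s_j(x_a)) / K(x_a)
    T = np.einsum('a,ai,aj->ij', weights / kernel, sections, sections.conj())
    return T * N / weights.sum()

def balanced_metric(sections, weights, iters=20):
    N = sections.shape[1]
    h_inv = np.eye(N, dtype=complex)             # start from the identity metric
    for _ in range(iters):
        h = t_map(h_inv, sections, weights)      # iterate h_{n+1} = T(h_n)
        h_inv = np.linalg.inv(h)
    return h_inv

# Toy call with random data, purely to exercise the code path:
rng = np.random.default_rng(1)
secs = rng.normal(size=(500, 6)) + 1j * rng.normal(size=(500, 6))
h_inv = balanced_metric(secs, np.ones(500))
```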


Machine Learning on generalized Complete Intersection Calabi-Yau Manifolds

September 21, 2022

85% Match
Wei Cui, Xin Gao, Juntao Wang
Machine Learning

Generalized Complete Intersection Calabi-Yau manifolds (gCICYs) are a recently established construction of Calabi-Yau manifolds. However, generating new gCICYs with standard algebraic methods is very laborious, and because of this complexity their number and classification remain unknown. In this paper, we try to make some progress in this direction using neural networks. The results show that our trained models can have a high precision on the existing ...


On the curvature of the loss landscape

July 10, 2023

84% Match
Alison Pouplin, Hrittik Roy, ... , Georgios Arvanitidis
Machine Learning

One of the main challenges in modern deep learning is to understand why such over-parameterized models perform so well when trained on finite data. A way to analyze this generalization concept is through the properties of the associated loss landscape. In this work, we consider the loss landscape as an embedded Riemannian manifold and show that the differential geometric properties of the manifold can be used when analyzing the generalization abilities of a deep net. In parti...
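
A common entry point to the curvature of a loss landscape is the spectrum of the loss Hessian. The sketch below estimates the largest Hessian eigenvalue of a toy least-squares loss by power iteration on finite-difference Hessian-vector products; it only illustrates this general idea and is not the paper's specific analysis.

```python
# Estimate the top Hessian eigenvalue of a toy loss via power iteration on
# finite-difference Hessian-vector products (illustration only).
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(50, 10))
b = rng.normal(size=50)

def grad(w):
    # Gradient of the least-squares loss 0.5 * ||A w - b||^2
    return A.T @ (A @ w - b)

def hvp(w, v, eps=1e-5):
    # Hessian-vector product via central finite differences of the gradient.
    return (grad(w + eps * v) - grad(w - eps * v)) / (2 * eps)

w = np.zeros(10)
v = rng.normal(size=10)
for _ in range(100):                             # power iteration
    v = hvp(w, v)
    v /= np.linalg.norm(v)
top_eig = v @ hvp(w, v)                          # Rayleigh quotient
print(top_eig, np.linalg.eigvalsh(A.T @ A)[-1])  # should agree for this quadratic loss
```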


A Survey of Geometric Optimization for Deep Learning: From Euclidean Space to Riemannian Manifold

February 16, 2023

84% Match
Yanhong Fei, Xian Wei, Yingjie Liu, ... , Mingsong Chen
Machine Learning

Although Deep Learning (DL) has achieved success in complex Artificial Intelligence (AI) tasks, it suffers from various notorious problems (e.g., feature redundancy, and vanishing or exploding gradients), since updating parameters in Euclidean space cannot fully exploit the geometric structure of the solution space. As a promising alternative solution, Riemannian-based DL uses geometric optimization to update parameters on Riemannian manifolds and can leverage the underlying ...


TensorFlow RiemOpt: a library for optimization on Riemannian manifolds

May 27, 2021

84% Match
Oleg Smirnov
Mathematical Software
Computational Geometry
Machine Learning

The adoption of neural networks and deep learning in non-Euclidean domains has been hindered until recently by the lack of scalable and efficient learning frameworks. Existing toolboxes in this space were mainly motivated by research and education use cases, whereas practical aspects, such as deploying and maintaining machine learning models, were often overlooked. We attempt to bridge this gap by proposing TensorFlow RiemOpt, a Python library for optimization on Riemannian...
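
Rather than reproduce the library's API from memory, the sketch below shows the project-then-retract update that Riemannian optimizers of this kind implement, written in plain NumPy for gradient descent on the unit sphere; it is not the TensorFlow RiemOpt interface.

```python
# Riemannian gradient descent on the unit sphere (plain NumPy illustration;
# not the TensorFlow RiemOpt API).
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(5, 5))
M = (M + M.T) / 2                                # minimize x^T M x over the unit sphere

def egrad(x):
    return 2 * M @ x                             # Euclidean gradient of the objective

x = rng.normal(size=5)
x /= np.linalg.norm(x)                           # start on the manifold
for _ in range(500):
    g = egrad(x)
    rgrad = g - (x @ g) * x                      # project onto the tangent space at x
    x -= 0.01 * rgrad                            # step in the tangent direction
    x /= np.linalg.norm(x)                       # retract back onto the sphere
# x approximates the eigenvector of M with the smallest eigenvalue:
print(x @ M @ x, np.linalg.eigvalsh(M)[0])
```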


Calabi-Yau manifolds and their degenerations

September 3, 2011

84% Match
Valentino Tosatti
Differential Geometry

This is a short expository note about Calabi-Yau manifolds and degenerations of their Ricci-flat metrics.


Dynamically Stable Poincaré Embeddings for Neural Manifolds

December 21, 2021

84% Match
Jun Chen, Yuang Liu, Xiangrui Zhao, ... , Yong Liu
Machine Learning
Mathematical Physics

On a Riemannian manifold, the Ricci flow is a partial differential equation that evolves the metric to become more regular. One would hope that topological structure extracted from such metrics could assist machine learning tasks, but this link has so far been missing. In this paper, we propose Ricci flow assisted Eucl2Hyp2Eucl neural networks that bridge this gap between the Ricci flow and deep neural networks by mapping neural manifolds from the Euclidean space...
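
The Euclidean-to-hyperbolic-to-Euclidean mapping alluded to here is usually realized with the exponential and logarithmic maps of the Poincaré ball. Below is a minimal sketch of those two maps at the origin for curvature -c; it illustrates the standard construction, not the paper's exact layers.

```python
# Exponential and logarithmic maps at the origin of the Poincare ball of
# curvature -c: the standard way to move between Euclidean (tangent) space
# and hyperbolic space. Sketch only; not the paper's Eucl2Hyp2Eucl layers.
import numpy as np

def exp0(v, c=1.0, eps=1e-12):
    """Map a Euclidean tangent vector v at the origin into the Poincare ball."""
    norm = np.linalg.norm(v) + eps
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def log0(x, c=1.0, eps=1e-12):
    """Map a point x of the Poincare ball back to the tangent space at the origin."""
    norm = np.linalg.norm(x) + eps
    return np.arctanh(np.sqrt(c) * norm) * x / (np.sqrt(c) * norm)

v = np.array([0.3, -1.2, 0.5])
x = exp0(v)                                      # Euclidean -> hyperbolic
print(np.linalg.norm(x) < 1.0)                   # point lies inside the unit ball
print(np.allclose(log0(x), v))                   # round trip recovers v
```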
