ID: 2308.11355

Machine learning assisted exploration for affine Deligne-Lusztig varieties

August 22, 2023


Similar papers 2

On affine Lusztig varieties

February 7, 2023

84% Match
Xuhua He
Representation Theory
Algebraic Geometry
Number Theory

Affine Lusztig varieties encode the orbital integrals of Iwahori-Hecke functions and serve as building blocks for the (conjectural) theory of affine character sheaves. In this paper, we establish a close relationship between affine Lusztig varieties and affine Deligne-Lusztig varieties. Consequently, we give an explicit nonemptiness pattern and dimension formula for affine Lusztig varieties in most cases.


A machine learning based software pipeline to pick the variable ordering for algorithms with polynomial inputs

May 22, 2020

84% Match
Dorian Florescu, Matthew England
Symbolic Computation
Machine Learning

We are interested in the application of Machine Learning (ML) technology to improve mathematical software. It may seem that the probabilistic nature of ML tools would invalidate the exact results prized by such software; however, the algorithms which underpin the software often come with a range of choices which are good candidates for ML application. We refer to choices which have no effect on the mathematical correctness of the software, but do impact its performance. In ...
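
The choice point this abstract describes can be illustrated with a small sketch: extract per-variable features from a set of polynomials and rank the variables by a hand-crafted score of the kind an ML model would learn to replace. The feature set and scoring rule below are illustrative assumptions, not the authors' actual pipeline.

```python
# Sketch: choosing a variable ordering for a polynomial-input algorithm.
# A polynomial is a list of monomials; a monomial is a tuple of exponents,
# one entry per variable.

def variable_features(polys, n_vars):
    """Per-variable features: (max degree, number of occurrences)."""
    max_deg = [0] * n_vars
    count = [0] * n_vars
    for p in polys:
        for mono in p:
            for v, e in enumerate(mono):
                if e > 0:
                    max_deg[v] = max(max_deg[v], e)
                    count[v] += 1
    return list(zip(max_deg, count))

def pick_ordering(polys, n_vars):
    """Order variables by (max degree, occurrences), lowest first --
    a crude heuristic standing in for the learned model."""
    feats = variable_features(polys, n_vars)
    return sorted(range(n_vars), key=lambda v: feats[v])

# Two polynomials in variables x0, x1, x2:  x0^3*x1 + x2  and  x0*x2^2
polys = [[(3, 1, 0), (0, 0, 1)], [(1, 0, 2)]]
print(pick_ordering(polys, 3))  # -> [1, 2, 0]
```

Any ordering returned is mathematically valid; only the downstream runtime differs, which is exactly what makes this a safe target for ML.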


Machine learning the dimension of a Fano variety

September 11, 2023

84% Match
Tom Coates, Alexander M. Kasprzyk, Sara Veneziale
Algebraic Geometry
Machine Learning

Fano varieties are basic building blocks in geometry: they are 'atomic pieces' of mathematical shapes. Recent progress in the classification of Fano varieties involves analysing an invariant called the quantum period. This is a sequence of integers which gives a numerical fingerprint for a Fano variety. It is conjectured that a Fano variety is uniquely determined by its quantum period. If this is true, one should be able to recover geometric properties of a Fano variety dire...
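
The supervised task suggested here, predicting an integer invariant from a sequence fingerprint, can be sketched minimally. The feature (average log-growth of the nonzero terms) and the toy "classes" below are illustrative assumptions, not the paper's actual model or data.

```python
import math

def log_growth(seq):
    """Average successive log-ratio of the nonzero terms -- a crude
    numerical fingerprint of how fast the sequence grows."""
    nz = [x for x in seq if x > 0]
    ratios = [math.log(b / a) for a, b in zip(nz, nz[1:])]
    return sum(ratios) / len(ratios)

def nearest_label(feature, centroids):
    """1-nearest-centroid classifier over precomputed class means."""
    return min(centroids, key=lambda lbl: abs(centroids[lbl] - feature))

# Toy training data: two made-up classes with different growth rates.
centroids = {
    1: log_growth([1, 2, 6, 20, 70]),        # slower growth
    2: log_growth([1, 4, 36, 400, 4900]),    # faster growth
}
print(nearest_label(log_growth([1, 2, 6, 20, 70]), centroids))  # -> 1
```

The point is the shape of the pipeline: sequence in, scalar feature out, discrete label predicted.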


Machine Learning meets Number Theory: The Data Science of Birch-Swinnerton-Dyer

November 4, 2019

84% Match
Laura Alessandretti, Andrea Baronchelli, Yang-Hui He
Number Theory
Machine Learning

Empirical analysis is often the first step towards the birth of a conjecture. Such is the case for the Birch-Swinnerton-Dyer (BSD) Conjecture, which describes the rational points on an elliptic curve and is one of the most celebrated unsolved problems in mathematics. Here we extend the original empirical approach to the analysis of the Cremona database of quantities relevant to BSD, inspecting more than 2.5 million elliptic curves by means of the latest techniques in data science, machin...


Gradient Boosts the Approximate Vanishing Ideal

November 11, 2019

84% Match
Hiroshi Kera, Yoshihiko Hasegawa
Machine Learning

In the last decade, the approximate vanishing ideal and its basis construction algorithms have been extensively studied in computer algebra and machine learning as a general model to reconstruct the algebraic variety on which noisy data approximately lie. In particular, the basis construction algorithms developed in machine learning are widely used in applications across many fields because of their monomial-order-free property; however, they lose many of the theoretical prop...


Mining the Minoria: Unknown, Under-represented, and Under-performing Minority Groups

November 7, 2024

83% Match
Mohsen Dehghankar, Abolfazl Asudeh
Machine Learning

Due to a variety of reasons, such as privacy, data in the wild often lacks the grouping information required for identifying minorities. On the other hand, it is known that machine learning models are only as good as the data they are trained on and, hence, may underperform for the under-represented minority groups. The missing grouping information presents a dilemma for responsible data scientists who find themselves in an unknown-unknown situation, where not only do they n...


Using Machine Learning to Decide When to Precondition Cylindrical Algebraic Decomposition With Groebner Bases

August 15, 2016

83% Match
Zongyan Huang, Matthew England, ... , Lawrence C. Paulson
Symbolic Computation
Machine Learning

Cylindrical Algebraic Decomposition (CAD) is a key tool in computational algebraic geometry, particularly for quantifier elimination over real-closed fields. However, it can be expensive, with worst case complexity doubly exponential in the size of the input. Hence it is important to formulate the problem in the best manner for the CAD algorithm. One possibility is to precondition the input polynomials using Groebner Basis (GB) theory. Previous experiments have shown that whi...
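
The decision this abstract studies, whether to precondition the input with a Groebner basis, can be sketched as a wrapper that compares a crude size measure before and after. The measure (sum of total degrees) stands in for the learned classifier and is an illustrative assumption; the Groebner computation itself uses SymPy.

```python
from sympy import groebner, symbols, total_degree

x, y = symbols('x y')

def measure(polys):
    """Proxy for CAD difficulty: sum of total degrees of the inputs."""
    return sum(total_degree(p) for p in polys)

def maybe_precondition(polys):
    """Replace the input set by its Groebner basis only when the
    proxy measure does not get worse."""
    gb = list(groebner(polys, x, y, order='lex').exprs)
    return gb if measure(gb) <= measure(polys) else list(polys)

polys = [x**2 + y**2 - 1, x - y]
out = maybe_precondition(polys)
```

In the real pipeline the hand-written `measure` would be replaced by a model trained on observed CAD runtimes, since simple syntactic measures were found to be unreliable predictors.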


Machine Learning and Theory Ladenness -- A Phenomenological Account

September 17, 2024

83% Match
Alberto Termine, Emanuele Ratti, Alessandro Facchini
Artificial Intelligence

In recent years, the dissemination of machine learning (ML) methodologies in scientific research has prompted discussions on theory ladenness. More specifically, the issue of theory ladenness has re-emerged as questions about whether and how ML models (MLMs) and ML modelling strategies are impacted by the domain theory of the scientific field in which ML is used and implemented (e.g., physics, chemistry, biology, etc.). On the one hand, some have argued that there is no differen...


Using Machine Learning to Improve Cylindrical Algebraic Decomposition

April 26, 2018

83% Match
Zongyan Huang, Matthew England, David Wilson, ... , Lawrence C. Paulson
Symbolic Computation
Machine Learning

Cylindrical Algebraic Decomposition (CAD) is a key tool in computational algebraic geometry, best known as a procedure to enable Quantifier Elimination over real-closed fields. However, it has a worst case complexity doubly exponential in the size of the input, which is often encountered in practice. It has been observed that for many problems a change in algorithm settings or problem formulation can cause huge differences in runtime costs, changing problem instances from int...


Intrinsic Geometric Vulnerability of High-Dimensional Artificial Intelligence

November 8, 2018

83% Match
Luca Bortolussi, Guido Sanguinetti
Machine Learning

The success of modern Artificial Intelligence (AI) technologies depends critically on the ability to learn non-linear functional dependencies from large, high dimensional data sets. Despite recent high-profile successes, empirical evidence indicates that the high predictive performance is often paired with low robustness, making AI systems potentially vulnerable to adversarial attacks. In this report, we provide a simple intuitive argument suggesting that high performance and...
