ID: 2310.00041

Machine Learning Clifford invariants of ADE Coxeter elements

September 29, 2023


Similar papers 2

Rank-3 root systems induce root systems of rank 4 via a new Clifford spinor construction

July 31, 2012

84% Match
Pierre-Philippe Dechant
Mathematical Physics

In this paper, we show that via a novel construction every rank-3 root system induces a root system of rank 4. Via the Cartan-Dieudonné theorem, an even number of successive Coxeter reflections yields rotations that in a Clifford algebra framework are described by spinors. In three dimensions these spinors themselves have a natural four-dimensional Euclidean structure, and discrete spinor groups can therefore be interpreted as 4D polytopes. In fact, we show that these polyt...
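A minimal numerical sketch of the observation above (illustrative only, not the paper's code): in Cl(3), the geometric product of two unit root vectors is a spinor with one scalar and three bivector components, and these four components carry a natural Euclidean norm, here assuming numpy.

```python
import numpy as np

def spinor(a, b):
    """Geometric product of two vectors in Cl(3):
    scalar part a.b plus bivector part a^b (3 components)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return np.concatenate(([a @ b], np.cross(a, b)))

# Two simple roots of the A3 root system, normalised
a = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
b = np.array([0.0, 1.0, -1.0]) / np.sqrt(2)

R = spinor(a, b)          # 4 components: (scalar, 3 bivector)
print(np.linalg.norm(R))  # unit vectors give a unit spinor (norm 1 up to float error)
```

Products of such spinors close into a finite group, whose elements, read as 4-vectors, form the vertex set of a 4D polytope.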


Machine learning and invariant theory

September 29, 2022

84% Match
Ben Blum-Smith, Soledad Villar
Machine Learning

Inspired by constraints from physical law, equivariant machine learning restricts the learning to a hypothesis class where all the functions are equivariant with respect to some group action. Irreducible representations or invariant theory are typically used to parameterize the space of such functions. In this article, we introduce the topic and explain a couple of methods to explicitly parameterize equivariant functions that are being used in machine learning applications. I...
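One standard way to parameterize invariant functions, of the kind surveyed above, is via classical invariant theory: for the orthogonal group, the first fundamental theorem says all invariants of a point cloud are functions of pairwise inner products. A minimal sketch (assuming numpy; the model `f` and its weights are illustrative, not from the paper):

```python
import numpy as np

def invariant_features(X):
    """The Gram matrix X X^T of pairwise inner products is a complete
    set of O(d) invariants of the point cloud X (rows are points)."""
    return X @ X.T

def f(X, weights):
    # Hypothetical model: any function of the Gram matrix is O(d)-invariant.
    return np.sum(weights * invariant_features(X))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
W = rng.normal(size=(5, 5))

# Random rotation via QR factorisation; f is unchanged under X -> X Q.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
print(np.isclose(f(X, W), f(X @ Q, W)))  # True
```

Invariance holds because (XQ)(XQ)^T = X Q Q^T X^T = X X^T for any orthogonal Q.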


Machine-Learning Mathematical Structures

January 15, 2021

84% Match
Yang-Hui He
Machine Learning
History and Overview
History and Philosophy of Physics

We review, for a general audience, a variety of recent experiments on extracting structure from machine-learning mathematical data that have been compiled over the years. Focusing on supervised machine-learning on labeled data from different fields ranging from geometry to representation theory, from combinatorics to number theory, we present a comparative study of the accuracies on different problems. The paradigm should be useful for conjecture formulation, finding more eff...


Representation Theory for Geometric Quantum Machine Learning

October 14, 2022

84% Match
Michael Ragone, Paolo Braccia, Quynh T. Nguyen, Louis Schatzki, Patrick J. Coles, Frederic Sauvage, ... , M. Cerezo
Machine Learning
Representation Theory

Recent advances in classical machine learning have shown that creating models with inductive biases encoding the symmetries of a problem can greatly improve performance. Importation of these ideas, combined with an existing rich body of work at the nexus of quantum theory and symmetry, has given rise to the field of Geometric Quantum Machine Learning (GQML). Following the success of its classical counterpart, it is reasonable to expect that GQML will play a crucial role in de...


Machine Learning in Physics and Geometry

March 22, 2023

84% Match
Yang-Hui He, Elli Heyes, Edward Hirst
Algebraic Geometry
Mathematical Physics

We survey some recent applications of machine learning to problems in geometry and theoretical physics. Pure mathematical data has been compiled over the last few decades by the community and experiments in supervised, semi-supervised and unsupervised machine learning have found surprising success. We thus advocate the programme of machine learning mathematical structures, and formulating conjectures via pattern recognition, in other words using artificial intelligence to hel...


Learning to be Simple

December 8, 2023

83% Match
Yang-Hui He, Vishnu Jejjala, ... , Max Sharnoff
Machine Learning
Group Theory
Mathematical Physics

In this work we employ machine learning to understand structured mathematical data involving finite groups and derive a theorem about necessary properties of generators of finite simple groups. We create a database of all 2-generated subgroups of the symmetric group on n-objects and conduct a classification of finite simple groups among them using shallow feed-forward neural networks. We show that this neural network classifier can decipher the property of simplicity with var...

A New Neural Network Architecture Invariant to the Action of Symmetry Subgroups

December 11, 2020

83% Match
Piotr Kicki, Mete Ozay, Piotr Skrzypczyński
Machine Learning
Artificial Intelligence

We propose a computationally efficient $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup $G \leq S_n$ of the symmetric group on input data. The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data. Theoretical considerations are supported by numerical experiments, which demonstrate the effectiveness and...

Man-Wai Cheung, Pierre-Philippe Dechant, Yang-Hui He, Elli Heyes, ... , Jian-Rong Li
Combinatorics

Classification of cluster variables in cluster algebras (in particular, Grassmannian cluster algebras) is an important problem, which has direct application to computations of scattering amplitudes in physics. In this paper, we apply the tableaux method to classify cluster variables in Grassmannian cluster algebras $\mathbb{C}[Gr(k,n)]$ up to $(k,n)=(3,12), (4,10)$, or $(4,12)$ up to a certain number of columns of tableaux, using HPC clusters. These datasets are made availabl...

Clifford Algebra of the Vector Space of Conics for decision boundary Hyperplanes in m-Euclidean Space

July 26, 2007

83% Match
Isidro B. Nieto, J. Refugio Vallejo
Neural and Evolutionary Computing
Computational Geometry

In this paper we embed $m$-dimensional Euclidean space in the geometric algebra $Cl_m$, extending the operators of incidence in $R^m$ to operators of incidence in the geometric algebra, and generalizing the notion of a separator to a decision-boundary hyperconic in the Clifford algebra of conic sections, denoted ${Cl}({Co}_{2})$. This allows us to extend the concept of a linear perceptron or the spherical perceptron in conformal geometry and introduce the more general coni...


A Computationally Efficient Neural Network Invariant to the Action of Symmetry Subgroups

February 18, 2020

83% Match
Piotr Kicki, Mete Ozay, Piotr Skrzypczyński
Machine Learning
Neural and Evolutionary Computing

We introduce a method to design a computationally efficient $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup $G \leq S_n$ of the symmetric group on input data. The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data. This latent representation is then processed with a multi-layer perceptron in the net...
