September 29, 2023
Similar papers
July 31, 2012
In this paper, we show that via a novel construction every rank-3 root system induces a root system of rank 4. Via the Cartan-Dieudonné theorem, an even number of successive Coxeter reflections yields rotations that in a Clifford algebra framework are described by spinors. In three dimensions these spinors themselves have a natural four-dimensional Euclidean structure, and discrete spinor groups can therefore be interpreted as 4D polytopes. In fact, we show that these polyt...
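The reflection-to-rotor step the abstract alludes to can be sketched in $Cl(3)$ (a standard Clifford-algebra identity, not the paper's specific construction):

```latex
% Reflection of a vector x in the plane with unit normal n:
\[ x \;\mapsto\; -\,n\,x\,n \]
% By Cartan-Dieudonné, two successive reflections with normals n_1, n_2
% compose into a rotation, encoded by the rotor R = n_2 n_1:
\[ x \;\mapsto\; R\,x\,\tilde{R}, \qquad
   R = n_2 n_1 = \cos(\theta/2) - B\,\sin(\theta/2), \]
% where B is the unit bivector of the rotation plane and \theta the angle.
% The rotors (spinors) live in the even subalgebra
\[ Cl^{+}(3) = \mathrm{span}\{\,1,\; e_1e_2,\; e_2e_3,\; e_3e_1\,\}, \]
% which is four-dimensional and carries a natural Euclidean norm —
% the source of the 4D structure on discrete spinor groups.
```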
September 29, 2022
Inspired by constraints from physical law, equivariant machine learning restricts the learning to a hypothesis class where all the functions are equivariant with respect to some group action. Irreducible representations or invariant theory are typically used to parameterize the space of such functions. In this article, we introduce the topic and explain a couple of methods to explicitly parameterize equivariant functions that are being used in machine learning applications. I...
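An explicit parameterization of an equivariant hypothesis class can be illustrated with a toy $S_n$-equivariant linear layer in the DeepSets style (the function name and coefficients below are illustrative, not taken from the article):

```python
import numpy as np

def equivariant_layer(x, a=1.5, b=-0.5):
    # DeepSets-style S_n-equivariant linear map: every permutation-
    # equivariant linear function on a set of scalars has this form,
    # a scaling of each entry plus a multiple of the mean.
    return a * x + b * x.mean() * np.ones_like(x)

x = np.array([1.0, 2.0, 3.0, 4.0])
perm = np.array([2, 0, 3, 1])
# Equivariance check: permuting the input permutes the output identically.
assert np.allclose(equivariant_layer(x[perm]), equivariant_layer(x)[perm])
```

The two free parameters `a` and `b` span the whole space of such maps, which is the sense in which the hypothesis class is "explicitly parameterized."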
January 15, 2021
We review, for a general audience, a variety of recent experiments on extracting structure, via machine learning, from mathematical data that has been compiled over the years. Focusing on supervised machine-learning on labeled data from different fields ranging from geometry to representation theory, from combinatorics to number theory, we present a comparative study of the accuracies on different problems. The paradigm should be useful for conjecture formulation, finding more eff...
October 14, 2022
Recent advances in classical machine learning have shown that creating models with inductive biases encoding the symmetries of a problem can greatly improve performance. Importation of these ideas, combined with an existing rich body of work at the nexus of quantum theory and symmetry, has given rise to the field of Geometric Quantum Machine Learning (GQML). Following the success of its classical counterpart, it is reasonable to expect that GQML will play a crucial role in de...
March 22, 2023
We survey some recent applications of machine learning to problems in geometry and theoretical physics. Pure mathematical data has been compiled over the last few decades by the community and experiments in supervised, semi-supervised and unsupervised machine learning have found surprising success. We thus advocate the programme of machine learning mathematical structures, and formulating conjectures via pattern recognition, in other words using artificial intelligence to hel...
December 8, 2023
In this work we employ machine learning to understand structured mathematical data involving finite groups and derive a theorem about necessary properties of generators of finite simple groups. We create a database of all 2-generated subgroups of the symmetric group on $n$ objects and conduct a classification of finite simple groups among them using shallow feed-forward neural networks. We show that this neural network classifier can decipher the property of simplicity with var...
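The database construction can be sketched generically: enumerating the subgroup of $S_n$ generated by two permutations is a closure computation (the routine below is illustrative, not the authors' code; permutations are stored as tuples of images):

```python
def compose(g, h):
    # (g ∘ h)(i) = g(h(i)); permutations stored as tuples of images.
    return tuple(g[h[i]] for i in range(len(g)))

def generated_subgroup(gens):
    # Breadth-first closure of the generating set under composition;
    # in a finite group this yields exactly the generated subgroup.
    elems = set(gens)
    frontier = list(gens)
    while frontier:
        g = frontier.pop()
        for h in gens:
            gh = compose(g, h)
            if gh not in elems:
                elems.add(gh)
                frontier.append(gh)
    return elems

# Example: the transposition (0 1) and the 4-cycle (0 1 2 3)
# generate all of S_4, of order 24.
transposition = (1, 0, 2, 3)
four_cycle = (1, 2, 3, 0)
G = generated_subgroup([transposition, four_cycle])
assert len(G) == 24
```

Iterating over pairs of elements of $S_n$ and deduplicating the resulting subgroups would produce the kind of 2-generated-subgroup database the abstract describes.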
December 11, 2020
We propose a computationally efficient $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup $G \leq S_n$ of the symmetric group on input data. The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data. Theoretical considerations are supported by numerical experiments, which demonstrate the effectiveness and...
December 19, 2022
Classification of cluster variables in cluster algebras (in particular, Grassmannian cluster algebras) is an important problem, which has direct application to computations of scattering amplitudes in physics. In this paper, we apply the tableaux method to classify cluster variables in Grassmannian cluster algebras $\mathbb{C}[Gr(k,n)]$ up to $(k,n)=(3,12), (4,10)$, or $(4,12)$ up to a certain number of columns of tableaux, using HPC clusters. These datasets are made availabl...
July 26, 2007
In this paper we embed $m$-dimensional Euclidean space in the geometric algebra $Cl_m$, extending the operators of incidence in $\mathbb{R}^m$ to operators of incidence in the geometric algebra, and generalizing the notion of a separator to a decision-boundary hyperconic in the Clifford algebra of hyperconic sections, denoted ${Cl}({Co}_{2})$. This allows us to extend the concept of a linear perceptron or the spherical perceptron in conformal geometry and introduce the more general coni...
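The idea of a perceptron with conic decision boundaries can be sketched under a simplifying assumption: instead of the paper's Clifford-algebra formulation, lift points to the space of conic monomials, where a linear separator corresponds to a conic in the plane (names and data below are illustrative):

```python
import numpy as np

def conic_features(x):
    # Lift a 2D point to the 6D space of conic monomials; a linear
    # separator here is a conic boundary in the original plane.
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x1, x1 * x2, x2 * x2])

def train_perceptron(X, y, epochs=200):
    # Classic perceptron updates, run in the lifted feature space.
    w = np.zeros(6)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            f = conic_features(xi)
            if yi * (w @ f) <= 0:
                w += yi * f
    return w

# Toy data: +1 inside the unit circle, -1 outside (conic-separable).
X = np.array([[0.2, 0.1], [-0.3, 0.2], [0.1, -0.4], [0.0, 0.5],
              [1.5, 0.0], [0.0, -1.8], [1.2, 1.2], [-1.4, 0.9]])
y = np.array([1, 1, 1, 1, -1, -1, -1, -1])
w = train_perceptron(X, y)
preds = np.sign([w @ conic_features(p) for p in X])
assert np.array_equal(preds, y)  # all training points separated
```

A linear perceptron cannot separate this data in the plane; the quadratic terms of the lifted representation are what make the circular boundary learnable.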
February 18, 2020
We introduce a method to design a computationally efficient $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup $G \leq S_n$ of the symmetric group on input data. The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data. This latent representation is then processed with a multi-layer perceptron in the net...
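The $G$-invariant transformation module itself is not spelled out in the abstract; one standard way to produce a $G$-invariant latent representation is Reynolds-operator averaging over the group, sketched below for a cyclic subgroup of $S_4$ with a hypothetical feature map `phi` (all names are illustrative, not the authors' architecture):

```python
import numpy as np

def invariant_latent(x, G, phi):
    # Reynolds-operator symmetrization: averaging phi over the G-orbit
    # of x gives a representation invariant to every g in G.
    return np.mean([phi(x[list(g)]) for g in G], axis=0)

# Cyclic subgroup G = <(0 1 2 3)> of S_4, permutations as index tuples.
G = [(0, 1, 2, 3), (1, 2, 3, 0), (2, 3, 0, 1), (3, 0, 1, 2)]
phi = lambda v: np.concatenate([v, v ** 2])  # toy feature map

x = np.array([1.0, 2.0, 3.0, 4.0])
g = G[1]
# Invariance check: the latent is unchanged when x is replaced by g·x.
assert np.allclose(invariant_latent(x, G, phi),
                   invariant_latent(x[list(g)], G, phi))
```

The invariant latent would then feed a standard multi-layer perceptron, as the abstract describes; averaging costs $|G|$ feature evaluations, which is why efficient architectures avoid symmetrizing over large subgroups naively.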