January 13, 2023
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset. We use fully connected neural networks to model the symmetry transformations and the corresponding generators. We construct loss functions that ensure that the applied transformations are symmetries and that the corresponding set of generators forms a closed (sub)algebra. Our procedure is validated with several examples illustrating diff...
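A minimal sketch of this kind of setup in PyTorch, assuming a toy rotation-invariant oracle in two dimensions in place of the paper's labeled physics dataset; the particular loss terms and the identity-avoiding penalty are illustrative assumptions, not the paper's exact construction:

    import torch
    import torch.nn as nn

    # Toy oracle: a rotation-invariant quantity in 2D (the squared radius).
    def oracle(x):
        return (x ** 2).sum(dim=1)

    # Candidate symmetry transformation modeled by a small fully connected net.
    f = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))
    opt = torch.optim.Adam(f.parameters(), lr=1e-3)

    for step in range(2000):
        x = torch.randn(256, 2)
        y = f(x)
        # The transformation should preserve the oracle ...
        invariance = ((oracle(y) - oracle(x)) ** 2).mean()
        # ... while staying away from the trivial identity map.
        nontrivial = (1.0 - torch.linalg.norm(y - x, dim=1)).pow(2).mean()
        loss = invariance + 0.1 * nontrivial
        opt.zero_grad()
        loss.backward()
        opt.step()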
November 13, 2023
Understanding the internal representations learned by neural networks is a cornerstone challenge in the science of machine learning. While there have been significant recent strides in some cases towards understanding how neural networks implement specific target functions, this paper explores a complementary question -- why do networks arrive at particular computational strategies? Our inquiry focuses on the algebraic learning tasks of modular addition, sparse parities, and ...
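For concreteness, a small NumPy sketch of the modular-addition task mentioned above, with an assumed modulus of 97 and a one-hot input encoding that is common in this line of work but not taken from the paper:

    import numpy as np

    p = 97  # assumed modulus for the toy modular-addition task
    a, b = np.meshgrid(np.arange(p), np.arange(p), indexing="ij")
    pairs = np.stack([a.ravel(), b.ravel()], axis=1)   # all (a, b) input pairs
    labels = (pairs[:, 0] + pairs[:, 1]) % p           # target: (a + b) mod p

    # One-hot encode the two operands and concatenate them.
    X = np.zeros((len(pairs), 2 * p), dtype=np.float32)
    X[np.arange(len(pairs)), pairs[:, 0]] = 1.0
    X[np.arange(len(pairs)), p + pairs[:, 1]] = 1.0

    # Random train/test split, as used in small-scale studies of this task.
    rng = np.random.default_rng(0)
    idx = rng.permutation(len(pairs))
    train_idx, test_idx = idx[: len(idx) // 2], idx[len(idx) // 2 :]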
October 14, 2022
Recent advances in classical machine learning have shown that creating models with inductive biases encoding the symmetries of a problem can greatly improve performance. Importation of these ideas, combined with an existing rich body of work at the nexus of quantum theory and symmetry, has given rise to the field of Geometric Quantum Machine Learning (GQML). Following the success of its classical counterpart, it is reasonable to expect that GQML will play a crucial role in de...
September 9, 2019
This expository article revolves around the question of finding short presentations of finite simple groups. This subject is one of the most active research areas of group theory in recent times. We bring together several known results on two-generation and $(2,3)$-generation of finite simple groups and discuss how they impact computational group theory.
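A tiny worked instance of $(2,3)$-generation, using SymPy's permutation groups: the alternating group $A_5$ is generated by an element of order 2 and an element of order 3 (the choice of generators below is a standard one, not taken from the article):

    from sympy.combinatorics import Permutation, PermutationGroup

    a = Permutation([[0, 1], [2, 3]], size=5)   # order 2
    b = Permutation([[0, 2, 4]], size=5)        # order 3
    G = PermutationGroup([a, b])

    print(a.order(), b.order(), (a * b).order())  # 2 3 5
    print(G.order())                              # 60, i.e. all of A_5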
October 28, 2017
It is well known that every finite simple group can be generated by two elements and this leads to a wide range of problems that have been the focus of intensive research in recent years. In this survey article we discuss some of the extraordinary generation properties of simple groups, focussing on topics such as random generation, $(a,b)$-generation and spread, as well as highlighting the application of probabilistic methods in the proofs of many of the main results. We als...
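To illustrate the random-generation theme on the smallest example, a Monte Carlo estimate (in SymPy) of the probability that two uniformly random elements of $A_5$ generate the whole group; the trial count and the brute-force enumeration are choices made here for illustration, not the survey's methods:

    import random
    from sympy.combinatorics import PermutationGroup
    from sympy.combinatorics.named_groups import AlternatingGroup

    random.seed(0)
    elements = list(AlternatingGroup(5).generate())   # all 60 elements of A_5
    trials, hits = 500, 0
    for _ in range(trials):
        g, h = random.choice(elements), random.choice(elements)
        if PermutationGroup([g, h]).order() == 60:    # the pair generates A_5
            hits += 1
    print("estimated generation probability:", hits / trials)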
December 2, 2020
One of the central problems at the interface of deep learning and mathematics is that of building learning systems that can automatically uncover underlying mathematical laws from observed data. In this work, we take one step towards building a bridge between algebraic structures and deep learning, and introduce \textbf{AIDN}, \textit{Algebraically-Informed Deep Networks}. \textbf{AIDN} is a deep learning algorithm to represent any finitely-presented algebraic object with a s...
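As a rough illustration of representing a finitely-presented object by optimizing against its defining relations, the sketch below learns linear maps for the two generators of the braid group $B_3$ subject to the braid relation; the linear maps, the penalty weights, and the collapse-avoiding term are assumptions of this sketch rather than \textbf{AIDN}'s actual networks and losses:

    import torch

    d = 4  # assumed representation dimension
    # Trainable linear maps for the generators s1, s2 of the braid group B_3.
    S1 = torch.nn.Parameter(torch.eye(d) + 0.1 * torch.randn(d, d))
    S2 = torch.nn.Parameter(torch.eye(d) + 0.1 * torch.randn(d, d))
    opt = torch.optim.Adam([S1, S2], lr=1e-2)

    for step in range(3000):
        # Enforce the defining relation s1 s2 s1 = s2 s1 s2 ...
        relation = ((S1 @ S2 @ S1 - S2 @ S1 @ S2) ** 2).mean()
        # ... while discouraging the trivial solution S1 == S2.
        separation = ((S1 - S2) ** 2).mean()
        loss = relation + 0.1 * torch.relu(0.1 - separation)
        opt.zero_grad()
        loss.backward()
        opt.step()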
July 23, 2023
We use machine learning to classify examples of braids (or flat braids) as trivial or non-trivial. Our ML takes the form of supervised learning using neural networks (multilayer perceptrons). When the networks achieve good classification results, we are able to interpret their structure as mathematical conjectures and then prove these conjectures as theorems. As a result, we find new convenient invariants of braids, including a complete invariant of flat braids.
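A self-contained toy version of this supervised setup, assuming braid words encoded as fixed-length sequences of signed generator indices; since the real trivial/non-trivial labels require braid-group computations, this sketch substitutes an easier proxy label (whether the induced strand permutation is the identity) purely to show the pipeline shape, not the paper's task or invariants:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n_strands, word_len, n_samples = 3, 10, 5000

    def induced_permutation(word, n):
        """Permutation of strand endpoints induced by a braid word."""
        perm = list(range(n))
        for g in word:
            i = abs(g) - 1                       # sigma_i acts like the transposition (i, i+1)
            perm[i], perm[i + 1] = perm[i + 1], perm[i]
        return perm

    # Random braid words over the generators {s1^±1, s2^±1} of B_3.
    words = rng.integers(1, n_strands, size=(n_samples, word_len))
    words = words * rng.choice([-1, 1], size=(n_samples, word_len))
    labels = np.array(
        [induced_permutation(w, n_strands) == list(range(n_strands)) for w in words],
        dtype=int,
    )

    X_tr, X_te, y_tr, y_te = train_test_split(words, labels, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))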
September 14, 2023
Deep learning was recently successfully used in deriving symmetry transformations that preserve important physics quantities. Being completely agnostic, these techniques postpone the identification of the discovered symmetries to a later stage. In this letter we propose methods for examining and identifying the group-theoretic structure of such machine-learned symmetries. We design loss functions which probe the subalgebra structure either during the deep learning stage of sy...
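The algebraic closure condition being probed can be checked directly once candidate generators are in hand: every commutator must lie in the span of the generators. A NumPy sketch on the textbook $\mathfrak{so}(3)$ generators, used here as stand-ins for machine-learned ones:

    import numpy as np

    # Standard so(3) generators (infinitesimal 3D rotations), as stand-ins
    # for a set of machine-learned generators.
    Lx = np.array([[0., 0., 0.], [0., 0., -1.], [0., 1., 0.]])
    Ly = np.array([[0., 0., 1.], [0., 0., 0.], [-1., 0., 0.]])
    Lz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 0.]])
    gens = [Lx, Ly, Lz]

    # Closure: every commutator [Gi, Gj] must lie in the span of the generators.
    B = np.stack([g.ravel() for g in gens], axis=1)   # 9 x 3 basis of flattened generators

    def closure_residual(gens, B):
        total = 0.0
        for Gi in gens:
            for Gj in gens:
                c = (Gi @ Gj - Gj @ Gi).ravel()
                coeffs, *_ = np.linalg.lstsq(B, c, rcond=None)
                total += np.sum((B @ coeffs - c) ** 2)
        return total

    print("closure residual:", closure_residual(gens, B))  # ~0: the set closes into an algebra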
March 28, 2018
We show how the well-known rules of back propagation arise from a weighted combination of finite automata. By redefining a finite automaton as a predictor, we combine the set of all $k$-state finite automata using a weighted majority algorithm. This aggregated prediction algorithm can be simplified using symmetry, and we prove the equivalence of an algorithm that does this. We demonstrate that this algorithm is equivalent to a form of back propagation acting in a completely conne...
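A small runnable instance of the aggregation step described above, assuming 2-state deterministic automata over a binary alphabet as the experts and the classic multiplicative weighted-majority update; the target sequence and the penalty factor are illustrative choices, not the paper's:

    import itertools

    # Enumerate all deterministic 2-state automata over the binary alphabet:
    # a transition table delta[state*2 + bit] -> state and one output bit per state.
    def all_automata(k=2):
        states, bits = range(k), (0, 1)
        for delta in itertools.product(states, repeat=k * 2):
            for out in itertools.product(bits, repeat=k):
                yield (delta, out)

    def predict(automaton, history):
        """Run the automaton over the observed history and read off its output bit."""
        delta, out = automaton
        state = 0
        for b in history:
            state = delta[state * 2 + b]
        return out[state]

    experts = list(all_automata())
    weights = [1.0] * len(experts)
    beta = 0.5                                   # multiplicative penalty for a wrong expert
    sequence = [(t // 3) % 2 for t in range(60)] # a simple periodic target sequence

    mistakes = 0
    for t in range(1, len(sequence)):
        history, truth = sequence[:t], sequence[t]
        votes = [predict(e, history) for e in experts]
        vote1 = sum(w for w, v in zip(weights, votes) if v == 1)
        vote0 = sum(w for w, v in zip(weights, votes) if v == 0)
        guess = 1 if vote1 >= vote0 else 0       # weighted-majority prediction
        mistakes += guess != truth
        weights = [w * (beta if v != truth else 1.0) for w, v in zip(weights, votes)]

    print("weighted-majority mistakes:", mistakes)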
November 16, 2018
Symmetry, a central concept in understanding the laws of nature, has been used for centuries in physics, mathematics, and chemistry to help make mathematical models tractable. Yet, despite its power, symmetry has not been used extensively in machine learning until rather recently. In this article we show a general way to incorporate symmetries into machine learning models. We demonstrate this with a detailed analysis of a rather simple real-world machine learning system - a...
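One generic way to build an invariance into a model is to average its output over the group action; the sketch below does this for a toy sign-flip group in NumPy, as a hedged illustration rather than the construction analyzed in the article:

    import numpy as np

    # Toy symmetry group: sign flips of a 2D input, x -> (±x1, ±x2).
    group = [np.diag(d) for d in ([1, 1], [1, -1], [-1, 1], [-1, -1])]

    def base_model(x, w):
        """An arbitrary (non-invariant) feature map followed by a linear readout."""
        feats = np.concatenate([x, x ** 2, np.tanh(x)])
        return feats @ w

    def symmetrized_model(x, w):
        """Average the base model over the group orbit, making it invariant."""
        return np.mean([base_model(g @ x, w) for g in group])

    rng = np.random.default_rng(0)
    w = rng.normal(size=6)
    x = rng.normal(size=2)
    print(symmetrized_model(x, w), symmetrized_model(-x, w))  # equal: invariant prediction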