December 13, 2023
In this work, we formally prove that, under certain conditions, if a neural network is invariant to a finite group then its weights recover the Fourier transform on that group. This provides a mathematical explanation for the emergence of Fourier features -- a ubiquitous phenomenon in both biological and artificial learning systems. The results hold even for non-commutative groups, in which case the Fourier transform encodes all the irreducible unitary group representations. ...
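A minimal numerical sketch of the phenomenon the abstract describes, for the simplest case $G = \mathbb{Z}_n$ (this is an illustration, not the paper's construction): a linear map that commutes with the cyclic shift is circulant, and the discrete Fourier transform diagonalizes it — so "Fourier features" are exactly the basis adapted to the invariance.

```python
import numpy as np

n = 8
rng = np.random.default_rng(0)

# A circulant matrix C[i, j] = c[(i - j) mod n]: a Z_n-equivariant linear layer.
c = rng.standard_normal(n)
C = np.array([np.roll(c, k) for k in range(n)]).T

# Cyclic shift: the regular representation of the generator of Z_n.
S = np.roll(np.eye(n), 1, axis=0)
assert np.allclose(C @ S, S @ C)  # C commutes with the group action

# DFT matrix; its rows are the irreducible characters of Z_n.
F = np.fft.fft(np.eye(n))
D = F @ C @ np.linalg.inv(F)
off_diag = D - np.diag(np.diag(D))
assert np.abs(off_diag).max() < 1e-9  # C is diagonal in the Fourier basis
```

For non-commutative groups the same statement holds block-wise, with the irreducible unitary representations playing the role of the Fourier modes.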
April 29, 2015
This is a survey of using Minsky machines to study algorithmic problems in semigroups, groups and other algebraic systems.
June 15, 1994
Group theory is a particularly fertile field for the design of practical algorithms. Algorithms have been developed across the various branches of the subject and they find wide application. Because of its relative maturity, computational group theory may be used to gain insight into the general structure of algebraic algorithms. This paper examines the basic ideas behind some of the more important algorithms for finitely presented groups and permutation groups, and surveys r...
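As a flavor of the permutation-group algorithms such surveys cover, here is a hedged sketch of one of the most basic ones: computing the orbit of a point under a set of generators by breadth-first closure. Generators are represented as tuples mapping `i` to `g[i]`.

```python
def orbit(point, generators):
    """Orbit of `point` under the group generated by `generators`."""
    seen = {point}
    frontier = [point]
    while frontier:
        p = frontier.pop()
        for g in generators:
            q = g[p]          # apply the generator
            if q not in seen:
                seen.add(q)
                frontier.append(q)
    return seen

# Example: a cyclic shift and a transposition generate S_4,
# so the orbit of 0 is all of {0, 1, 2, 3}.
shift = (1, 2, 3, 0)
swap = (1, 0, 2, 3)
assert orbit(0, [shift, swap]) == {0, 1, 2, 3}
```

Orbit computation is the building block for the stabilizer-chain (Schreier–Sims) machinery that makes permutation groups algorithmically tractable.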
September 11, 2023
We consider the problem of discovering a subgroup $H$ of the permutation group $S_{n}$. Unlike traditional $H$-invariant networks, in which $H$ is assumed to be known, we present a method to discover the underlying subgroup, provided it satisfies certain conditions. Our results show that one can discover any subgroup of type $S_{k}$ ($k \leq n$) by learning an $S_{n}$-invariant function and a linear transformation. We also prove similar results for cyclic and dihedral subgroups...
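For concreteness, a small illustration (not the paper's architecture) of what an $S_{n}$-invariant function looks like: a pointwise map followed by symmetric (sum) pooling, so every permutation of the input yields the same output.

```python
import itertools
import numpy as np

def invariant(x):
    # Pointwise nonlinearity, then a permutation-symmetric pooling.
    return np.sum(np.tanh(x))

x = np.array([0.3, -1.2, 2.0, 0.7])
# Evaluate on all 4! = 24 orderings of the input; every value agrees.
vals = {round(invariant(np.array(p)), 12) for p in itertools.permutations(x)}
assert len(vals) == 1
```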
September 22, 2020
The purpose of this article is to review the achievements made in the last few years towards understanding the reasons behind the success and subtleties of neural network-based machine learning. In the tradition of good old applied mathematics, we will give attention not only to rigorous mathematical results, but also to the insight we have gained from careful numerical experiments as well as from the analysis of simplified models. Along the way, we also list the open problems...
May 11, 2022
As datasets used in scientific applications become more complex, studying the geometry and topology of data has become an increasingly prevalent part of the data analysis process. This can be seen for example with the growing interest in topological tools such as persistent homology. However, on the one hand, topological tools are inherently limited to providing only coarse information about the underlying space of the data. On the other hand, more geometric approaches rely p...
June 4, 2019
There are two big unsolved mathematical questions in artificial intelligence (AI): (1) why is deep learning so successful in classification problems, and (2) why are neural nets based on deep learning at the same time universally unstable, with instabilities that make the networks vulnerable to adversarial attacks? We present a solution to these questions that can be summed up in two words: false structures. Indeed, deep learning does not learn the original structures that hum...
February 3, 2016
In recreational mathematics, a normal magic square is an $n \times n$ square matrix whose entries are exactly the distinct integers $1 \ldots n^2$, such that each row, each column, and the major and minor traces all sum to one constant $\mu$. It has been proven that there are 7,040 fourth order normal magic squares and 2,202,441,792 fifth order normal magic squares, with higher orders unconfirmed. Previous work related to fourth order normal squares has shown that symmetries such as the dihedra...
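The defining property is easy to check directly. A quick sketch using the classical Dürer fourth-order square, whose lines all sum to $\mu = n(n^2+1)/2 = 34$:

```python
import numpy as np

# Duerer's 4x4 normal magic square.
M = np.array([[16,  3,  2, 13],
              [ 5, 10, 11,  8],
              [ 9,  6,  7, 12],
              [ 4, 15, 14,  1]])
n = 4
mu = n * (n**2 + 1) // 2  # magic constant: 34

assert sorted(M.ravel()) == list(range(1, n**2 + 1))       # entries 1..16
assert all(r == mu for r in M.sum(axis=1))                 # rows
assert all(c == mu for c in M.sum(axis=0))                 # columns
assert np.trace(M) == mu and np.trace(np.fliplr(M)) == mu  # both diagonals
```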
March 3, 2023
A finite group of order $n$ can be represented by its Cayley table. In the word-RAM model the Cayley table of a group of order $n$ can be stored using $O(n^2)$ words and can be used to answer a multiplication query in constant time. It is interesting to ask if we can design a data structure to store a group of order $n$ that uses $o(n^2)$ space but can still answer a multiplication query in constant time. We design a constant query-time data structure that can store any fin...
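The $O(n^2)$-word baseline the abstract starts from can be sketched in a few lines, here for the cyclic group $\mathbb{Z}_n$ (the paper's contribution is beating this space bound while keeping constant query time; this snippet only shows the baseline):

```python
def cayley_table(n):
    """Full Cayley table of Z_n: Theta(n^2) entries."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

table = cayley_table(5)

def mul(a, b):
    return table[a][b]  # one array lookup: an O(1) multiplication query

assert mul(3, 4) == 2                          # 3 + 4 = 2 (mod 5)
assert all(mul(a, 0) == a for a in range(5))   # 0 is the identity
```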
June 7, 2023
The Kronecker coefficients are the decomposition multiplicities of the tensor product of two irreducible representations of the symmetric group. Unlike the Littlewood--Richardson coefficients, which are the analogues for the general linear group, there is no known combinatorial description of the Kronecker coefficients, and it is an NP-hard problem to decide whether a given Kronecker coefficient is zero or not. In this paper, we show that standard machine-learning algorithms ...
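To make the object concrete: the Kronecker coefficient $g(\lambda,\mu,\nu)$ is the multiplicity of $\chi_\nu$ in $\chi_\lambda \cdot \chi_\mu$, computable from the character table as $\frac{1}{|G|}\sum_{g} \chi_\lambda(g)\chi_\mu(g)\chi_\nu(g)$ (characters of $S_n$ are real, so no conjugation is needed). A hedged sketch for $S_3$, whose character table is hard-coded below:

```python
# Conjugacy classes of S_3: identity, transpositions, 3-cycles.
class_sizes = [1, 3, 2]  # |G| = 6
chars = {
    "trivial":  [1,  1,  1],
    "sign":     [1, -1,  1],
    "standard": [2,  0, -1],
}

def kron(lam, mu, nu):
    """Kronecker coefficient g(lam, mu, nu) via the class-sum formula."""
    total = sum(size * chars[lam][i] * chars[mu][i] * chars[nu][i]
                for i, size in enumerate(class_sizes))
    return total // sum(class_sizes)

# standard (x) standard = trivial + sign + standard in S_3.
assert [kron("standard", "standard", r) for r in chars] == [1, 1, 1]
```

For larger $n$ the character tables grow combinatorially, which is part of why no combinatorial description of these coefficients is known.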