ID: 2309.05352

Neural Discovery of Permutation Subgroups

September 11, 2023

Pavan Karjol, Rohan Kashyap, Prathosh A P
Computer Science
Machine Learning

We consider the problem of discovering a subgroup $H$ of the permutation group $S_{n}$. Unlike traditional $H$-invariant networks, in which $H$ is assumed to be known, we present a method to discover the underlying subgroup, provided it satisfies certain conditions. Our results show that one can discover any subgroup of type $S_{k}$ $(k \leq n)$ by learning an $S_{n}$-invariant function and a linear transformation. We also prove similar results for cyclic and dihedral subgroups. Finally, we provide a general theorem that can be extended to discover other subgroups of $S_{n}$. We demonstrate the applicability of our results through numerical experiments on image-digit-sum and symmetric polynomial regression tasks.
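
Concretely, the claimed decomposition (an $S_{n}$-invariant function composed with a learned linear transformation) has the shape of a DeepSets-style network applied to $Wx$. Below is a minimal numpy sketch of that shape; the sizes, parameter names, and random initialization are illustrative assumptions, not the paper's exact model, and in the paper's setting the subgroup would be read off from the learned $W$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 16  # illustrative sizes: input length n, feature width d

# Parameters (learned in practice; random here only to fix shapes).
W  = rng.standard_normal((n, n))  # the linear transformation to be discovered
W1 = rng.standard_normal((1, d))  # per-element encoder weights
b1 = rng.standard_normal(d)
W2 = rng.standard_normal(d)       # readout weights

def invariant_net(z):
    """S_n-invariant map via sum pooling (DeepSets form)."""
    h = np.tanh(z[:, None] * W1 + b1)   # per-element features, shape (n, d)
    return np.tanh(h.sum(axis=0)) @ W2  # pooling removes all order information

def f(x):
    """Composite model: S_n-invariant function of the learned linear map W x."""
    return invariant_net(W @ x)

# Sanity check: invariant_net cannot distinguish permutations of its input.
x = rng.standard_normal(n)
perm = rng.permutation(n)
z = W @ x
print(np.isclose(invariant_net(z), invariant_net(z[perm])))  # True
```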

Similar papers

A New Neural Network Architecture Invariant to the Action of Symmetry Subgroups

December 11, 2020

92% Match
Piotr Kicki, Mete Ozay, Piotr Skrzypczyński
Machine Learning
Artificial Intelligence

We propose a computationally efficient $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup $G \leq S_n$ of the symmetric group on input data. The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data. Theoretical considerations are supported by numerical experiments, which demonstrate the effectiveness and...
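
For contrast with the learned module, the brute-force way to obtain a $G$-invariant latent is to average an encoder over the $G$-orbit of the input (the Reynolds operator), which costs $|G|$ forward passes per input; the proposed transformation module targets the same kind of latent more cheaply. A toy sketch with a hypothetical cyclic subgroup and a stand-in encoder:

```python
import numpy as np

# Hypothetical subgroup G <= S_4: the cyclic shifts of four positions.
G = [np.roll(np.arange(4), s) for s in range(4)]

def g_invariant_latent(x, encoder):
    """Reynolds operator: average the encoder over the G-orbit of x.
    G-invariant by construction, but needs |G| passes per input."""
    return np.mean([encoder(x[g]) for g in G], axis=0)

encoder = np.tanh  # stand-in for a learned per-element encoder
x = np.random.default_rng(1).standard_normal(4)
z1 = g_invariant_latent(x, encoder)
z2 = g_invariant_latent(x[G[1]], encoder)  # act on x by a group element first
print(np.allclose(z1, z2))                 # True: the latent is G-invariant
```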


A Computationally Efficient Neural Network Invariant to the Action of Symmetry Subgroups

February 18, 2020

91% Match
Piotr Kicki, Mete Ozay, Piotr Skrzypczyński
Machine Learning
Neural and Evolutionary Computing
Machine Learning

We introduce a method to design a computationally efficient $G$-invariant neural network that approximates functions invariant to the action of a given permutation subgroup $G \leq S_n$ of the symmetric group on input data. The key element of the proposed network architecture is a new $G$-invariant transformation module, which produces a $G$-invariant latent representation of the input data. This latent representation is then processed with a multi-layer perceptron in the network...


A Unified Framework for Discovering Discrete Symmetries

September 6, 2023

90% Match
Pavan Karjol, Rohan Kashyap, ..., Prathosh A P
Machine Learning
Computer Vision and Pattern Recognition

We consider the problem of learning a function respecting a symmetry from among a class of symmetries. We develop a unified framework that enables symmetry discovery across a broad range of subgroups including locally symmetric, dihedral and cyclic subgroups. At the core of the framework is a novel architecture composed of linear, matrix-valued and non-linear functions that expresses functions invariant to these subgroups in a principled manner. The structure of the architecture...
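
The three named ingredients suggest a composite of the form $x \mapsto \rho\big(M(x)\,Wx\big)$, with $W$ linear, $M(\cdot)$ matrix-valued, and $\rho$ a non-linear invariant head. The wiring and parameterization below are assumptions for illustration only, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
W = rng.standard_normal((n, n))  # linear component (learned in practice)

def M(x):
    """Matrix-valued component: an input-dependent n x n matrix.
    This toy parameterization is hypothetical; the paper defines its own family."""
    return np.eye(n) + np.outer(np.tanh(x), np.tanh(x))

def rho(z):
    """Non-linear S_n-invariant head (sum-pooled, hence order-insensitive)."""
    return np.tanh(z).sum()

def f(x):
    return rho(M(x) @ (W @ x))

print(f(rng.standard_normal(n)))  # scalar output of the composite model
```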


Universal approximations of permutation invariant/equivariant functions by deep neural networks

March 5, 2019

90% Match
Akiyoshi Sannai, Yuuki Takai, Matthieu Cordonnier
Machine Learning

In this paper, we develop a theory of the relationship between $G$-invariant/equivariant functions and deep neural networks for a finite group $G$. In particular, for a given $G$-invariant/equivariant function, we construct a universal approximator given by a deep neural network whose layers are equipped with $G$-actions and whose affine transformations are $G$-equivariant/invariant. Using representation theory, we show that this approximator has exponentially fewer free parameters than usual...
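
One standard route to such results (stated here as background, not necessarily this paper's construction) is symmetrization: for any function $f$, the group average $\bar{f}(x) = \frac{1}{|G|}\sum_{g \in G} f(g \cdot x)$ is $G$-invariant, since acting on $x$ by any $h \in G$ merely permutes the terms of the sum; and if $f$ is already $G$-invariant then $\bar{f} = f$. Hence a network class rich enough to approximate arbitrary continuous functions, once symmetrized, approximates all invariant ones.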


Geometry of Linear Neural Networks: Equivariance and Invariance under Permutation Groups

September 24, 2023

89% Match
Kathlén Kohn, Anna-Laura Sattelberger, Vahid Shahverdi
Machine Learning
Algebraic Geometry

The set of functions parameterized by a linear fully-connected neural network is a determinantal variety. We investigate the subvariety of functions that are equivariant or invariant under the action of a permutation group. Examples of such group actions are translations or $90^\circ$ rotations on images. We describe such equivariant or invariant subvarieties as direct products of determinantal varieties, from which we deduce their dimension, degree, Euclidean distance degree...
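
The equivariance constraint on a single linear layer is concrete: the weight matrix must commute with every permutation matrix of the group. For the defining action of the full $S_n$, the solutions are exactly $W = aI + b\mathbf{1}\mathbf{1}^\top$ (a classical fact used here only as a toy check; the coefficients below are arbitrary):

```python
import numpy as np
from itertools import permutations

n = 4
W = 2.0 * np.eye(n) - 0.5 * np.ones((n, n))  # the S_n-equivariant form aI + bJ

def perm_matrix(p):
    P = np.zeros((n, n))
    P[np.arange(n), list(p)] = 1.0
    return P

# Equivariance of the linear map <=> W commutes with every permutation matrix.
print(all(np.allclose(W @ perm_matrix(p), perm_matrix(p) @ W)
          for p in permutations(range(n))))  # True
```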


Grokking Group Multiplication with Cosets

December 11, 2023

89% Match
Dashiell Stander, Qinan Yu, ..., Stella Biderman
Machine Learning
Artificial Intelligence
Representation Theory

We use the group Fourier transform over the symmetric group $S_n$ to reverse engineer a 1-layer feedforward network that has "grokked" the multiplication of $S_5$ and $S_6$. Each model discovers the true subgroup structure of the full group and converges on circuits that decompose the group multiplication into the multiplication of the group's conjugate subgroups. We demonstrate the value of using the symmetries of the data and models to understand their mechanisms and hold u...
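
The coset structure such circuits can exploit is easy to make concrete. A toy enumeration in $S_4$ (the paper's experiments use $S_5$ and $S_6$; the choice of subgroup here is hypothetical):

```python
from itertools import permutations

def compose(p, q):
    """Composition of permutations given as tuples: (p o q)(i) = p(q(i))."""
    return tuple(p[q[i]] for i in range(len(p)))

S4 = list(permutations(range(4)))
H = [p for p in S4 if p[3] == 3]  # a copy of S_3 inside S_4: fixes position 3

# The left cosets gH partition the group, so multiplication can be organized
# coset-by-coset instead of element-by-element.
cosets = {}
for g in S4:
    key = frozenset(compose(g, h) for h in H)
    cosets.setdefault(key, []).append(g)
print(len(S4), len(H), len(cosets))  # 24 elements, |H| = 6, index 24/6 = 4
```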


Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks

December 13, 2023

89% Match
Giovanni Luca Marchetti, Christopher Hillar, ..., Sophia Sanborn
Machine Learning
Artificial Intelligence
Signal Processing

In this work, we formally prove that, under certain conditions, if a neural network is invariant to a finite group then its weights recover the Fourier transform on that group. This provides a mathematical explanation for the emergence of Fourier features -- a ubiquitous phenomenon in both biological and artificial learning systems. The results hold even for non-commutative groups, in which case the Fourier transform encodes all the irreducible unitary group representations. ...
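
The cyclic group gives the simplest commutative instance: there the group Fourier transform is the ordinary DFT, and the Fourier basis is precisely the one that diagonalizes the group action. A quick numerical check of that background fact (illustrative, not the paper's proof):

```python
import numpy as np

n = 8
# Unitary DFT matrix: the Fourier transform of the cyclic group Z_n.
F = np.exp(-2j * np.pi * np.outer(np.arange(n), np.arange(n)) / n) / np.sqrt(n)
S = np.roll(np.eye(n), 1, axis=0)  # cyclic shift: generator of Z_n's action
D = F @ S @ np.conj(F).T           # conjugate the shift into the Fourier basis
print(np.allclose(D, np.diag(np.diag(D))))  # True: the action is diagonalized
```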


On the Universality of Invariant Networks

January 27, 2019

88% Match
Haggai Maron, Ethan Fetaya, ..., Yaron Lipman
Machine Learning

Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received little attention to date: Can these networks approximate any (continuous) invariant function? We tackle the rather general case where $G\leq S_n$ (an arbitrary subgroup of the symmetric group) that acts ...
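
The constrained layers in question can be produced by a standard symmetrization: project an arbitrary weight matrix onto the commutant of the group action by averaging, $\bar{W} = \frac{1}{|G|}\sum_{g} P_g^\top W P_g$. A sketch with a hypothetical cyclic $G \leq S_5$:

```python
import numpy as np

G = [np.roll(np.arange(5), s) for s in range(5)]  # hypothetical cyclic G <= S_5

def perm_matrix(p):
    P = np.zeros((len(p), len(p)))
    P[np.arange(len(p)), p] = 1.0
    return P

rng = np.random.default_rng(3)
W = rng.standard_normal((5, 5))                   # unconstrained weights
W_bar = np.mean([perm_matrix(p).T @ W @ perm_matrix(p) for p in G], axis=0)

# The averaged matrix commutes with every P_g: the layer is G-equivariant.
print(all(np.allclose(W_bar @ perm_matrix(p), perm_matrix(p) @ W_bar)
          for p in G))  # True
```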


On the hardness of learning under symmetries

January 3, 2024

88% Match
Bobak T. Kiani, Thien Le, Hannah Lawrence, ..., Melanie Weber
Machine Learning
Data Structures and Algorithms
Statistics Theory

We study the problem of learning equivariant neural networks via gradient descent. The incorporation of known symmetries ("equivariance") into neural nets has empirically improved the performance of learning pipelines, in domains ranging from biology to computer vision. However, a rich yet separate line of learning theoretic research has demonstrated that actually learning shallow, fully-connected (i.e. non-symmetric) networks has exponential complexity in the correlational s...


Universal Equivariant Multilayer Perceptrons

February 7, 2020

88% Match
Siamak Ravanbakhsh
Machine Learning
Neural and Evolutionary Computing
Group Theory

Group invariant and equivariant Multilayer Perceptrons (MLP), also known as Equivariant Networks, have achieved remarkable success in learning on a variety of data structures, such as sequences, images, sets, and graphs. Using tools from group theory, this paper proves the universality of a broad class of equivariant MLPs with a single hidden layer. In particular, it is shown that having a hidden layer on which the group acts regularly is sufficient for universal equivariance...
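
A hidden layer on which the group acts regularly is exactly the group-convolution picture: hidden units indexed by group elements, so that acting on the input permutes the hidden units among themselves. A minimal sketch for the cyclic group $\mathbb{Z}_n$ (circular convolution), one concrete instance of the regular action, with arbitrary weights:

```python
import numpy as np

n = 6
rng = np.random.default_rng(4)
w = rng.standard_normal(n)  # one filter, indexed by the group elements of Z_n

def layer(x):
    """Group convolution on Z_n: h[g] = sum_u w[(u - g) mod n] * x[u]."""
    return np.array([sum(w[(u - g) % n] * x[u] for u in range(n))
                     for g in range(n)])

x = rng.standard_normal(n)
# Regular action: shifting the input shifts the hidden units by the same amount.
print(np.allclose(layer(np.roll(x, 2)), np.roll(layer(x), 2)))  # True
```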
