ID: 1905.02263

Learning Algebraic Structures: Preliminary Investigations

May 2, 2019


Similar papers 3

Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks

December 13, 2023

84% Match
Giovanni Luca Marchetti, Christopher Hillar, ... , Sophia Sanborn
Machine Learning
Artificial Intelligence
Signal Processing

In this work, we formally prove that, under certain conditions, if a neural network is invariant to a finite group then its weights recover the Fourier transform on that group. This provides a mathematical explanation for the emergence of Fourier features -- a ubiquitous phenomenon in both biological and artificial learning systems. The results hold even for non-commutative groups, in which case the Fourier transform encodes all the irreducible unitary group representations. ...
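A toy illustration of the commutative case (a hypothetical sketch, not code from the paper): for the cyclic group $Z_n$ the group Fourier transform is the ordinary DFT, the characters $\chi_k(j) = e^{2\pi i jk/n}$ are orthogonal, and every shift-invariant (circulant) linear map is diagonalized by them.

```python
import cmath

n = 4
# Characters of Z_n: chi_k(j) = exp(2*pi*i*j*k/n)
chars = [[cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)]
         for k in range(n)]

def inner(u, v):
    """Hermitian inner product on C^n."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

# Orthogonality: <chi_k, chi_l> = n if k == l else 0.
for k in range(n):
    for l in range(n):
        expect = n if k == l else 0
        assert abs(inner(chars[k], chars[l]) - expect) < 1e-9

# A shift-invariant (circulant) operator: (Cv)[j] = sum_m c[m] * v[j - m].
c = [1.0, 2.0, 0.5, -1.0]
def apply_circulant(v):
    return [sum(c[m] * v[(j - m) % n] for m in range(n)) for j in range(n)]

# Each character is an eigenvector; the eigenvalue is the DFT of c at k,
# since sum_m c[m] chi_k(j - m) = chi_k(j) * sum_m c[m] chi_k(-m).
for k in range(n):
    out = apply_circulant(chars[k])
    eig = sum(c[m] * cmath.exp(-2j * cmath.pi * m * k / n) for m in range(n))
    assert all(abs(out[j] - eig * chars[k][j]) < 1e-9 for j in range(n))
```

For non-commutative groups the characters are replaced by irreducible unitary representations, which is the setting the paper addresses.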


Minsky machines and algorithmic problems

April 29, 2015

84% Match
Mark Sapir
Group Theory

This is a survey of using Minsky machines to study algorithmic problems in semigroups, groups and other algebraic systems.


Algorithms for groups

June 15, 1994

83% Match
John Cannon, George Havas
Group Theory

Group theory is a particularly fertile field for the design of practical algorithms. Algorithms have been developed across the various branches of the subject and they find wide application. Because of its relative maturity, computational group theory may be used to gain insight into the general structure of algebraic algorithms. This paper examines the basic ideas behind some of the more important algorithms for finitely presented groups and permutation groups, and surveys r...
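One of the basic ideas behind permutation-group algorithms of the kind this survey covers is the orbit computation, a breadth-first search over generator images. A minimal hypothetical sketch (permutations as tuples, `p[i]` the image of point `i`):

```python
from collections import deque

def orbit(point, generators):
    """Orbit of `point` under the group generated by `generators` (BFS)."""
    seen = {point}
    queue = deque([point])
    while queue:
        x = queue.popleft()
        for g in generators:
            y = g[x]
            if y not in seen:
                seen.add(y)
                queue.append(y)
    return seen

# A cyclic shift and a transposition on {0,...,3} generate S_4,
# so the action is transitive: one orbit of size 4.
shift = (1, 2, 3, 0)
swap = (1, 0, 2, 3)
assert orbit(0, [shift, swap]) == {0, 1, 2, 3}
```

Orbit computations of this shape underlie stabilizer chains and many of the permutation-group algorithms the paper surveys.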


Neural Discovery of Permutation Subgroups

September 11, 2023

83% Match
Pavan Karjol, Rohan Kashyap, Prathosh A P
Machine Learning

We consider the problem of discovering a subgroup $H$ of the permutation group $S_{n}$. Unlike traditional $H$-invariant networks, in which $H$ is assumed to be known, we present a method to discover the underlying subgroup, provided it satisfies certain conditions. Our results show that one can discover any subgroup of type $S_{k}$ ($k \leq n$) by learning an $S_{n}$-invariant function and a linear transformation. We also prove similar results for cyclic and dihedral subgroups...


Towards a Mathematical Understanding of Neural Network-Based Machine Learning: what we know and what we don't

September 22, 2020

83% Match
Weinan E, Chao Ma, ... , Lei Wu
Machine Learning
Numerical Analysis

The purpose of this article is to review the achievements made in the last few years towards an understanding of the reasons behind the success and subtleties of neural network-based machine learning. In the tradition of good old applied mathematics, we will give attention not only to rigorous mathematical results, but also to the insight we have gained from careful numerical experiments as well as from the analysis of simplified models. Along the way, we also list the open problems...


Algebraic Machine Learning with an Application to Chemistry

May 11, 2022

83% Match
Ezzeddine El Sai, Parker Gara, Markus J. Pflaum
Algebraic Geometry
Computational Geometry
Machine Learning
Mathematical Physics

As datasets used in scientific applications become more complex, studying the geometry and topology of data has become an increasingly prevalent part of the data analysis process. This can be seen for example with the growing interest in topological tools such as persistent homology. However, on the one hand, topological tools are inherently limited to providing only coarse information about the underlying space of the data. On the other hand, more geometric approaches rely p...


What do AI algorithms actually learn? - On false structures in deep learning

June 4, 2019

83% Match
Laura Thesing, Vegard Antun, Anders C. Hansen
Machine Learning
Cryptography and Security
Computer Vision and Pattern Recognition

There are two big unsolved mathematical questions in artificial intelligence (AI): (1) Why is deep learning so successful in classification problems and (2) why are neural nets based on deep learning at the same time universally unstable, where the instabilities make the networks vulnerable to adversarial attacks. We present a solution to these questions that can be summed up in two words; false structures. Indeed, deep learning does not learn the original structures that hum...


Applications of AI for Magic Squares

February 3, 2016

83% Match
Jared Weed
History and Overview

In recreational mathematics, a normal magic square is an $n \times n$ square matrix whose entries are precisely the integers $1, \ldots, n^2$, such that each row, each column, and the major and minor traces sum to a single constant $\mu$. It has been proven that there are 7,040 fourth order normal magic squares and 2,202,441,792 fifth order normal magic squares, with higher orders unconfirmed. Previous work related to fourth order normal squares has shown that symmetries such as the dihedra...
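The definition above is easy to check mechanically. A hedged sketch, verifying the property on Dürer's classical fourth order normal magic square, whose magic constant is $\mu = n(n^2+1)/2 = 34$:

```python
# Duerer's 4x4 normal magic square (from the 1514 engraving "Melencolia I").
square = [
    [16,  3,  2, 13],
    [ 5, 10, 11,  8],
    [ 9,  6,  7, 12],
    [ 4, 15, 14,  1],
]
n = len(square)
mu = n * (n * n + 1) // 2  # magic constant: 34 for n = 4

# Entries are exactly the integers 1..n^2, each used once.
assert sorted(v for row in square for v in row) == list(range(1, n * n + 1))
# Every row, every column, and both traces sum to mu.
assert all(sum(row) == mu for row in square)
assert all(sum(square[i][j] for i in range(n)) == mu for j in range(n))
assert sum(square[i][i] for i in range(n)) == mu          # major trace
assert sum(square[i][n - 1 - i] for i in range(n)) == mu  # minor trace
```

Enumerating all 7,040 fourth order squares amounts to searching the space of such matrices under these constraints.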


Linear Space Data Structures for Finite Groups with Constant Query-time

March 3, 2023

83% Match
Bireswar Das, Anant Kumar, ... , Dhara Thakkar
Data Structures and Algorithms
Discrete Mathematics
Combinatorics
Group Theory

A finite group of order $n$ can be represented by its Cayley table. In the word-RAM model the Cayley table of a group of order $n$ can be stored using $O(n^2)$ words and can be used to answer a multiplication query in constant time. It is interesting to ask if we can design a data structure to store a group of order $n$ that uses $o(n^2)$ space but can still answer a multiplication query in constant time. We design a constant query-time data structure that can store any fin...
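The $O(n^2)$-word baseline the abstract describes is just the stored Cayley table: one array lookup answers a multiplication query in constant time. A minimal sketch, using $Z_6$ under addition mod 6 as a stand-in group (the paper's question is whether $o(n^2)$ space suffices while keeping constant query time):

```python
# Baseline: store the full n x n Cayley table, O(n^2) words.
n = 6
cayley = [[(a + b) % n for b in range(n)] for a in range(n)]

def multiply(a, b):
    """Answer a multiplication query with a single table lookup: O(1)."""
    return cayley[a][b]

assert multiply(4, 5) == 3   # 4 + 5 = 9 = 3 (mod 6)
assert multiply(0, 2) == 2   # 0 is the identity
```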


Machine-Learning Kronecker Coefficients

June 7, 2023

83% Match
Kyu-Hwan Lee
Representation Theory
Combinatorics
Machine Learning

The Kronecker coefficients are the decomposition multiplicities of the tensor product of two irreducible representations of the symmetric group. Unlike the Littlewood--Richardson coefficients, which are the analogues for the general linear group, there is no known combinatorial description of the Kronecker coefficients, and it is an NP-hard problem to decide whether a given Kronecker coefficient is zero or not. In this paper, we show that standard machine-learning algorithms ...
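For very small symmetric groups the Kronecker coefficients can be computed directly from the character table via $g(\lambda,\mu,\nu) = \frac{1}{|G|}\sum_{g} \chi_\lambda(g)\chi_\mu(g)\chi_\nu(g)$ — a toy sketch for $S_3$ (not the paper's machine-learning approach, which targets the regime where no such direct description scales):

```python
# Character table of S_3, by conjugacy class:
# identity (size 1), transpositions (size 3), 3-cycles (size 2).
class_sizes = [1, 3, 2]
chars = {
    "trivial":  [1,  1,  1],
    "sign":     [1, -1,  1],
    "standard": [2,  0, -1],
}
order = sum(class_sizes)  # |S_3| = 6

def kronecker(lam, mu, nu):
    """g(lam, mu, nu) = (1/|G|) * sum over classes of size * product of characters."""
    total = sum(s * chars[lam][i] * chars[mu][i] * chars[nu][i]
                for i, s in enumerate(class_sizes))
    return total // order

# standard (x) standard decomposes as trivial + sign + standard:
assert kronecker("standard", "standard", "trivial") == 1
assert kronecker("standard", "standard", "sign") == 1
assert kronecker("standard", "standard", "standard") == 1
assert kronecker("trivial", "sign", "trivial") == 0
```

Character tables of $S_n$ themselves grow super-polynomially, which is one reason deciding positivity of a Kronecker coefficient is hard in general.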
