ID: 2306.04734

Machine-Learning Kronecker Coefficients

June 7, 2023


Similar papers (page 3)

Learning Algebraic Structures: Preliminary Investigations

May 2, 2019

83% Match
Yang-Hui He, Minhyong Kim
Machine Learning
Group Theory
Rings and Algebras

We employ techniques of machine-learning, exemplified by support vector machines and neural classifiers, to initiate the study of whether AI can "learn" algebraic structures. Using finite groups and finite rings as a concrete playground, we find that questions such as identification of simple groups by "looking" at the Cayley table or correctly matching addition and multiplication tables for finite rings can, at least for structures of small size, be performed by the AI, even...
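The setup this abstract describes, predicting a group-theoretic property directly from a (relabelled) Cayley table, is straightforward to prototype. The sketch below is illustrative and not the authors' experiment: it assumes numpy and scikit-learn are available, builds Cayley tables for two abelian groups of order 8 (Z8 and (Z2)^3) and the non-abelian dihedral group D4, relabels elements at random, and trains an RBF-kernel SVM to separate abelian from non-abelian tables.

```python
# Minimal sketch in the spirit of the abstract (illustrative choices throughout,
# not the paper's exact setup): learn "abelian vs non-abelian" from relabelled
# Cayley tables of groups of order 8 with an off-the-shelf SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

N = 8  # group order

def cyclic():                # Z8
    return [[(i + j) % N for j in range(N)] for i in range(N)]

def elementary_abelian():    # (Z2)^3, product is bitwise XOR
    return [[i ^ j for j in range(N)] for i in range(N)]

def dihedral():              # D4 = <r, s | r^4 = s^2 = 1, s r s = r^-1>
    def mul(x, y):
        a1, b1 = x % 4, x // 4          # x encodes r^a1 s^b1
        a2, b2 = y % 4, y // 4
        a = (a1 + (a2 if b1 == 0 else -a2)) % 4
        return a + 4 * (b1 ^ b2)
    return [[mul(i, j) for j in range(N)] for i in range(N)]

def relabel(table, rng):
    """Apply a random relabelling pi of the elements: pi(i*j) = pi(i)*pi(j)."""
    pi = rng.permutation(N)
    out = np.empty((N, N), dtype=int)
    for i in range(N):
        for j in range(N):
            out[pi[i], pi[j]] = pi[table[i][j]]
    return out.flatten()

rng = np.random.default_rng(0)
X, y = [], []
for table, label in [(cyclic(), 0), (elementary_abelian(), 0), (dihedral(), 1)]:
    for _ in range(500):
        X.append(relabel(table, rng))
        y.append(label)                  # 0 = abelian, 1 = non-abelian

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.2, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```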


All Kronecker coefficients are reduced Kronecker coefficients

May 4, 2023

83% Match
Christian Ikenmeyer, Greta Panova
Combinatorics
Computational Complexity
Representation Theory

We settle the question of where exactly the reduced Kronecker coefficients lie on the spectrum between the Littlewood-Richardson and Kronecker coefficients by showing that every Kronecker coefficient of the symmetric group is equal to a reduced Kronecker coefficient by an explicit construction. This implies the equivalence of a question by Stanley from 2000 and a question by Kirillov from 2004 about combinatorial interpretations of these two families of coefficients. Moreover...
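For readers outside the area: the "reduced Kronecker coefficients" here are Murnaghan's stable limits of ordinary Kronecker coefficients. The display below is standard background in my own notation, not a formula quoted from the paper.

```latex
% Standard background (my phrasing): pad each partition alpha with a long
% first row, alpha[n] := (n - |alpha|, alpha_1, alpha_2, ...). Murnaghan's
% theorem says the Kronecker coefficients below eventually stabilise, and the
% stable value is the reduced Kronecker coefficient:
\[
  \overline{g}_{\alpha,\beta,\gamma}
  \;=\;
  g\bigl(\alpha[n],\, \beta[n],\, \gamma[n]\bigr)
  \quad\text{for all sufficiently large } n .
\]
```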


Sample, computation vs storage tradeoffs for classification using tensor subspace models

June 18, 2017

83% Match
Mohammadhossein Chaghazardi, Shuchin Aeron
Machine Learning

In this paper, we exhibit the tradeoffs between the (training) sample, computation and storage complexity for the problem of supervised classification using signal subspace estimation. Our main tool is the use of tensor subspaces, i.e. subspaces with a Kronecker structure, for embedding the data into lower dimensions. Among the subspaces with a Kronecker structure, we show that using subspaces with a hierarchical structure for representing data leads to improved tradeoffs. On...
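A minimal numpy sketch of the core ingredient as stated, an embedding subspace with Kronecker structure: projecting onto U1 ⊗ U2 can be done with two small matrix products, without ever forming the large Kronecker factor, and only the two small factors need to be stored. The dimensions below are arbitrary and the hierarchical refinement the abstract alludes to is not shown.

```python
# Illustrative sketch of a Kronecker-structured subspace embedding (assumed
# dimensions; not the paper's hierarchical construction).
import numpy as np

d1, d2 = 16, 12      # the ambient space is R^(d1*d2)
k1, k2 = 4, 3        # factor subspace dimensions, overall rank k1*k2

rng = np.random.default_rng(0)
U1 = np.linalg.qr(rng.standard_normal((d1, k1)))[0]   # orthonormal d1 x k1
U2 = np.linalg.qr(rng.standard_normal((d2, k2)))[0]   # orthonormal d2 x k2

x = rng.standard_normal(d1 * d2)          # one data sample, vectorised
X = x.reshape(d1, d2)                     # ... or viewed as a d1 x d2 matrix

# Dense projection with the full (d1*d2) x (k1*k2) Kronecker factor ...
z_dense = np.kron(U1, U2).T @ x
# ... equals the cheap two-sided projection that never forms the big matrix.
z_structured = (U1.T @ X @ U2).reshape(-1)
assert np.allclose(z_dense, z_structured)

# Storage: d1*k1 + d2*k2 numbers instead of (d1*d2)*(k1*k2).
print(U1.size + U2.size, "vs", d1 * d2 * k1 * k2)
```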


Machine Learning and Computational Mathematics

September 24, 2020

83% Match
Weinan E
Numerical Analysis
Machine Learning

Neural network-based machine learning is capable of approximating functions in very high dimension with unprecedented efficiency and accuracy. This has opened up many exciting new possibilities, not just in traditional areas of artificial intelligence, but also in scientific computing and computational science. At the same time, machine learning has also acquired the reputation of being a set of "black box" type of tricks, without fundamental principles. This has been a real ...


The co-Pieri rule for Kronecker coefficients

October 12, 2017

83% Match
C. Bowman, M. De Visscher, J. Enyang
Representation Theory
Combinatorics

A fundamental problem in the representation theory of the symmetric group, Sn, is to describe the coefficients in the decomposition of a tensor product of two simple representations. These coefficients are known in the literature as the Kronecker coefficients. The Littlewood--Richardson coefficients appear as an important subfamily of the wider class of stable Kronecker coefficients. This subfamily of coefficients can be calculated using a tableaux counting algorithm known as...
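The decomposition this abstract describes in words can be written out explicitly; the display below is standard notation, not quoted from the paper.

```latex
% Standard definition: for partitions mu, nu of n, the Kronecker coefficients
% g(lambda, mu, nu) are the multiplicities in the (pointwise) product of
% irreducible S_n-characters,
\[
  \chi^{\mu} \otimes \chi^{\nu}
  \;=\;
  \sum_{\lambda \vdash n} g(\lambda,\mu,\nu)\, \chi^{\lambda},
\]
% where chi^lambda denotes the irreducible character of S_n indexed by lambda.
```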


Metric Transforms and Low Rank Matrices via Representation Theory of the Real Hyperrectangle

November 23, 2020

82% Match
Josh Alman, Timothy Chu, Gary Miller, Shyam Narayanan, ... , Zhao Song
Computational Geometry
Machine Learning
Metric Geometry

In this paper, we develop a new technique which we call representation theory of the real hyperrectangle, which describes how to compute the eigenvectors and eigenvalues of certain matrices arising from hyperrectangles. We show that these matrices arise naturally when analyzing a number of different algorithmic tasks such as kernel methods, neural network training, natural language processing, and the design of algorithms using the polynomial method. We then use our new techn...


Machine Learning Algebraic Geometry for Physics

April 21, 2022

82% Match
Jiakang Bao, Yang-Hui He, ... , Edward Hirst
Algebraic Geometry
Machine Learning

We review some recent applications of machine learning to algebraic geometry and physics. Since problems in algebraic geometry can typically be reformulated as mappings between tensors, this makes them particularly amenable to supervised learning. Additionally, unsupervised methods can provide insight into the structure of such geometrical data. At the heart of this programme is the question of how geometry can be machine learned, and indeed how AI helps one to do mathematics...


Weight Matrix Dimensionality Reduction in Deep Learning via Kronecker Multi-layer Architectures

April 8, 2022

82% Match
Jarom D. Hogue, Robert M. Kirby, Akil Narayan
Machine Learning
Numerical Analysis

Deep learning using neural networks is an effective technique for generating models of complex data. However, training such models can be expensive when networks have large model capacity resulting from a large number of layers and nodes. For training in such a computationally prohibitive regime, dimensionality reduction techniques ease the computational burden, and allow implementations of more robust networks. We propose a novel type of such dimensionality reduction via a n...
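The premise, replacing a large dense weight matrix by a Kronecker product of small factors, is easy to illustrate. The numpy toy below is a generic sketch: the layer sizes, the plain ReLU forward pass, and the helper kron_layer are illustrative choices, not the paper's proposed architecture.

```python
# Hedged sketch of a "Kronecker layer": the dense hidden weight matrix is
# replaced by two small factors A and B, applied via reshapes, so the large
# matrix A (x) B is never materialised. All sizes are illustrative.
import numpy as np

def kron_layer(x, A, B):
    """Compute (A (x) B) @ x via the identity (A (x) B) vec(X) = vec(A X B^T)."""
    n1, n2 = A.shape[1], B.shape[1]
    X = x.reshape(n1, n2)
    return (A @ X @ B.T).reshape(-1)

rng = np.random.default_rng(0)
# Hidden layer maps R^(20*15)=R^300 -> R^(10*8)=R^80 with 10*20 + 8*15 = 320
# parameters instead of 300*80 = 24000 for a dense weight matrix.
A = 0.1 * rng.standard_normal((10, 20))
B = 0.1 * rng.standard_normal((8, 15))
W_out = 0.1 * rng.standard_normal((3, 80))      # small dense readout

x = rng.standard_normal(300)
h = np.maximum(kron_layer(x, A, B), 0.0)        # ReLU activation
logits = W_out @ h
print(logits)
```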


Computing Multiplicities of Lie Group Representations

April 19, 2012

82% Match
Matthias Christandl, Brent Doran, Michael Walter
Computational Complexity
Representation Theory

For fixed compact connected Lie groups H \subseteq G, we provide a polynomial time algorithm to compute the multiplicity of a given irreducible representation of H in the restriction of an irreducible representation of G. Our algorithm is based on a finite difference formula which makes the multiplicities amenable to Barvinok's algorithm for counting integral points in polytopes. The Kronecker coefficients of the symmetric group, which can be seen to be a special case of su...
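For contrast with the polynomial-time, polytope-based approach described here, the sketch below computes Kronecker coefficients of the symmetric group for small n the naive way: the character-sum formula, with characters obtained from the Murnaghan-Nakayama rule. It is exponential in n and only meant to make the quantity concrete; all conventions (beta-sets, partition encoding) are illustrative choices, not taken from the paper.

```python
# Naive baseline, NOT the paper's algorithm: g(lam, mu, nu) =
# (1/n!) * sum_rho |C_rho| chi_lam(rho) chi_mu(rho) chi_nu(rho),
# with chi computed by the Murnaghan-Nakayama rule on beta-sets.
from collections import Counter
from functools import lru_cache
from math import factorial

def partitions(n, max_part=None):
    """All partitions of n as weakly decreasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def beta_set(lam, length):
    """Beta numbers (first-column hook lengths) of lam, padded to `length` rows."""
    rows = list(lam) + [0] * (length - len(lam))
    return tuple(rows[i] + (length - 1 - i) for i in range(length))

def partition_from_beta(beta, length):
    b = sorted(beta, reverse=True)
    parts = [b[i] - (length - 1 - i) for i in range(length)]
    return tuple(p for p in parts if p > 0)

@lru_cache(maxsize=None)
def character(lam, rho):
    """Character chi_lam evaluated on cycle type rho (Murnaghan-Nakayama)."""
    if not rho:
        return 1 if not lam else 0
    k, rest = rho[0], rho[1:]
    length = sum(lam)                    # enough rows for any sub-partition
    beta = beta_set(lam, length)
    bset = set(beta)
    total = 0
    for b in beta:                       # remove one rim hook of size k
        if b - k >= 0 and (b - k) not in bset:
            height = sum(1 for c in beta if b - k < c < b)   # leg length
            new_beta = tuple((b - k) if c == b else c for c in beta)
            total += (-1) ** height * character(partition_from_beta(new_beta, length), rest)
    return total

def class_size(rho):
    """Number of permutations of cycle type rho."""
    denom = 1
    for part, mult in Counter(rho).items():
        denom *= part ** mult * factorial(mult)
    return factorial(sum(rho)) // denom

def kronecker_coefficient(lam, mu, nu):
    n = sum(lam)
    assert sum(mu) == n and sum(nu) == n
    total = sum(class_size(rho) * character(lam, rho) * character(mu, rho) * character(nu, rho)
                for rho in partitions(n))
    return total // factorial(n)

print(kronecker_coefficient((2, 1), (2, 1), (2, 1)))   # -> 1
```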


Learning in High-Dimensional Feature Spaces Using ANOVA-Based Fast Matrix-Vector Multiplication

November 19, 2021

82% Match
Franziska Nestler, Martin Stoll, Theresa Wagner
Machine Learning

Kernel matrices are crucial in many learning tasks such as support vector machines or kernel ridge regression. The kernel matrix is typically dense and large-scale. Depending on the dimension of the feature space even the computation of all of its entries in reasonable time becomes a challenging task. For such dense matrices the cost of a matrix-vector product scales quadratically with the dimensionality N , if no customized methods are applied. We propose the use of an ANOVA...
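A small numpy sketch of the additive ANOVA-style kernel idea as it reads here: sum Gaussian kernels over small groups of features, then face the dense O(N^2) kernel matrix-vector product that the paper sets out to accelerate. The grouping into consecutive windows of three features and the bandwidth are illustrative assumptions, and the NFFT-based fast multiplication itself is not reproduced.

```python
# Illustrative additive ANOVA-style kernel: one low-dimensional Gaussian kernel
# per feature group, summed. The dense matvec below is the O(N^2) bottleneck.
import numpy as np

def gaussian_kernel(X, sigma):
    """Dense Gaussian kernel matrix for the rows of X (N x p)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def anova_kernel(X, sigma, window=3):
    """Sum of low-dimensional Gaussian kernels over consecutive feature groups."""
    N, d = X.shape
    K = np.zeros((N, N))
    for start in range(0, d, window):
        K += gaussian_kernel(X[:, start:start + window], sigma)
    return K

rng = np.random.default_rng(0)
N, d = 500, 9
X = rng.standard_normal((N, d))
K = anova_kernel(X, sigma=1.0)

v = rng.standard_normal(N)
Kv = K @ v      # the matrix-vector product whose quadratic cost is the bottleneck
print(Kv[:5])
```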
