ID: 2202.02164

Group invariant machine learning by fundamental domain projections

February 4, 2022

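The headline paper's title describes a preprocessing route to invariance: map every input to a canonical representative of its group orbit, i.e. a point in a fundamental domain, so that any downstream model becomes invariant by construction. A minimal sketch for the symmetric group $S_n$ acting by permutation, where sorting is such a projection (an illustrative example, not the paper's construction):

```python
import numpy as np

def project_to_fundamental_domain(x: np.ndarray) -> np.ndarray:
    """Canonical representative of the S_n-orbit of x.

    The region {x : x_1 <= x_2 <= ... <= x_n} is a fundamental domain
    for S_n acting by permutation; sorting projects onto it.
    """
    return np.sort(x)

x = np.array([3.0, 1.0, 2.0])
gx = np.array([1.0, 2.0, 3.0])  # a permuted copy of x
# Both orbit members map to the same representative, so any model
# applied after the projection is permutation invariant by construction.
assert np.allclose(project_to_fundamental_domain(x),
                   project_to_fundamental_domain(gx))
```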

Similar papers 2

Group-invariant tensor train networks for supervised learning

June 30, 2022

86% Match
Brent Sprangers, Nick Vannieuwenhoven
Machine Learning
Numerical Analysis

Invariance has recently proven to be a powerful inductive bias in machine learning models. One such class of predictive or generative models is tensor networks. We introduce a new numerical algorithm to construct a basis of tensors that are invariant under the action of normal matrix representations of an arbitrary discrete group. This method can be up to several orders of magnitude faster than previous approaches. The group-invariant tensors are then combined into a group-i...
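For context, the baseline way to obtain such an invariant tensor basis is group averaging: the Reynolds projector $P = \frac{1}{|G|}\sum_{g \in G} \rho(g)^{\otimes k}$ projects onto the invariant subspace, and any orthonormal basis of its range is a basis of invariant tensors. The sketch below implements this naive baseline (assuming an orthogonal representation of the full group, so $P$ is symmetric); the paper's contribution is a much faster construction, which this does not reproduce.

```python
import numpy as np

def invariant_tensor_basis(reps, order, tol=1e-10):
    """Orthonormal basis of `order`-way tensors fixed by the Kronecker
    action of every matrix in `reps` (the full finite group, assumed
    orthogonal)."""
    n = reps[0].shape[0]
    P = np.zeros((n ** order, n ** order))
    for R in reps:
        Rk = R
        for _ in range(order - 1):
            Rk = np.kron(Rk, R)  # rho(g) acting on each tensor factor
        P += Rk
    P /= len(reps)  # Reynolds projector onto the invariant subspace
    # Invariant tensors span the eigenvalue-1 eigenspace of P.
    w, V = np.linalg.eigh(P)
    return V[:, w > 1 - tol].T.reshape(-1, *([n] * order))

# C2 acting on R^2 by swapping coordinates: invariant 2-tensors satisfy
# T[0,0] == T[1,1] and T[0,1] == T[1,0], a 2-dimensional space.
swap = np.array([[0.0, 1.0], [1.0, 0.0]])
basis = invariant_tensor_basis([np.eye(2), swap], order=2)
print(basis.shape)  # (2, 2, 2)
```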


Machine learning for complete intersection Calabi-Yau manifolds: a methodological study

July 30, 2020

86% Match
Harold Erbin, Riccardo Finotello
Machine Learning
Algebraic Geometry

We revisit the question of predicting both Hodge numbers $h^{1,1}$ and $h^{2,1}$ of complete intersection Calabi-Yau (CICY) 3-folds using machine learning (ML), considering both the old and new datasets built respectively by Candelas-Dale-Lutken-Schimmrigk / Green-H\"ubsch-Lutken and by Anderson-Gao-Gray-Lee. In real-world applications, implementing an ML system rarely reduces to feeding the raw data to the algorithm. Instead, the typical workflow starts with an exploratory dat...


Local Group Invariant Representations via Orbit Embeddings

December 6, 2016

85% Match
Anant Raj, Abhishek Kumar, Youssef Mroueh, ... , Bernhard Schölkopf
Machine Learning

Invariance to nuisance transformations is one of the desirable properties of effective representations. We consider transformations that form a \emph{group} and propose an approach based on kernel methods to derive local group invariant representations. Locality is achieved by defining a suitable probability distribution over the group which in turn induces distributions in the input feature space. We learn a decision function over these distributions by appealing to the powe...
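A minimal sketch of the orbit-embedding idea as described: fix a distribution over the group (here, small planar rotations with normally distributed angles, which gives locality), push inputs through sampled group elements, and compare the induced distributions by averaging a base RBF kernel, in the spirit of kernel mean embeddings. All names and parameter values are illustrative assumptions, not the paper's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def rotate(x, theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ x

def rbf(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

def orbit_kernel(x, y, sigma=0.3, n=64):
    """K(x, y) ~= E_{g,h} k(g.x, h.y): the base kernel averaged over a
    local distribution on the rotation group (angles ~ N(0, sigma^2))."""
    gx = [rotate(x, t) for t in rng.normal(0.0, sigma, n)]
    hy = [rotate(y, t) for t in rng.normal(0.0, sigma, n)]
    return np.mean([rbf(a, b) for a in gx for b in hy])

x = np.array([1.0, 0.0])
y = rotate(x, 0.2)  # a slightly rotated copy of x
# The averaged kernel is far less sensitive to the nuisance rotation
# than the base kernel evaluated on the raw points.
print(orbit_kernel(x, y), rbf(x, y))
```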


Unsupervised Learning of Group Invariant and Equivariant Representations

February 15, 2022

85% Match
Robin Winter, Marco Bertolini, Tuan Le, ... , Djork-Arné Clevert
Machine Learning

Equivariant neural networks, whose hidden features transform according to representations of a group G acting on the data, exhibit training efficiency and improved generalisation performance. In this work, we extend group invariant and equivariant representation learning to the field of unsupervised deep learning. We propose a general learning strategy based on an encoder-decoder framework in which the latent representation is separated into an invariant term and an equivari...
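A toy structural sketch of that split, for planar points under SO(2): the latent code is a pair (invariant part, group element); the decoder reconstructs a canonical sample and then acts with the group element. The "encoder" here is hand-crafted for illustration only; the paper learns this decomposition.

```python
import numpy as np

def rotate(x, theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]]) @ x

def encode(x):
    """Toy split: the radius is the SO(2)-invariant part of the latent,
    the polar angle is the equivariant (group) part."""
    return np.linalg.norm(x), np.arctan2(x[1], x[0])

def decode(z_inv, theta):
    canonical = np.array([z_inv, 0.0])  # decode the invariant part...
    return rotate(canonical, theta)     # ...then act with the group part

x = np.array([0.6, 0.8])
z_inv, theta = encode(x)
assert np.allclose(decode(z_inv, theta), x)   # faithful reconstruction
# Acting on the input only moves the group part of the latent:
z_inv2, _ = encode(rotate(x, 0.5))
assert np.isclose(z_inv2, z_inv)
```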


Low Dimensional Invariant Embeddings for Universal Geometric Learning

May 5, 2022

85% Match
Nadav Dym, Steven J. Gortler
Machine Learning
Algebraic Geometry

This paper studies separating invariants: mappings on $D$-dimensional domains which are invariant to an appropriate group action, and which separate orbits. The motivation for this study comes from the usefulness of separating invariants in proving universality of equivariant neural network architectures. We observe that in several cases the cardinality of separating invariants proposed in the machine learning literature is much larger than the dimension $D$. As a result, t...
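A concrete instance of a separating invariant whose cardinality matches the dimension: for $S_n$ acting on $\mathbb{R}^n$ by permutation, the $n$ power sums $p_k(x) = \sum_i x_i^k$, $k = 1, \dots, n$, separate orbits. A quick check (illustrative; not taken from the paper):

```python
import numpy as np

def power_sums(x):
    """The n power sums p_k(x) = sum_i x_i**k, k = 1..n: exactly
    n = D invariants that separate S_n-orbits on R^n."""
    return np.array([np.sum(x ** k) for k in range(1, len(x) + 1)])

x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 1.0, 2.0])   # same S_3 orbit as x
z = np.array([1.0, 2.0, 4.0])   # a different orbit
assert np.allclose(power_sums(x), power_sums(y))
assert not np.allclose(power_sums(x), power_sums(z))
```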


Machine Learning in Physics and Geometry

March 22, 2023

85% Match
Yang-Hui He, Elli Heyes, Edward Hirst
Algebraic Geometry
Mathematical Physics

We survey some recent applications of machine learning to problems in geometry and theoretical physics. Pure mathematical data has been compiled over the last few decades by the community, and experiments in supervised, semi-supervised and unsupervised machine learning have found surprising success. We thus advocate the programme of machine learning mathematical structures and formulating conjectures via pattern recognition, in other words, using artificial intelligence to hel...


A General Theory of Equivariant CNNs on Homogeneous Spaces

November 5, 2018

85% Match
Taco Cohen, Mario Geiger, Maurice Weiler
Machine Learning
Artificial Intelligence
Computational Geometry
Computer Vision and Pattern Recognition

We present a general theory of Group equivariant Convolutional Neural Networks (G-CNNs) on homogeneous spaces such as Euclidean space and the sphere. Feature maps in these networks represent fields on a homogeneous base space, and layers are equivariant maps between spaces of fields. The theory enables a systematic classification of all existing G-CNNs in terms of their symmetry group, base space, and field type. We also consider a fundamental question: what is the most gener...
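The simplest instance of this theory is the lifting layer of a $C_4$-equivariant CNN on $\mathbb{Z}^2$: correlate the input with all four rotated copies of a filter, producing a feature map on the group. A minimal numpy sketch with an equivariance check (illustrative only; the paper's framework is far more general):

```python
import numpy as np

def correlate2d(image, kernel):
    """Plain 'valid' cross-correlation."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def lifting_conv(image, kernel):
    """Correlate with the four rotated filters: the output has shape
    (|C4|, H', W'), i.e. it is a feature map on the group C4 x Z^2."""
    return np.stack([correlate2d(image, np.rot90(kernel, r))
                     for r in range(4)])

rng = np.random.default_rng(0)
img, ker = rng.normal(size=(8, 8)), rng.normal(size=(3, 3))
f = lifting_conv(img, ker)
# Equivariance: rotating the input rotates each plane and cyclically
# shifts the group axis.
f_rot = lifting_conv(np.rot90(img), ker)
expected = np.roll(np.stack([np.rot90(p) for p in f]), 1, axis=0)
assert np.allclose(f_rot, expected)
```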


Equivariant Representation Learning via Class-Pose Decomposition

July 7, 2022

85% Match
Giovanni Luca Marchetti, Gustaf Tegnér, ... , Danica Kragic
Machine Learning
Group Theory

We introduce a general method for learning representations that are equivariant to symmetries of data. Our central idea is to decompose the latent space into an invariant factor and the symmetry group itself. The components semantically correspond to intrinsic data classes and poses, respectively. The learner is trained on a loss encouraging equivariance based on supervision from relative symmetry information. The approach is motivated by theoretical results from group theory ...
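A sketch of the training signal this suggests: given pairs $(x, g \cdot x)$ with the relative group element $g$ known, penalise any change in the class code and any mismatch between the relative pose and $g$. The toy encoder below is a hand-crafted stand-in, an assumption for illustration rather than the paper's architecture; only the loss structure is the point.

```python
import numpy as np

def rotate(x, t):
    c, s = np.cos(t), np.sin(t)
    return np.array([c * x[0] - s * x[1], s * x[0] + c * x[1]])

def toy_encode(x):
    # latent = (class code, pose angle); hand-crafted stand-in encoder
    return np.array([np.linalg.norm(x)]), np.arctan2(x[1], x[0])

def equivariance_loss(encode, x, gx, g_angle):
    """Supervision from relative symmetry information: the class code
    must not move, and the relative pose must match the known g."""
    c1, p1 = encode(x)
    c2, p2 = encode(gx)
    class_term = np.sum((c1 - c2) ** 2)
    rel = np.angle(np.exp(1j * (p2 - p1 - g_angle)))  # wrap to (-pi, pi]
    return class_term + rel ** 2

x = np.array([0.6, 0.8])
print(equivariance_loss(toy_encode, x, rotate(x, 0.7), 0.7))  # ~ 0.0
```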


Equivariant and Invariant Reynolds Networks

October 15, 2021

85% Match
Akiyoshi Sannai, Makoto Kawano, Wataru Kumagai
Machine Learning

Invariant and equivariant networks are useful in learning data with symmetry, including images, sets, point clouds, and graphs. In this paper, we consider invariant and equivariant networks for symmetries of finite groups. Invariant and equivariant networks have been constructed by various researchers using Reynolds operators. However, Reynolds operators are computationally expensive when the order of the group is large because they use the sum over the whole group, which pos...
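For reference, the Reynolds operator in its naive form averages a function over the whole group, $(Rf)(x) = \frac{1}{|G|}\sum_{g \in G} f(g \cdot x)$; the $|G|$-term sum is exactly what becomes prohibitive for large groups. A minimal sketch for $G = S_3$:

```python
import numpy as np
from itertools import permutations

def reynolds(f, x):
    """(Rf)(x) = (1/|G|) * sum_{g in G} f(g.x), with G = S_n permuting
    the coordinates of x -- an invariant function, at |G| = n! cost."""
    return np.mean([f(np.array(p)) for p in permutations(x)])

f = lambda v: v[0] * v[1] ** 2          # not permutation invariant
x = np.array([1.0, 2.0, 3.0])
y = np.array([3.0, 2.0, 1.0])           # same S_3 orbit
assert np.isclose(reynolds(f, x), reynolds(f, y))
```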


Deep learning complete intersection Calabi-Yau manifolds

November 20, 2023

85% Match
Harold Erbin, Riccardo Finotello
Machine Learning
Algebraic Geometry

We review advancements in deep learning techniques for complete intersection Calabi-Yau (CICY) 3- and 4-folds, with the aim of better understanding how to handle algebraic topological data with machine learning. We first discuss methodological aspects and data analysis, before describing neural network architectures. Then, we describe the state-of-the-art accuracy in predicting Hodge numbers. We include new results on extrapolating predictions from low to high Hodge numbers,...
