ID: 2305.11141

Clifford Group Equivariant Neural Networks

May 18, 2023


Similar papers (page 2)

A General Framework for Equivariant Neural Networks on Reductive Lie Groups

May 31, 2023

89% Match
Ilyes Batatia, Mario Geiger, Jose Munoz, Tess Smidt, ... , Christoph Ortner
Machine Learning

Reductive Lie Groups, such as the orthogonal groups, the Lorentz group, or the unitary groups, play essential roles across scientific fields as diverse as high energy physics, quantum mechanics, quantum chromodynamics, molecular dynamics, computer vision, and imaging. In this paper, we present a general Equivariant Neural Network architecture capable of respecting the symmetries of the finite-dimensional representations of any reductive Lie Group G. Our approach generalizes t...
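
The groups this abstract names are symmetries that preserve a bilinear form; for instance, the Lorentz group preserves the Minkowski inner product. As a minimal numpy sketch (not from the paper itself), one can check this invariance directly for a boost along the x-axis:

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -), preserved by the Lorentz group.
eta = np.diag([1.0, -1.0, -1.0, -1.0])

def boost_x(beta):
    """Lorentz boost along the x-axis with velocity beta (|beta| < 1)."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

p = np.array([2.0, 0.3, 0.4, 0.5])  # an example four-vector
L = boost_x(0.6)

# The Minkowski norm p^T eta p is unchanged by the boost.
assert np.isclose(p @ eta @ p, (L @ p) @ eta @ (L @ p))
```

A Lorentz-equivariant network is one whose layers commute with such transformations, so that invariants like this norm are respected by construction.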


Categorification of Group Equivariant Neural Networks

April 27, 2023

89% Match
Edward Pearce-Crump
Machine Learning
Combinatorics
Representation Theory

We present a novel application of category theory for deep learning. We show how category theory can be used to understand and work with the linear layer functions of group equivariant neural networks whose layers are some tensor power space of $\mathbb{R}^{n}$ for the groups $S_n$, $O(n)$, $Sp(n)$, and $SO(n)$. By using category theoretic constructions, we build a richer structure that is not seen in the original formulation of these neural networks, leading to new insights....


Clifford Group Equivariant Simplicial Message Passing Networks

February 15, 2024

89% Match
Cong Liu, David Ruhe, ... , Patrick Forré
Artificial Intelligence

We introduce Clifford Group Equivariant Simplicial Message Passing Networks, a method for steerable E(n)-equivariant message passing on simplicial complexes. Our method integrates the expressivity of Clifford group-equivariant layers with simplicial message passing, which is topologically more intricate than regular graph message passing. Clifford algebras include higher-order objects such as bivectors and trivectors, which express geometric features (e.g., areas, volumes) de...
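
The bivectors mentioned here are the grade-2 elements of a Clifford algebra; the wedge product of two vectors yields a bivector whose magnitude is the area they span. A minimal numpy illustration of this idea (a hypothetical sketch, not the paper's implementation):

```python
import numpy as np

def wedge(a, b):
    """Bivector components of a ∧ b for 3D vectors a, b.

    Components are ordered (e1∧e2, e1∧e3, e2∧e3); the bivector's
    magnitude equals the area of the parallelogram spanned by a and b.
    """
    return np.array([
        a[0] * b[1] - a[1] * b[0],  # e1∧e2 component
        a[0] * b[2] - a[2] * b[0],  # e1∧e3 component
        a[1] * b[2] - a[2] * b[1],  # e2∧e3 component
    ])

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 2.0, 0.0])
B = wedge(a, b)
area = np.linalg.norm(B)  # area of the 1x2 parallelogram
```

Layers operating on such multivector features can carry areas and volumes through the network alongside plain vectors.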


Beyond permutation equivariance in graph networks

March 25, 2021

88% Match
Emma Slade, Francesco Farina
Machine Learning

In this draft paper, we introduce a novel architecture for graph networks which is equivariant to the Euclidean group in $n$-dimensions. The model is designed to work with graph networks in their general form and can be shown to include particular variants as special cases. Thanks to its equivariance properties, we expect the proposed model to be more data efficient with respect to classical graph architectures and also intrinsically equipped with a better inductive bias. We ...


Geometric Algebra Transformer

May 28, 2023

88% Match
Johann Brehmer, Pim de Haan, ... , Taco Cohen
Machine Learning
Robotics

Problems involving geometric data arise in physics, chemistry, robotics, computer vision, and many other fields. Such data can take numerous forms, for instance points, direction vectors, translations, or rotations, but to date there is no single architecture that can be applied to such a wide variety of geometric types while respecting their symmetries. In this paper we introduce the Geometric Algebra Transformer (GATr), a general-purpose architecture for geometric data. GAT...


In Search of Projectively Equivariant Networks

September 29, 2022

88% Match
Georg Bökman, Axel Flinth, Fredrik Kahl
Computer Vision and Pattern ...

Equivariance of linear neural network layers is well studied. In this work, we relax the equivariance condition to only be true in a projective sense. We propose a way to construct a projectively equivariant neural network through building a standard equivariant network where the linear group representations acting on each intermediate feature space are "multiplicatively modified lifts" of projective group representations. By theoretically studying the relation of projectivel...


Any-dimensional equivariant neural networks

June 10, 2023

88% Match
Eitan Levin, Mateo Díaz
Machine Learning
Representation Theory

Traditional supervised learning aims to learn an unknown mapping by fitting a function to a set of input-output pairs with a fixed dimension. The fitted function is then defined on inputs of the same dimension. However, in many settings, the unknown mapping takes inputs in any dimension; examples include graph parameters defined on graphs of any size and physics quantities defined on an arbitrary number of particles. We leverage a newly-discovered phenomenon in algebraic topo...


Symmetry Group Equivariant Architectures for Physics

March 11, 2022

88% Match
Alexander Bogatskiy, Sanmay Ganguly, Thomas Kipf, Risi Kondor, David W. Miller, Daniel Murnane, Jan T. Offermann, Mariel Pettee, Phiala Shanahan, ... , Savannah Thais
Machine Learning
Instrumentation and Methods ...
Artificial Intelligence

Physical theories grounded in mathematical symmetries are an essential component of our understanding of a wide range of properties of the universe. Similarly, in the domain of machine learning, an awareness of symmetries such as rotation or permutation invariance has driven impressive performance breakthroughs in computer vision, natural language processing, and other important applications. In this report, we argue that both the physics community and the broader machine lea...


Euclidean, Projective, Conformal: Choosing a Geometric Algebra for Equivariant Transformers

November 8, 2023

88% Match
Pim de Haan, Taco Cohen, Johann Brehmer
Machine Learning
Artificial Intelligence

The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra. We generalize this architecture into a blueprint that allows one to construct a scalable transformer architecture given any geometric (or Clifford) algebra. We study versions of this architecture for Euclidean, projective, and conformal algebras, all of which are suited to represent 3D data, and evaluate them in theory and practice. The simpl...


What is an equivariant neural network?

May 15, 2022

88% Match
Lek-Heng Lim, Bradley J. Nelson
Machine Learning
Representation Theory

We explain equivariant neural networks, a notion underlying breakthroughs in machine learning from deep convolutional neural networks for computer vision to AlphaFold 2 for protein structure prediction, without assuming knowledge of equivariance or neural networks. The basic mathematical ideas are simple but are often obscured by engineering complications that come with practical realizations. We extract and focus on the mathematical aspects, and limit ourselves to a cursory ...
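
The defining condition the abstract refers to is f(ρ(g)x) = ρ(g)f(x): applying a group transformation before or after the layer gives the same result. A minimal numerical check of this property for a simple rotation-equivariant map (an illustrative sketch, not taken from the paper):

```python
import numpy as np

def layer(X):
    """A rotation-equivariant map on a point cloud X (n x 3):
    scale each point by a nonlinearity of its rotation-invariant norm."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return np.tanh(norms) * X

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Random orthogonal matrix via QR decomposition of a Gaussian matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))

# Equivariance: rotating before or after the layer agrees.
lhs = layer(X @ Q.T)
rhs = layer(X) @ Q.T
assert np.allclose(lhs, rhs)
```

Because the scaling factor depends only on the norm, which rotations preserve, the layer commutes with every rotation; equivariant architectures build all their layers from such commuting pieces.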
