ID: 2305.11141

Clifford Group Equivariant Neural Networks

May 18, 2023


Similar papers (page 3)

Lie Group Decompositions for Equivariant Neural Networks

October 17, 2023

88% Match
Mircea Mironenco, Patrick Forré
Machine Learning

Invariance and equivariance to geometrical transformations have proven to be very useful inductive biases when training (convolutional) neural network models, especially in the low-data regime. Much work has focused on the case where the symmetry group employed is compact or abelian, or both. Recent work has explored enlarging the class of transformations used to the case of Lie groups, principally through the use of their Lie algebra, as well as the group exponential and log...
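The algebra-to-group exponential map this abstract refers to can be illustrated in a few lines of NumPy. This is a self-contained sketch with our own helper names, not code from the paper:

```python
import numpy as np

# so(2) has a single generator G; the group exponential maps
# theta * G to the rotation matrix R(theta) in SO(2).
G = np.array([[0.0, -1.0],
              [1.0,  0.0]])

def expm_series(A, terms=40):
    """Matrix exponential via a truncated power series (illustration only)."""
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out

theta = 0.7
R_exp = expm_series(theta * G)
R_closed = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
print(np.allclose(R_exp, R_closed))  # the exponential lands on SO(2)
```

Parameterizing transformations through the algebra in this way is what lets such models handle continuous, non-compact groups without discretizing the group itself.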


Learning Deep O($n$)-Equivariant Hyperspheres

May 25, 2023

88% Match
Pavlo Melnyk, Michael Felsberg, Mårten Wadenbäck, ... , Cuong Le
Machine Learning

This paper presents an approach to learning (deep) $n$D features equivariant under orthogonal transformations, utilizing hyperspheres and regular $n$-simplexes. Our main contributions are theoretical and tackle major challenges in geometric deep learning such as equivariance and invariance under geometric transformations. Namely, we enrich the recently developed theory of steerable 3D spherical neurons -- SO(3)-equivariant filter banks based on neurons with spherical decision...


Clifford algebra and the projective model of homogeneous metric spaces: Foundations

July 8, 2013

87% Match
Andrey Sokolov
Metric Geometry

This paper is to serve as a key to the projective (homogeneous) model developed by Charles Gunn (arXiv:1101.4542 [math.MG]). The goal is to explain the underlying concepts in a simple language and give plenty of examples. It is targeted to physicists and engineers and the emphasis is on explanation rather than rigorous proof. The projective model is based on projective geometry and Clifford algebra. It supplements and enhances vector and matrix algebras. It also subsumes comp...


Automatic Symmetry Discovery with Lie Algebra Convolutional Network

September 15, 2021

87% Match
Nima Dehmamy, Robin Walters, Yanchen Liu, ... , Rose Yu
Machine Learning
Artificial Intelligence
Group Theory

Existing equivariant neural networks require prior knowledge of the symmetry group and discretization for continuous groups. We propose to work with Lie algebras (infinitesimal generators) instead of Lie groups. Our model, the Lie algebra convolutional network (L-conv) can automatically discover symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group equivariant feedforward architecture. Both CNNs a...
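The CNN special case the abstract alludes to — translation equivariance of convolution — can be checked numerically. A minimal NumPy sketch with our own helper names, unrelated to the L-conv code itself:

```python
import numpy as np

def circ_conv(x, k):
    """Circular 1D convolution computed via the FFT."""
    n = len(x)
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k, n)))

rng = np.random.default_rng(0)
x = rng.normal(size=8)   # signal
k = rng.normal(size=3)   # filter
s = 3                    # cyclic shift amount

# Equivariance: convolving a shifted signal equals shifting the convolved signal.
lhs = circ_conv(np.roll(x, s), k)
rhs = np.roll(circ_conv(x, k), s)
print(np.allclose(lhs, rhs))  # True
```

For a cyclic shift this identity holds exactly (up to floating-point tolerance) by the Fourier shift theorem; equivariant architectures generalize this commutation property from translations to other group actions.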


Transformation Coding: Simple Objectives for Equivariant Representations

February 19, 2022

87% Match
Mehran Shakerinava, Arnab Kumar Mondal, Siamak Ravanbakhsh
Machine Learning
Artificial Intelligence

We present a simple non-generative approach to deep representation learning that seeks equivariant deep embedding through simple objectives. In contrast to existing equivariant networks, our transformation coding approach does not constrain the choice of the feed-forward layer or the architecture and allows for an unknown group action on the input space. We introduce several such transformation coding objectives for different Lie groups such as the Euclidean, Orthogonal and t...


A Characterization Theorem for Equivariant Networks with Point-wise Activations

January 17, 2024

87% Match
Marco Pacini, Xiaowen Dong, ... , Gabriele Santin
Machine Learning
Artificial Intelligence

Equivariant neural networks have shown improved performance, expressiveness and sample complexity on symmetrical domains. But for some specific symmetries, representations, and choice of coordinates, the most common point-wise activations, such as ReLU, are not equivariant, hence they cannot be employed in the design of equivariant neural networks. The theorem we present in this paper describes all possible combinations of finite-dimensional representations, choice of coordin...
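The failure mode this abstract describes is easy to reproduce: under a rotation representation acting on feature coordinates, a point-wise ReLU does not commute with the group action, while a permutation representation does. A NumPy sketch, not the paper's construction:

```python
import numpy as np

def relu(v):
    return np.maximum(v, 0.0)

theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta)],   # rotation acting on 2D features
              [np.sin(theta),  np.cos(theta)]])
x = np.array([1.0, -1.0])

# Equivariance would require relu(R @ x) == R @ relu(x); rotations break it.
print(np.allclose(relu(R @ x), R @ relu(x)))  # False

# By contrast, permutation matrices commute with any point-wise activation.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(np.allclose(relu(P @ x), P @ relu(x)))  # True
```

This is why characterizing exactly which representations admit point-wise activations matters for equivariant architecture design.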


Geometric Deep Learning and Equivariant Neural Networks

May 28, 2021

87% Match
Jan E. Gerken, Jimmy Aronsson, Oscar Carlsson, Hampus Linander, Fredrik Ohlsson, ... , Daniel Persson
Machine Learning
Computer Vision and Pattern Recognition

We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks. We develop gauge equivariant convolutional neural networks on arbitrary manifolds $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles. We also discuss group equivariant neural networks for homogeneous spaces $\mathcal{M}=G/K$, which are instead equivariant with resp...


Geometric Algebra Attention Networks for Small Point Clouds

October 5, 2021

87% Match
Matthew Spellings
Machine Learning
Computer Vision and Pattern Recognition

Much of the success of deep learning is drawn from building architectures that properly respect underlying symmetry and structure in the data on which they operate - a set of considerations that have been united under the banner of geometric deep learning. Often problems in the physical sciences deal with relatively small sets of points in two- or three-dimensional space wherein translation, rotation, and permutation equivariance are important or even vital for models to be u...


e3nn: Euclidean Neural Networks

July 18, 2022

87% Match
Mario Geiger, Tess Smidt
Machine Learning
Artificial Intelligence
Neural and Evolutionary Computing

We present e3nn, a generalized framework for creating E(3) equivariant trainable functions, also known as Euclidean neural networks. e3nn naturally operates on geometry and geometric tensors that describe systems in 3D and transform predictably under a change of coordinate system. The core of e3nn are equivariant operations such as the TensorProduct class or the spherical harmonics functions that can be composed to create more complex modules such as convolutions and attentio...


Enabling Efficient Equivariant Operations in the Fourier Basis via Gaunt Tensor Products

January 18, 2024

87% Match
Shengjie Luo, Tianlang Chen, Aditi S. Krishnapriyan
Machine Learning
Materials Science
Group Theory
Chemical Physics
Biomolecules

Developing equivariant neural networks for the E(3) group plays an important role in modeling 3D data across real-world applications. Enforcing this equivariance primarily involves the tensor products of irreducible representations (irreps). However, the computational complexity of such operations increases significantly as higher-order tensors are used. In this work, we propose a systematic approach to substantially accelerate the computation of the tensor products of irreps...
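For l=1 inputs in 3D, two of the tensor-product paths in question reduce to familiar operations: the dot product (the invariant l=0 output) and the cross product (the equivariant l=1 output). A NumPy sanity check of those properties — our own sketch, not the paper's accelerated algorithm:

```python
import numpy as np

def rotation_z(theta):
    """Rotation about the z-axis, an element of SO(3)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

rng = np.random.default_rng(1)
a, b = rng.normal(size=3), rng.normal(size=3)
R = rotation_z(0.9)

# l=1 (x) l=1 -> l=0 path: the dot product is rotation-invariant.
print(np.isclose(np.dot(R @ a, R @ b), np.dot(a, b)))           # True

# l=1 (x) l=1 -> l=1 path: the cross product is rotation-equivariant.
print(np.allclose(np.cross(R @ a, R @ b), R @ np.cross(a, b)))  # True
```

Higher-order irrep products generalize these two paths via Clebsch-Gordan coefficients, which is where the computational cost the abstract targets comes from.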
