ID: 2312.08550

Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks

December 13, 2023


Similar papers

Machine learning and invariant theory

September 29, 2022

88% Match
Ben Blum-Smith, Soledad Villar
Machine Learning

Inspired by constraints from physical law, equivariant machine learning restricts the learning to a hypothesis class where all the functions are equivariant with respect to some group action. Irreducible representations or invariant theory are typically used to parameterize the space of such functions. In this article, we introduce the topic and explain a couple of methods to explicitly parameterize equivariant functions that are being used in machine learning applications. I...
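
To make the invariant-theory approach concrete, here is a minimal sketch for the symmetric group S_n acting on R^n by permuting coordinates: power sums are a classical generating set of invariants, so any readout built on top of them is invariant by construction. The degree cutoff and the toy readout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def power_sum_features(x, max_degree=3):
    """Invariant features p_k(x) = sum_i x_i**k for k = 1..max_degree."""
    return np.array([np.sum(x ** k) for k in range(1, max_degree + 1)])

def invariant_model(x, w):
    """Any readout on invariant features is itself S_n-invariant."""
    return np.tanh(power_sum_features(x)) @ w

rng = np.random.default_rng(0)
x, w = rng.normal(size=5), rng.normal(size=3)
perm = rng.permutation(5)
# Permuting the input coordinates leaves the output unchanged.
assert np.allclose(invariant_model(x, w), invariant_model(x[perm], w))
```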


Probabilistic symmetries and invariant neural networks

January 18, 2019

88% Match
Benjamin Bloem-Reddy, Yee Whye Teh
Machine Learning

Treating neural network inputs and outputs as random variables, we characterize the structure of neural networks that can be used to model data that are invariant or equivariant under the action of a compact group. Much recent research has been devoted to encoding invariance under symmetry transformations into neural network architectures, in an effort to improve the performance of deep neural networks in data-scarce, non-i.i.d., or unsupervised settings. By considering group...
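
One standard construction in this line of work (a hedged sketch, not necessarily the paper's own characterization) is symmetrization: averaging an arbitrary base network over a finite group makes the result exactly invariant.

```python
import itertools
import numpy as np

def base_net(x, W):
    """An arbitrary, non-invariant base function."""
    return float(np.tanh(W @ x).sum())

def symmetrized_net(x, W):
    """Average base_net over all coordinate permutations (the group S_n)."""
    idx = range(len(x))
    return np.mean([base_net(x[list(p)], W) for p in itertools.permutations(idx)])

rng = np.random.default_rng(1)
x, W = rng.normal(size=4), rng.normal(size=(3, 4))
perm = rng.permutation(4)
# The symmetrized function is exactly S_n-invariant.
assert np.allclose(symmetrized_net(x, W), symmetrized_net(x[perm], W))
```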


Symmetry Group Equivariant Architectures for Physics

March 11, 2022

88% Match
Alexander Bogatskiy, Sanmay Ganguly, Thomas Kipf, Risi Kondor, David W. Miller, Daniel Murnane, Jan T. Offermann, Mariel Pettee, Phiala Shanahan, ..., Savannah Thais
Machine Learning
Instrumentation and Methods for Astrophysics
Artificial Intelligence

Physical theories grounded in mathematical symmetries are an essential component of our understanding of a wide range of properties of the universe. Similarly, in the domain of machine learning, an awareness of symmetries such as rotation or permutation invariance has driven impressive performance breakthroughs in computer vision, natural language processing, and other important applications. In this report, we argue that both the physics community and the broader machine lea...


Harmonic Networks: Integrating Spectral Information into CNNs

December 7, 2018

88% Match
Matej Ulicny, Vladimir A. Krylov, Rozenn Dahyot
Computer Vision and Pattern Recognition
Machine Learning

Convolutional neural networks (CNNs) learn filters in order to capture local correlation patterns in feature space. In contrast, in this paper we propose harmonic blocks that produce features by learning optimal combinations of spectral filters defined by the Discrete Cosine Transform. The harmonic blocks are used to replace conventional convolutional layers to construct partial or fully harmonic CNNs. We extensively validate our approach and show that the introduction of har...
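
A minimal sketch of such a harmonic block, following the abstract's description (a fixed bank of 2-D DCT filters whose responses are combined by learned weights); the kernel size, channel counts, and class name here are illustrative assumptions, not the paper's exact configuration.

```python
import math
import torch
import torch.nn as nn

def dct2_filters(k: int) -> torch.Tensor:
    """Return the k*k separable 2-D DCT-II basis as (k*k, 1, k, k) filters."""
    n = torch.arange(k, dtype=torch.float32)
    basis = torch.cos(math.pi * (n[None, :] + 0.5) * n[:, None] / k)  # (k, k)
    filters = torch.einsum('ux,vy->uvxy', basis, basis)               # (k, k, k, k)
    return filters.reshape(k * k, 1, k, k)

class HarmonicBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, k: int = 3):
        super().__init__()
        # Fixed (non-learned) DCT filters, applied per input channel.
        self.register_buffer('dct', dct2_filters(k))
        self.in_ch, self.k = in_ch, k
        # Learned combination of the k*k spectral responses per channel.
        self.combine = nn.Conv2d(in_ch * k * k, out_ch, kernel_size=1)

    def forward(self, x):
        w = self.dct.repeat(self.in_ch, 1, 1, 1)  # depthwise DCT filtering
        spec = nn.functional.conv2d(x, w, padding=self.k // 2, groups=self.in_ch)
        return self.combine(spec)

block = HarmonicBlock(in_ch=3, out_ch=8)
print(block(torch.randn(1, 3, 16, 16)).shape)  # torch.Size([1, 8, 16, 16])
```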


Equivariant Architectures for Learning in Deep Weight Spaces

January 30, 2023

87% Match
Aviv Navon, Aviv Shamsian, Idan Achituve, Ethan Fetaya, ..., Haggai Maron
Machine Learning

Designing machine learning architectures for processing neural networks in their raw weight matrix form is a newly introduced research direction. Unfortunately, the unique symmetry structure of deep weight spaces makes this design very challenging. If successful, such architectures would be capable of performing a wide range of intriguing tasks, from adapting a pre-trained network to a new domain to editing objects represented as functions (INRs or NeRFs). As a first step tow...
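
The "unique symmetry structure" in question includes hidden-neuron permutations: relabeling an MLP's hidden units changes the weight matrices but not the function. The sketch below demonstrates the invariance any weight-space architecture must respect; the layer sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
W1, b1 = rng.normal(size=(6, 4)), rng.normal(size=6)
W2, b2 = rng.normal(size=(2, 6)), rng.normal(size=2)

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer ReLU MLP."""
    return W2 @ np.maximum(W1 @ x + b1, 0) + b2

x = rng.normal(size=4)
perm = rng.permutation(6)
# Applying the same hidden-unit permutation to both layers yields
# different weights but the identical function.
assert np.allclose(mlp(x, W1, b1, W2, b2),
                   mlp(x, W1[perm], b1[perm], W2[:, perm], b2))
```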


A Fine-Grained Spectral Perspective on Neural Networks

July 24, 2019

87% Match
Greg Yang, Hadi Salman
Machine Learning
Neural and Evolutionary Computing

Are neural networks biased toward simple functions? Does depth always help learn more complex features? Is training the last layer of a network as good as training all layers? How to set the range for learning rate tuning? These questions seem unrelated at face value, but in this work we give all of them a common treatment from the spectral perspective. We will study the spectra of the Conjugate Kernel (CK, also called the Neural Network-Gaussian Process Kernel), and the...
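
For a ReLU MLP, the Conjugate Kernel can be written in closed form via the standard arc-cosine recursion; the sketch below assumes unit weight variance and no biases, which is an illustrative choice rather than the paper's exact setup.

```python
import numpy as np

def relu_ck(x1, x2, depth=3):
    """CK value K^L(x1, x2) for a ReLU network with `depth` hidden layers."""
    k11, k22, k12 = x1 @ x1, x2 @ x2, x1 @ x2
    for _ in range(depth):
        theta = np.arccos(np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0))
        # E[relu(u) relu(v)] for (u, v) Gaussian with the current covariance:
        k12 = np.sqrt(k11 * k22) * (np.sin(theta)
                                    + (np.pi - theta) * np.cos(theta)) / (2 * np.pi)
        k11, k22 = k11 / 2.0, k22 / 2.0  # diagonal: E[relu(u)^2] = K(x, x) / 2
    return k12

x1, x2 = np.array([1.0, 0.0]), np.array([0.6, 0.8])
print(relu_ck(x1, x2))
```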


A Unified Framework to Enforce, Discover, and Promote Symmetry in Machine Learning

November 1, 2023

87% Match
Samuel E. Otto, Nicholas Zolman, ..., Steven L. Brunton
Machine Learning
Numerical Analysis
Differential Geometry

Symmetry is present throughout nature and continues to play an increasingly central role in physics and machine learning. Fundamental symmetries, such as Poincaré invariance, allow physical laws discovered in laboratories on Earth to be extrapolated to the farthest reaches of the universe. Symmetry is essential to achieving this extrapolatory power in machine learning applications. For example, translation invariance in image classification allows models with fewer parame...


Breaking the Curse of Dimensionality in Deep Neural Networks by Learning Invariant Representations

October 24, 2023

87% Match
Leonardo Petrini
Machine Learning

Artificial intelligence, particularly the subfield of machine learning, has seen a paradigm shift towards data-driven models that learn from and adapt to data. This has resulted in unprecedented advancements in various domains such as natural language processing and computer vision, largely attributed to deep learning, a special class of machine learning models. Deep learning arguably surpasses traditional approaches by learning the relevant features from raw data through a s...


Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains

June 18, 2020

87% Match
Matthew Tancik, Pratul P. Srinivasan, Ben Mildenhall, Sara Fridovich-Keil, Nithin Raghavan, Utkarsh Singhal, Ravi Ramamoorthi, ..., Ren Ng
Computer Vision and Pattern Recognition
Machine Learning

We show that passing input points through a simple Fourier feature mapping enables a multilayer perceptron (MLP) to learn high-frequency functions in low-dimensional problem domains. These results shed light on recent advances in computer vision and graphics that achieve state-of-the-art results by using MLPs to represent complex 3D objects and scenes. Using tools from the neural tangent kernel (NTK) literature, we show that a standard MLP fails to learn high frequencies both...
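
The mapping itself is simple. Below is a minimal sketch of the random Fourier feature map from the paper, gamma(v) = [cos(2πBv), sin(2πBv)] with B drawn i.i.d. Gaussian; the bandwidth sigma and feature count here are illustrative choices.

```python
import numpy as np

def fourier_features(v, B):
    """Map low-dimensional coords v (..., d) to (..., 2m) Fourier features."""
    proj = 2.0 * np.pi * v @ B.T            # (..., m)
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(3)
sigma, m, d = 10.0, 256, 2                  # 2-D coordinates, e.g. pixel positions
B = sigma * rng.normal(size=(m, d))         # sigma sets the frequency bandwidth
coords = rng.uniform(size=(5, d))           # points in [0, 1]^2
z = fourier_features(coords, B)             # feed z, not coords, to the MLP
print(z.shape)                              # (5, 512)
```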


Unsupervised Learning of Group Invariant and Equivariant Representations

February 15, 2022

87% Match
Robin Winter, Marco Bertolini, Tuan Le, ..., Djork-Arné Clevert
Machine Learning

Equivariant neural networks, whose hidden features transform according to representations of a group G acting on the data, exhibit training efficiency and an improved generalisation performance. In this work, we extend group invariant and equivariant representation learning to the field of unsupervised deep learning. We propose a general learning strategy based on an encoder-decoder framework in which the latent representation is separated in an invariant term and an equivari...
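
A hedged sketch of the general idea, not the paper's exact model: for a finite group, one can obtain an invariant latent term by orbit-averaging the encoder, alongside a separate "pose" code that transforms equivariantly with the input.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 8
# The group: cyclic shifts C_n acting on R^n.
group = [np.roll(np.eye(n), s, axis=0) for s in range(n)]

def encoder(x, W):
    """An arbitrary base encoder."""
    return np.tanh(W @ x)

def encode(x, W):
    # Invariant part: average the encoder over the group orbit of x.
    z_inv = np.mean([encoder(g @ x, W) for g in group], axis=0)
    # Equivariant part: a crude "pose", the shift locating the maximum entry.
    g_idx = int(np.argmax(x))
    return z_inv, g_idx

W, x = rng.normal(size=(3, n)), rng.normal(size=n)
z1, g1 = encode(x, W)
z2, g2 = encode(group[3] @ x, W)
assert np.allclose(z1, z2)   # the invariant term ignores the group action
assert g2 == (g1 + 3) % n    # the pose code transforms with the input
```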
