ID: 2502.01060

Learning Nonlinearity of Boolean Functions: An Experimentation with Neural Networks

February 3, 2025


Similar papers 2

Rethinking Arithmetic for Deep Neural Networks

May 7, 2019

86% Match
George A. Constantinides
Machine Learning
Hardware Architecture
Neural and Evolutionary Computing

We consider efficiency in the implementation of deep neural networks. Hardware accelerators are gaining interest as machine learning becomes one of the drivers of high-performance computing. In these accelerators, the directed graph describing a neural network can be implemented as a directed graph describing a Boolean circuit. We make this observation precise, leading naturally to an understanding of practical neural networks as discrete functions, and show that so-called bi...
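
As a concrete illustration of the "neural networks as discrete functions" view, the sketch below (not taken from the paper; weights and threshold are arbitrary illustrative values) enumerates the truth table of a single binarized neuron, which is exactly a Boolean threshold gate.

```python
from itertools import product

def binarized_neuron(x, w=(1, -1, 1), threshold=0):
    """Threshold gate on {0, 1} inputs: fires iff the weighted sum exceeds the threshold."""
    return int(sum(wi * xi for wi, xi in zip(w, x)) > threshold)

# Enumerating all inputs makes the "network as Boolean circuit" view concrete:
# the neuron is just the Boolean function given by this truth table.
for x in product((0, 1), repeat=3):
    print(x, "->", binarized_neuron(x))
```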


Algorithms for Verifying Deep Neural Networks

March 15, 2019

86% Match
Changliu Liu, Tomer Arnon, Christopher Lazarus, Christopher Strong, ... , Mykel J. Kochenderfer
Machine Learning

Deep neural networks are widely used for nonlinear function approximation with applications ranging from computer vision to control. Although these networks involve the composition of simple arithmetic operations, it can be very challenging to verify whether a particular network satisfies certain input-output properties. This article surveys methods that have emerged recently for soundly verifying such properties. These methods borrow insights from reachability analysis, opti...


Expressivity of Deep Neural Networks

July 9, 2020

86% Match
Ingo Gühring, Mones Raslan, Gitta Kutyniok
Machine Learning
Functional Analysis

In this review paper, we give a comprehensive overview of the large variety of approximation results for neural networks. Approximation rates for classical function spaces as well as benefits of deep neural networks over shallow ones for specifically structured function classes are discussed. While the main body of existing results is for general feedforward architectures, we also depict approximation results for convolutional, residual and recurrent neural networks.

Space of Functions Computed by Deep-Layered Machines

Alexander Mozeika, Bo Li, David Saad
Machine Learning
Disordered Systems and Neural Networks

We study the space of functions computed by random-layered machines, including deep neural networks and Boolean circuits. Investigating the distribution of Boolean functions computed on the recurrent and layer-dependent architectures, we find that it is the same in both models. Depending on the initial conditions and computing elements used, we characterize the space of functions computed at the large depth limit and show that the macroscopic entropy of Boolean functions is e...

NN2Poly: A polynomial representation for deep feed-forward artificial neural networks

December 21, 2021

86% Match
Pablo Morala, Jenny Alexandra Cifuentes, ... , Iñaki Ucar
Machine Learning

Interpretability of neural networks and their underlying theoretical behavior remain an open field of study even after the great success of their practical applications, particularly with the emergence of deep learning. In this work, NN2Poly is proposed: a theoretical approach to obtain an explicit polynomial model that provides an accurate representation of an already trained fully-connected feed-forward artificial neural network (a multilayer perceptron or MLP). This approa...
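
A minimal sketch of the general idea, not the NN2Poly algorithm itself: if each activation in a tiny one-input MLP is replaced by a truncated Taylor polynomial, the whole network expands into an explicit polynomial in the input. The weights below are illustrative, not learned.

```python
from numpy.polynomial import Polynomial

def tanh_poly(z):
    # Truncated Taylor series: tanh(z) ~= z - z^3 / 3 near 0.
    return z - (z ** 3) / 3

# Illustrative (not learned) parameters of a 1-input, 2-hidden-unit MLP.
w, b, v, c = [0.8, -0.5], [0.1, 0.2], [1.0, 0.7], 0.05

x = Polynomial([0.0, 1.0])          # the identity polynomial p(x) = x
net_as_poly = sum(vj * tanh_poly(wj * x + bj) for wj, bj, vj in zip(w, b, v)) + c
print("coefficients, constant term first:", net_as_poly.coef)
```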


Representing Boolean Functions Using Polynomials: More Can Offer Less

July 2, 2013

85% Match
Yi Ming Zou
Computational Complexity
Combinatorics
Number Theory

Polynomial threshold gates are basic processing units of an artificial neural network. When the input vectors are binary vectors, these gates correspond to Boolean functions and can be analyzed via their polynomial representations. In practical applications, it is desirable to find a polynomial representation with the smallest number of terms possible, in order to use the least possible number of input lines to the unit under consideration. For this purpose, instead of an exa...
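
For context, the exact polynomial representation referred to here is the algebraic normal form (ANF) over GF(2); the sketch below (an illustration, not the paper's construction) recovers the ANF from a truth table with the binary Möbius transform and counts its monomials.

```python
def anf_from_truth_table(tt):
    """tt: 0/1 truth table of length 2^n, entry i evaluated at the input whose bits are those of i.
    Returns the ANF coefficients with the same indexing (index i = monomial over the set bits of i)."""
    coeffs = list(tt)
    n = len(tt).bit_length() - 1
    for i in range(n):                      # binary Moebius transform, one bit at a time
        bit = 1 << i
        for j in range(len(coeffs)):
            if j & bit:
                coeffs[j] ^= coeffs[j ^ bit]
    return coeffs

# Example: 3-bit majority; its ANF is x0*x1 + x0*x2 + x1*x2 (three monomials).
maj = [int(bin(i).count("1") >= 2) for i in range(8)]
anf = anf_from_truth_table(maj)
print("number of monomials:", sum(anf))     # 3
```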


Nonlinearity Computation for Sparse Boolean Functions

May 4, 2013

85% Match
Çağdaş Çalık
Information Theory

An algorithm for computing the nonlinearity of a Boolean function from its algebraic normal form (ANF) is proposed. By generalizing the expression of the weight of a Boolean function in terms of its ANF coefficients, a formulation of the distances to linear functions is obtained. The special structure of these distances can be exploited to reduce the task of nonlinearity computation to solving an associated binary integer programming problem. The proposed algorithm can be use...
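
To make the quantity concrete, the brute-force sketch below computes nonlinearity directly from its definition as the minimum Hamming distance to the affine functions; it illustrates the definition only, not the ANF-based algorithm proposed in the paper.

```python
from itertools import product

def nonlinearity(tt):
    """Minimum Hamming distance from f (given by its 0/1 truth table) to any affine function."""
    n = len(tt).bit_length() - 1
    inputs = list(product((0, 1), repeat=n))
    best = len(tt)
    for a in product((0, 1), repeat=n):              # linear part a.x
        for c in (0, 1):                             # affine constant
            affine = [(sum(ai & xi for ai, xi in zip(a, x)) & 1) ^ c for x in inputs]
            best = min(best, sum(f != g for f, g in zip(tt, affine)))
    return best

maj = [int(bin(i).count("1") >= 2) for i in range(8)]
print(nonlinearity(maj))                             # 2 for the 3-bit majority function
```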


Nonlinearity of Boolean functions: an algorithmic approach based on multivariate polynomials

April 10, 2014

85% Match
E. Bellini, I. Simonetti, M. Sala
Information Theory

We compute the nonlinearity of Boolean functions with Groebner basis techniques, providing two algorithms: one over the binary field and the other over the rationals. We also estimate their complexity. Then we show how to improve our rational algorithm, arriving at a worst-case complexity of $O(n2^n)$ operations over the integers, that is, sums and doublings. This way, with a different approach, we reach the same complexity of established algorithms, such as those based on th...
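
The established Walsh-transform route that the abstract compares against can be sketched as follows (an illustration, not the paper's Gröbner-basis algorithms): a fast Walsh-Hadamard transform costs O(n·2^n) additions, and the nonlinearity is 2^(n-1) - max|W_f|/2.

```python
def nonlinearity_walsh(tt):
    """Nonlinearity via the fast Walsh-Hadamard transform: nl(f) = 2^(n-1) - max|W_f| / 2."""
    w = [1 - 2 * bit for bit in tt]                  # sign representation (-1)^f(x)
    step = 1
    while step < len(w):                             # n passes over 2^n entries
        for start in range(0, len(w), 2 * step):
            for j in range(start, start + step):
                u, v = w[j], w[j + step]
                w[j], w[j + step] = u + v, u - v
        step *= 2
    return (len(tt) >> 1) - max(abs(x) for x in w) // 2

# A bent function on 4 variables, x0*x1 xor x2*x3, attains the maximum nonlinearity 6.
bent = [((i & 1) & (i >> 1 & 1)) ^ ((i >> 2 & 1) & (i >> 3 & 1)) for i in range(16)]
print(nonlinearity_walsh(bent))                      # 6
```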


xRAI: Explainable Representations through AI

December 10, 2020

85% Match
Christian Bartelt, Sascha Marton, Heiner Stuckenschmidt
Artificial Intelligence
Machine Learning

We present xRAI, an approach for extracting symbolic representations of the mathematical function a neural network was supposed to learn from the trained network. The approach is based on the idea of training a so-called interpretation network that receives the weights and biases of the trained network as input and outputs the numerical representation of the function the network was supposed to learn that can be directly translated into a symbolic representation. We show that ...
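
A minimal sketch of the input side of this setup (illustrative, not the xRAI implementation): the trained network's weights and biases are serialized into a flat vector that the interpretation network consumes.

```python
import numpy as np

def flatten_parameters(layers):
    """layers: list of (weight_matrix, bias_vector) pairs of an MLP, flattened in a fixed order."""
    return np.concatenate([np.concatenate([W.ravel(), b.ravel()]) for W, b in layers])

# Hypothetical 2-3-1 MLP; this flat vector is what the interpretation network would receive,
# paired with some representation of the function the MLP was trained to learn.
rng = np.random.default_rng(0)
params = [(rng.normal(size=(3, 2)), rng.normal(size=3)),
          (rng.normal(size=(1, 3)), rng.normal(size=1))]
print(flatten_parameters(params).shape)              # (13,)
```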


Boolformer: Symbolic Regression of Logic Functions with Transformers

September 21, 2023

85% Match
Stéphane d'Ascoli, Samy Bengio, ... , Emmanuel Abbé
Machine Learning
Logic in Computer Science

In this work, we introduce Boolformer, the first Transformer architecture trained to perform end-to-end symbolic regression of Boolean functions. First, we show that it can predict compact formulas for complex functions which were not seen during training, when provided a clean truth table. Then, we demonstrate its ability to find approximate expressions when provided incomplete and noisy observations. We evaluate the Boolformer on a broad set of real-world binary classificat...
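
A minimal sketch of the task format rather than the model (the Boolformer architecture itself is not reproduced here): the regressor sees a possibly incomplete truth table, and a predicted formula is judged by its agreement with those rows. The target and candidate formulas below are made up for illustration.

```python
import random
from itertools import product

def target(x):                      # hidden function to be recovered (made up for illustration)
    return (x[0] and x[1]) or not x[2]

def candidate(x):                   # a predicted formula whose quality we want to measure
    return x[0] or not x[2]

random.seed(0)
rows = random.sample(list(product((False, True), repeat=3)), k=6)   # incomplete truth table
observations = [(x, target(x)) for x in rows]

accuracy = sum(candidate(x) == y for x, y in observations) / len(observations)
print(f"candidate agrees with {accuracy:.0%} of the observed rows")
```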
