ID: 2502.01060

Learning Nonlinearity of Boolean Functions: An Experimentation with Neural Networks

February 3, 2025

Sriram Ranga, Nandish Chattopadhyay, Anupam Chattopadhyay
Computer Science
Machine Learning
Artificial Intelligence
Cryptography and Security

This paper investigates the learnability of the nonlinearity property of Boolean functions using neural networks. We train encoder-style deep neural networks to predict the nonlinearity of Boolean functions from examples given as truth tables paired with their nonlinearity values. We report empirical results showing that deep neural networks can learn to predict this property for functions of 4 and 5 variables with accuracy above 95%. While these results are positive and, to our knowledge, this is the first disciplined analysis of the question, we must add a caveat: extending the approach to a larger number of variables appears quite challenging, and it is not yet clear whether it offers any advantage in time or space complexity over existing combinatorial algorithms.
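As a concrete illustration of the setup above, the sketch below (a minimal reconstruction under stated assumptions, not the authors' code; the helper names walsh_hadamard and nonlinearity are my own) computes the nonlinearity label of a truth table via the Walsh-Hadamard transform and builds a small labelled dataset of random 4-variable functions.

```python
import numpy as np

def walsh_hadamard(signs):
    """Fast Walsh-Hadamard transform of a +1/-1 vector whose length is a power of two."""
    spectrum = np.asarray(signs, dtype=np.int64).copy()
    h = 1
    while h < len(spectrum):
        for i in range(0, len(spectrum), 2 * h):
            a = spectrum[i:i + h].copy()
            b = spectrum[i + h:i + 2 * h].copy()
            spectrum[i:i + h] = a + b
            spectrum[i + h:i + 2 * h] = a - b
        h *= 2
    return spectrum

def nonlinearity(truth_table):
    """nl(f) = 2^(n-1) - max_a |W_f(a)| / 2 for a 0/1 truth table of length 2^n."""
    n = int(np.log2(len(truth_table)))
    signs = 1 - 2 * np.asarray(truth_table, dtype=np.int64)   # 0 -> +1, 1 -> -1
    return 2 ** (n - 1) - int(np.max(np.abs(walsh_hadamard(signs)))) // 2

# Labelled data in the spirit of the paper: random 4-variable functions
# (truth tables of length 16) paired with their nonlinearity values.
rng = np.random.default_rng(0)
n = 4
X = rng.integers(0, 2, size=(1000, 2 ** n))
y = np.array([nonlinearity(row) for row in X])
print(X.shape, y.min(), y.max())   # for n = 4 the maximum possible value is 6 (bent functions)
```

An encoder-style regressor or classifier would then map the length-2^n truth-table vector to its nonlinearity value; the architecture, loss, and 5-variable experiments behind the reported 95% accuracy are described only in the paper itself.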

Similar papers

Understanding Boolean Function Learnability on Deep Neural Networks: PAC Learning Meets Neurosymbolic Models

September 13, 2020

89% Match
Marcio Nicolau, Anderson R. Tavares, Zhiwei Zhang, Pedro Avelar, João M. Flach, ... , Moshe Y. Vardi
Machine Learning
Machine Learning

Computational learning theory states that many classes of boolean formulas are learnable in polynomial time. This paper addresses the understudied subject of how, in practice, such formulas can be learned by deep neural networks. Specifically, we analyze boolean formulas associated with model-sampling benchmarks, combinatorial optimization problems, and random 3-CNFs with varying degrees of constrainedness. Our experiments indicate that: (i) neural learning generalizes better...


Learning Boolean Circuits with Neural Networks

October 25, 2019

88% Match
Eran Malach, Shai Shalev-Shwartz
Machine Learning
Machine Learning

While on some natural distributions, neural-networks are trained efficiently using gradient-based algorithms, it is known that learning them is computationally hard in the worst-case. To separate hard from easy to learn distributions, we observe the property of local correlation: correlation between local patterns of the input and the target label. We focus on learning deep neural-networks using a gradient-based algorithm, when the target function is a tree-structured Boolean...


Adaptive n-ary Activation Functions for Probabilistic Boolean Logic

March 16, 2022

87% Match
Jed A. Duersch, Thomas A. Catanach, Niladri Das
Machine Learning
Artificial Intelligence

Balancing model complexity against the information contained in observed data is the central challenge to learning. In order for complexity-efficient models to exist and be discoverable in high dimensions, we require a computational framework that relates a credible notion of complexity to simple parameter representations. Further, this framework must allow excess complexity to be gradually removed via gradient-based optimization. Our n-ary, or n-argument, activation function...


Learning Algorithms via Neural Logic Networks

April 2, 2019

87% Match
Ali Payani, Faramarz Fekri
Machine Learning
Artificial Intelligence

We propose a novel learning paradigm for Deep Neural Networks (DNN) by using Boolean logic algebra. We first present the basic differentiable operators of a Boolean system such as conjunction, disjunction and exclusive-OR and show how these elementary operators can be combined in a simple and meaningful way to form Neural Logic Networks (NLNs). We examine the effectiveness of the proposed NLN framework in learning Boolean functions and discrete-algorithmic tasks. We demonstra...
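The elementary differentiable operators mentioned above can be illustrated with a generic soft-logic relaxation; the sketch below uses product and probabilistic-sum operators as an assumed stand-in, not the exact membership-weight construction of Neural Logic Networks.

```python
import torch

def soft_and(a, b):
    return a * b                      # product t-norm

def soft_or(a, b):
    return a + b - a * b              # probabilistic sum (De Morgan dual of the product)

def soft_xor(a, b):
    return a + b - 2 * a * b          # reduces to ordinary XOR on {0, 1} inputs

# Inputs are "truth probabilities" in [0, 1], so gradients flow through the logic.
a = torch.tensor([0.9, 0.1], requires_grad=True)
b = torch.tensor([0.8, 0.7], requires_grad=True)
out = soft_xor(soft_and(a, b), soft_or(a, b))
out.sum().backward()
print(out, a.grad)
```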


A measure for the complexity of Boolean functions related to their implementation in neural networks

November 9, 2001

87% Match
Leonardo Franco
Disordered Systems and Neural Networks
Neurons and Cognition

We define a measure for the complexity of Boolean functions related to their implementation in neural networks, and in particular close related to the generalization ability that could be obtained through the learning process. The measure is computed through the calculus of the number of neighbor examples that differ in their output value. Pairs of these examples have been previously shown to be part of the minimum size training set needed to obtain perfect generalization in ...
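A minimal sketch of the measure described above, assuming "neighbor examples" means inputs at Hamming distance one (the paper's exact definition and normalisation may differ), counts the hypercube edges on which the function changes value.

```python
import numpy as np

def boundary_complexity(truth_table):
    """Fraction of Hamming-distance-1 input pairs whose outputs differ."""
    n = int(np.log2(len(truth_table)))
    f = np.asarray(truth_table)
    differing = 0
    for x in range(2 ** n):
        for i in range(n):
            y = x ^ (1 << i)               # flip one input bit
            if y > x and f[x] != f[y]:     # count each unordered pair once
                differing += 1
    return differing / (n * 2 ** (n - 1))  # total number of hypercube edges

print(boundary_complexity([0, 1, 1, 0, 1, 0, 0, 1]))  # 3-variable parity: 1.0
print(boundary_complexity([0] * 8))                    # constant function: 0.0
```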


A Systematic Evaluation of Evolving Highly Nonlinear Boolean Functions in Odd Sizes

February 15, 2024

87% Match
Claude Carlet, Marko Ðurasevic, Domagoj Jakobovic, ... , Luca Mariot
Neural and Evolutionary Computing
Cryptography and Security

Boolean functions are mathematical objects used in diverse applications. Different applications also have different requirements, making the research on Boolean functions very active. In the last 30 years, evolutionary algorithms have been shown to be a strong option for evolving Boolean functions in different sizes and with different properties. Still, most of those works consider similar settings and provide results that are mostly interesting from the evolutionary algorith...


Digging Deeper: Operator Analysis for Optimizing Nonlinearity of Boolean Functions

February 12, 2023

87% Match
Marko Djurasevic, Domagoj Jakobovic, ... , Stjepan Picek
Neural and Evolutionary Computing
Cryptography and Security

Boolean functions are mathematical objects with numerous applications in domains like coding theory, cryptography, and telecommunications. Finding Boolean functions with specific properties is a complex combinatorial optimization problem where the search space grows super-exponentially with the number of input variables. One common property of interest is the nonlinearity of Boolean functions. Constructing highly nonlinear Boolean functions is difficult as it is not always kn...
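As a rough, hypothetical illustration of what optimising nonlinearity involves (a single-bit-flip hill climber, not the paper's evolutionary operators), one can perturb a truth table and keep changes that do not decrease its nonlinearity.

```python
import numpy as np

def nonlinearity(tt):
    """nl(f) = 2^(n-1) - max_a |W_f(a)| / 2, computed from the full Walsh spectrum."""
    n = int(np.log2(len(tt)))
    xs = np.arange(2 ** n)
    dot = np.array([[bin(a & x).count("1") & 1 for x in xs] for a in xs])  # a.x mod 2
    spectrum = ((-1) ** (np.asarray(tt)[None, :] ^ dot)).sum(axis=1)
    return 2 ** (n - 1) - int(np.max(np.abs(spectrum))) // 2

def hill_climb(n, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    tt = rng.integers(0, 2, size=2 ** n)
    best = nonlinearity(tt)
    for _ in range(steps):
        i = rng.integers(2 ** n)
        tt[i] ^= 1                        # flip one output bit
        score = nonlinearity(tt)
        if score >= best:
            best = score                  # keep non-worsening moves
        else:
            tt[i] ^= 1                    # revert worsening moves
    return tt, best

_, nl = hill_climb(5)
print(nl)  # the maximum possible nonlinearity for n = 5 is 12
```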


Engineering an Efficient Boolean Functional Synthesis Engine

August 12, 2021

86% Match
Priyanka Golia, Friedrich Slivovsky, ... , Kuldeep S. Meel
Artificial Intelligence
Machine Learning
Logic in Computer Science

Given a Boolean specification between a set of inputs and outputs, the problem of Boolean functional synthesis is to synthesise each output as a function of inputs such that the specification is met. Although the past few years have witnessed intense algorithmic development, accomplishing scalability remains the holy grail. The state-of-the-art approach combines machine learning and automated reasoning to efficiently synthesise Boolean functions. In this paper, we propose fou...


Verifying Properties of Binarized Deep Neural Networks

September 19, 2017

86% Match
Nina Narodytska, Shiva Prasad Kasiviswanathan, Leonid Ryzhyk, ... , Toby Walsh
Machine Learning
Artificial Intelligence
Cryptography and Security
Machine Learning

Understanding properties of deep neural networks is an important challenge in deep learning. In this paper, we take a step in this direction by proposing a rigorous way of verifying properties of a popular class of neural networks, Binarized Neural Networks, using the well-developed means of Boolean satisfiability. Our main contribution is a construction that creates a representation of a binarized neural network as a Boolean formula. Our encoding is the first exact Boolean r...
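As a much-simplified illustration of why such an encoding exists (not the paper's construction, which handles entire networks), a single binarized neuron sign(w·x + b) with ±1 weights and inputs is equivalent to a cardinality constraint over Boolean literals, which a brute-force check confirms.

```python
import itertools
import math

def bnn_neuron(w, bias, x):
    """Binarized neuron with +/-1 weights and inputs: fires iff w.x + bias >= 0."""
    return sum(wi * xi for wi, xi in zip(w, x)) + bias >= 0

def cardinality_encoding(w, bias, bits):
    """Boolean form: at least ceil((d - bias)/2) inputs agree with their weight's sign."""
    k = math.ceil((len(w) - bias) / 2)
    agreements = sum(int(b == (wi > 0)) for wi, b in zip(w, bits))
    return agreements >= k

w, bias = [1, -1, 1, 1], -1
for bits in itertools.product([0, 1], repeat=len(w)):
    x = [2 * b - 1 for b in bits]         # Boolean 0/1 -> bipolar -1/+1
    assert bnn_neuron(w, bias, x) == cardinality_encoding(w, bias, bits)
print("cardinality encoding matches the neuron on all 2^4 inputs")
```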


A Scalable, Interpretable, Verifiable & Differentiable Logic Gate Convolutional Neural Network Architecture From Truth Tables

August 18, 2022

86% Match
Adrien Benamira, Tristan Guérand, Thomas Peyrin, ... , Bryan Hooi
Artificial Intelligence
Formal Languages and Automata Theory
Machine Learning
Symbolic Computation

We propose Truth Table net (TTnet), a novel Convolutional Neural Network (CNN) architecture that addresses, by design, the open challenges of interpretability, formal verification, and logic gate conversion. TTnet is built using CNNs' filters that are equivalent to tractable truth tables and that we call Learning Truth Table (LTT) blocks. The dual form of LTT blocks allows the truth tables to be easily trained with gradient desc...
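The core observation behind the truth-table equivalence can be sketched with a toy thresholded filter over a small binary patch (an assumed stand-in, not an LTT block): enumerating all patch values yields the filter's truth table.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)
k = 4                                     # number of binary inputs in the patch
weights, bias = rng.normal(size=k), rng.normal()

def filter_output(patch):
    return int(np.dot(weights, patch) + bias > 0)   # toy thresholded linear filter

# Enumerate all 2^k binary patches: the filter is fully described by this table.
truth_table = [filter_output(np.array(p)) for p in itertools.product([0, 1], repeat=k)]
print(truth_table)
```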
