May 27, 1997
We review the theory of neural networks, as it has emerged in the last ten years or so within the physics community, emphasizing questions of biological relevance over those of importance in mathematical statistics and machine learning theory.
January 19, 2017
Introduction to deep neural networks and their history.
February 29, 2020
Deep Learning (DL) has made a major impact on data science in the last decade. This chapter introduces the basic concepts of the field, covering both the basic structures used to design deep neural networks and a brief survey of some of the field's popular use cases.
December 25, 2020
In recent years, several studies have provided insight into the functioning of the brain, which consists of neurons that form networks through synaptic interconnections. Neural networks are interconnected systems of neurons and come in two types: Artificial Neural Networks (ANNs) and Biological Neural Networks (interconnected nerve cells). ANNs are computationally inspired by biological neurons and are used to model neural systems. The reasoning ...
April 22, 2022
This book is intended for beginners with no prior familiarity with deep learning. The only prerequisite is basic programming skills in Python.
August 28, 2019
This paper expresses the structure of an artificial neural network (ANN) in functional form, using the concept of an activation integral derived from the activation function. In this way, the structure of an ANN can be represented by a simple function, making it possible to find mathematical solutions of the ANN. The current ANN can thus be placed in a more rigorous framework, and perhaps all open questions about ANNs can be resolved.
October 31, 2023
This book aims to provide an introduction to the topic of deep learning algorithms. We review essential components of deep learning algorithms in full mathematical detail including different artificial neural network (ANN) architectures (such as fully-connected feedforward ANNs, convolutional ANNs, recurrent ANNs, residual ANNs, and ANNs with batch normalization) and different optimization algorithms (such as the basic stochastic gradient descent (SGD) method, accelerated met...
January 17, 2019
These are lecture notes for a course on machine learning with neural networks for scientists and engineers that I have given at Gothenburg University and Chalmers Technical University in Gothenburg, Sweden. The material is organised into three parts: Hopfield networks, supervised learning of labeled data, and learning algorithms for unlabeled data sets. Part I introduces stochastic recurrent networks: Hopfield networks and Boltzmann machines. The analysis of their learning ru...
February 17, 2023
An object-oriented approach to implementing artificial neural networks is introduced in this article. The networks obtained in this way are highly connected, in that they admit edges between nodes in any layers of the network, and dynamic, in that insertion or deletion of nodes, edges, or layers of nodes can be effected in a straightforward way. In addition, the activation functions of nodes need not be uniform within layers, and can also be changed within individual node...
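The abstract above describes networks with edges between arbitrary layers and per-node activation functions. A minimal sketch of that idea, assuming a `Node` class with explicit incoming edges (this is an illustrative construction, not the article's actual implementation):

```python
import math

class Node:
    """A neuron holding its own activation function and incoming edges."""

    def __init__(self, activation=math.tanh):
        self.activation = activation   # per-node activation, not per-layer
        self.incoming = []             # list of (source Node, weight) pairs
        self.value = 0.0

    def connect(self, source, weight):
        # Edges are unrestricted: the source may sit in any layer.
        self.incoming.append((source, weight))

    def evaluate(self):
        total = sum(w * src.value for src, w in self.incoming)
        self.value = self.activation(total)
        return self.value

# Usage: a three-node network with a "skip" edge bypassing the hidden layer.
x = Node(activation=lambda v: v)   # identity input node
h = Node()                         # hidden node with tanh activation
y = Node(activation=lambda v: v)   # linear output node
h.connect(x, 0.5)
y.connect(h, 2.0)
y.connect(x, -1.0)                 # edge from input directly to output

x.value = 1.0
h.evaluate()
out = y.evaluate()                 # equals 2*tanh(0.5) - 1.0
```

Because each edge stores its own source node, inserting or deleting nodes and edges reduces to list operations on `incoming`, which is the "dynamic" property the abstract emphasizes.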
April 11, 2023
These lecture notes provide an overview of neural network architectures from a mathematical point of view. In particular, machine learning with neural networks is treated as an optimization problem. The notes cover an introduction to neural networks and the following architectures: feedforward neural networks, convolutional neural networks, ResNets, and recurrent neural networks.