November 6, 2018
Artificial neural networks are at the heart of machine learning algorithms and artificial intelligence protocols. Historically, the simplest implementation of an artificial neuron traces back to Rosenblatt's classical 'perceptron', but its long-term practical applications may be hindered by the rapid scaling of computational complexity, which is especially relevant for the training of multilayered perceptron networks. Here we introduce a quantum information-based algorithm implemen...
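For context, the classical Rosenblatt perceptron this abstract refers to can be sketched in a few lines; this is a minimal classical illustration of the learning rule, not the quantum algorithm the paper proposes:

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Rosenblatt perceptron: learn weights w and bias b for labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the sample is misclassified (sign mismatch).
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: the logical AND of two inputs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y)
preds = np.sign(X @ w + b)  # matches y after convergence
```

The quantum algorithm's motivation is precisely that this update loop, scaled to multilayered networks, becomes computationally expensive.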
August 29, 2014
With the overwhelming success of the field of quantum information in recent decades, the "quest" for a Quantum Neural Network (QNN) model began, aiming to combine quantum computing with the striking properties of neural computing. This article presents a systematic approach to QNN research, which so far consists of a conglomeration of ideas and proposals. It outlines the challenge of combining the nonlinear, dissipative dynamics of neural computing and the linear, unitary ...
October 30, 2018
The performance of a neural network for a given task is largely determined by the initial calibration of the network parameters. Yet, it has been shown that the calibration, also referred to as training, is generally NP-complete. This includes networks with binary weights, an important class of networks due to their practical hardware implementations. We therefore suggest an alternative approach to training binary neural networks. It utilizes a quantum superposition of weight...
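As a point of comparison for the approach this abstract describes, here is a classical brute-force analogue: exhaustively scoring every binary weight configuration of a tiny linear classifier. A quantum algorithm of the kind sketched in the abstract would hold these configurations in superposition rather than enumerating them; the data and network here are purely illustrative:

```python
from itertools import product
import numpy as np

def binary_net_loss(X, y, w):
    """Misclassification count of a tiny binary-weight linear classifier."""
    preds = np.sign(X @ np.array(w, dtype=float))
    return int(np.sum(preds != y))

# Toy data labeled by the sign of x1 + x2 (no degenerate zero activations).
X = np.array([[2., 1.], [1., -2.], [-1., 2.], [-2., -1.]])
y = np.array([1, -1, 1, -1])

# Classical exhaustive search over all 2**n configurations in {-1, +1}**n.
best_w = min(product([-1, 1], repeat=2),
             key=lambda w: binary_net_loss(X, y, w))
```

The exponential size of this search space, 2**n for n binary weights, is what makes training NP-complete classically and motivates encoding the configurations in a quantum superposition.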
December 1, 2020
Deep-neural-network-powered artificial intelligence has rapidly changed our daily lives through a variety of applications. However, training a heavily parameterized network, one of the essential steps of deep learning, requires a tremendous amount of computing resources. Especially in the post-Moore's-Law era, the limits of semiconductor fabrication technology have restricted the development of learning algorithms able to cope with increasingly intensive training data. Meanwhile, ...
February 14, 2024
In this paper, we present a novel framework for enhancing the performance of Quanvolutional Neural Networks (QuNNs) by introducing trainable quanvolutional layers and addressing the critical challenges associated with them. Traditional quanvolutional layers, although beneficial for feature extraction, have largely been static, offering limited adaptability. Unlike state-of-the-art approaches, our research overcomes this limitation by enabling training within these layers, significantly ...
February 14, 2024
Due to the exponential growth of the Hilbert space dimension with system size, the simulation of quantum many-body systems remains a persistent challenge to this day. Here, we review a relatively new class of variational states for the simulation of such systems, namely neural quantum states (NQS), which overcome the exponential scaling by compressing the state in terms of the network parameters rather than storing all exponentially many coefficients needed for an exact ...
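The compression idea behind neural quantum states can be made concrete with the restricted-Boltzmann-machine ansatz commonly used in this literature: the wavefunction amplitude of a spin configuration is computed from O(N*M) network parameters instead of stored among 2**N coefficients. The sizes and parameter values below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 6, 4  # visible spins, hidden units (illustrative sizes)

# RBM parameters: O(N*M) numbers instead of 2**N stored amplitudes.
a = rng.normal(scale=0.1, size=N)        # visible biases
b = rng.normal(scale=0.1, size=M)        # hidden biases
W = rng.normal(scale=0.1, size=(N, M))   # visible-hidden couplings

def nqs_amplitude(s):
    """Unnormalized RBM wavefunction amplitude for a spin config s in {-1,+1}^N."""
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + s @ W))

s = np.array([1, -1, 1, 1, -1, 1])
amp = nqs_amplitude(s)  # one amplitude, computed on demand
```

In a variational calculation these parameters are optimized, and expectation values are estimated by sampling configurations rather than summing over the full Hilbert space.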
November 23, 2022
This paper presents, via an explicit example with a real-world dataset, a hands-on introduction to the field of quantum machine learning (QML). We focus on the case of learning with a single qubit, using data re-uploading techniques. After a discussion of the relevant background in quantum computing and machine learning we provide a thorough explanation of the data re-uploading models that we consider, and implement the different proposed formulations in toy and real-world da...
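The single-qubit data re-uploading idea mentioned here can be sketched with a direct NumPy simulation: the same input angle is "re-uploaded" before each trainable rotation, and the measurement probability serves as the class score. The specific angles and layer count are hypothetical, not taken from the paper:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2.0), np.sin(theta / 2.0)
    return np.array([[c, -s], [s, c]])

def reupload_model(x, thetas):
    """Single-qubit data re-uploading: alternate data-encoding and
    trainable RY rotations, then measure P(|0>)."""
    state = np.array([1.0, 0.0])           # start in |0>
    for theta in thetas:                   # one layer per trainable angle
        state = ry(theta) @ ry(x) @ state  # re-upload x, then rotate
    return abs(state[0]) ** 2              # probability of measuring |0>

# Hypothetical trained angles; the output in [0, 1] acts as a class score.
p0 = reupload_model(x=0.3, thetas=[0.5, -1.2, 0.8])
```

Repeating the encoding across layers is what lets a single qubit represent nonlinear functions of the input, which is the key point of data re-uploading models.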
June 26, 2018
We introduce the Backwards Quantum Propagation of Phase errors (Baqprop) principle, a central theme upon which we construct multiple universal optimization heuristics for training both parametrized quantum circuits and classical deep neural networks on a quantum computer. Baqprop encodes error information in relative phases of a quantum wavefunction defined over the space of network parameters; it can be thought of as the unification of the phase kickback principle of quantum...
July 19, 1998
This paper combines quantum computation with classical neural network theory to produce a quantum computational learning algorithm. Quantum computation uses microscopic quantum-level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum associative memory with a capacity exponential in the number of neurons....
August 28, 2018
This paper proposes a quantum-classical algorithm to evaluate and select classical artificial neural network architectures. The proposed algorithm is based on a probabilistic quantum memory and the possibility of training artificial neural networks in superposition. We obtain an exponential quantum speedup in the evaluation of neural networks. We also verify, through a reduced experimental analysis, that the proposed algorithm can be used to select near-optimal neu...