Similar papers
January 10, 2020
The increasing popularity of machine learning solutions places growing demands on this field if it is to penetrate more aspects of life. In particular, energy efficiency and speed of operation are crucial, for instance in portable medical devices. The Reservoir Computing (RC) paradigm offers a solution to these issues through the foundation of its operation: the reservoir of states. Adequate separation of input information translated into the internal state of the reservoir,...
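A minimal sketch of the reservoir-of-states idea described above, assuming an echo-state-network style reservoir with fixed random weights and a trained linear readout; the sizes, leak rate, and toy task here are illustrative and not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper).
n_in, n_res = 1, 100

# Fixed random input and reservoir weights; only the readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
W_res *= 0.9 / np.abs(np.linalg.eigvals(W_res)).max()  # spectral radius < 1

def run_reservoir(u_seq, leak=0.3):
    """Map an input sequence into a sequence of internal reservoir states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        pre = W_in @ np.atleast_1d(u) + W_res @ x
        x = (1 - leak) * x + leak * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Train only a linear readout (ridge regression) on the collected states.
u_train = np.sin(np.arange(500) * 0.1)   # toy input signal
y_train = np.roll(u_train, -1)           # toy task: predict the next sample
X = run_reservoir(u_train)
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y_train)

y_pred = X @ W_out                       # readout of the reservoir states
```

The point of the sketch is that all learning is confined to the cheap linear readout, which is why RC is attractive for low-power, fast hardware implementations.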
December 11, 2019
The enormous amount of data generated worldwide is increasingly triggering the search for unconventional and more efficient ways of processing and classifying information, eventually able to transcend the conventional von Neumann-Turing computational central dogma. It is, therefore, greatly appealing to draw inspiration from less conventional but computationally more powerful systems such as the neural architecture of the human brain. This neuromorphic route has the ...
July 18, 2024
Memristors offer significant advantages as in-memory computing devices due to their non-volatility, low power consumption, and history-dependent conductivity. These attributes are particularly valuable in the realm of neuromorphic circuits for neural networks, which currently face limitations imposed by the Von Neumann architecture and high energy demands. This study evaluates the feasibility of using memristors for in-memory processing by constructing and training three digi...
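A hedged illustration of the in-memory computing principle this abstract refers to: in a memristive array, each weight is stored as a conductance, and applying input voltages to the rows produces output currents equal to a matrix-vector product by Ohm's and Kirchhoff's laws. The differential (G+, G-) mapping and the parameter names below are illustrative assumptions, not the study's actual device model:

```python
import numpy as np

def to_conductances(W, g_min=1e-6, g_max=1e-4):
    """Map signed weights onto a pair of conductance arrays (G+ and G-),
    since a physical memristor conductance is always positive."""
    w_max = np.abs(W).max() or 1.0
    scale = (g_max - g_min) / w_max
    G_pos = g_min + scale * np.clip(W, 0, None)
    G_neg = g_min + scale * np.clip(-W, 0, None)
    return G_pos, G_neg

def crossbar_mvm(G_pos, G_neg, v_in):
    """Output currents of the differential array: I = (G+ - G-) @ V."""
    return (G_pos - G_neg) @ v_in

W = np.array([[0.2, -0.5], [0.8, 0.1]])   # toy weight matrix
v = np.array([0.3, 1.0])                   # input voltages
G_pos, G_neg = to_conductances(W)
I = crossbar_mvm(G_pos, G_neg, v)          # proportional to W @ v
```

Because the multiply-accumulate happens where the weights are stored, no data shuttling between memory and processor is needed, which is the energy argument made in the abstract.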
July 7, 2022
Networks of nanowires are currently being explored for a range of applications in brain-like (or neuromorphic) computing, and especially in reservoir computing (RC). Fabrication of real-world computing devices requires that the nanowires are deposited sequentially, leading to stacking of the wires on top of each other. However, most simulations of computational tasks using these systems treat the nanowires as 1D objects lying in a perfectly 2D plane - the effect of stacking o...
March 17, 2022
Memristor-based neuromorphic computing could overcome the limitations of traditional von Neumann computing architectures -- in which data are shuffled between separate memory and processing units -- and improve the performance of deep neural networks. However, this will require accurate synaptic-like device performance, and memristors typically suffer from poor yield and a limited number of reliable conductance states. Here we report floating gate memristive synaptic devices ...
February 27, 2013
Conventional neuro-computing architectures and artificial neural networks have often been developed with no or loose connections to neuroscience. As a consequence, they have largely ignored key features of biological neural processing systems, such as their extremely low-power consumption features or their ability to carry out robust and efficient computation using massively parallel arrays of limited precision, highly variable, and unreliable components. Recent developments ...
November 29, 2022
Brain-inspired computing proposes a set of algorithmic principles that hold promise for advancing artificial intelligence. They endow systems with self-learning capabilities, efficient energy usage, and high storage capacity. A core concept that lies at the heart of brain computation is sequence learning and prediction. This form of computation is essential for almost all our daily tasks, such as movement generation, perception, and language. Understanding how the brain perfor...
July 1, 2018
The volume, veracity, variability, and velocity of data produced by the ever-increasing network of sensors connected to the Internet pose challenges for power management, scalability, and sustainability of cloud computing infrastructure. Increasing the data processing capability of edge computing devices at lower power requirements can reduce several overheads for cloud computing solutions. This paper provides a review of neuromorphic CMOS-memristive architectures that can be...
April 12, 2022
Memristive devices are a class of circuit elements that show great promise as future building blocks for brain-inspired computing. One influential view in theoretical neuroscience sees the brain as a function-computing device: given input signals, the brain applies a function in order to generate new internal states and motor outputs. Therefore, being able to approximate functions is a fundamental axiom to build upon for future brain research and to derive more efficient comp...
August 21, 2011
We present new computational building blocks based on memristive devices. These blocks can be used to implement either supervised or unsupervised learning modules. This is achieved using a crosspoint architecture, which is an efficient array implementation for nanoscale two-terminal memristive devices. Based on these blocks and an experimentally verified SPICE macromodel for the memristor, we demonstrate that, firstly, Spike-Timing-Dependent Plasticity (STDP) can be implem...
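For context, a minimal sketch of the pair-based STDP rule that such crosspoint blocks typically realize, applied to a toy array of synaptic weights; the amplitudes, time constants, and spike times are illustrative assumptions and do not reproduce the paper's SPICE macromodel:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for dt = t_post - t_pre (ms).
    Pre-before-post (dt > 0) potentiates; post-before-pre depresses."""
    if dt > 0:
        return a_plus * np.exp(-dt / tau_plus)
    return -a_minus * np.exp(dt / tau_minus)

# Toy crosspoint array: rows are presynaptic lines, columns postsynaptic lines.
weights = np.full((4, 4), 0.5)
t_pre = np.array([10.0, 12.0, 30.0, 31.0])    # one spike time per row (ms)
t_post = np.array([15.0, 11.0, 33.0, 29.0])   # one spike time per column (ms)

for i, tp in enumerate(t_pre):
    for j, tq in enumerate(t_post):
        weights[i, j] = np.clip(weights[i, j] + stdp_dw(tq - tp), 0.0, 1.0)
```

In a physical crosspoint array the same update emerges from overlapping pre- and post-synaptic voltage pulses across each memristor, rather than from an explicit software loop.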