May 19, 2023
Similar papers 2
August 26, 2019
This chapter provides a comprehensive survey of the research on, and motivations for, hardware implementation of reservoir computing (RC) on neuromorphic electronic systems. Because RC is computationally efficient and its training amounts to a simple linear regression, both spiking and non-spiking implementations of reservoir computing on neuromorphic hardware have been developed. Here, a review of these experimental studies is provided to illustrate the progress in this a...
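The claim that RC training "amounts to a simple linear regression" can be made concrete with a minimal software echo state network: the recurrent reservoir is fixed and random, and only a linear readout is fitted by ridge regression. All sizes, scalings, and the toy one-step-memory task below are illustrative assumptions, not details from any of the surveyed hardware systems.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative reservoir sizes (assumptions, not from the paper).
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Scale the recurrent weights so the spectral radius is below 1
# (a common sufficient heuristic for the echo state property).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)  # fixed, untrained dynamics
        states.append(x.copy())
    return np.array(states)

# Toy task: predict the previous input sample from the current state,
# i.e. a one-step short-term-memory task.
T = 500
u = rng.uniform(-1, 1, (T, 1))
X = run_reservoir(u)
y = np.roll(u[:, 0], 1)
y[0] = 0.0

# Training is just ridge regression on the readout weights.
washout = 50  # discard the initial transient
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + 1e-6 * np.eye(n_res), Xw.T @ yw)
pred = Xw @ W_out
print("readout MSE:", float(np.mean((pred - yw) ** 2)))
```

Because only `W_out` is learned, the expensive recurrent part can be delegated to physical dynamics (spiking circuits, memristors, magnetic textures) while training stays a closed-form least-squares solve.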
June 9, 2014
In the quest for alternatives to traditional CMOS, it has been suggested that digital computing efficiency and power can be improved by matching the precision to the application. Many applications do not need the high precision that is used today. In particular, large gains in area and power efficiency could be achieved by dedicated analog realizations of approximate computing engines. In this work, we explore the use of memristor networks for analog approximate comput...
July 26, 2013
We show that memcapacitive (memory capacitive) systems can be used as synapses in artificial neural networks. As an example of our approach, we discuss the architecture of an integrate-and-fire neural network based on memcapacitive synapses. Moreover, we demonstrate that spike-timing-dependent plasticity can be simply realized with some of these devices. Memcapacitive synapses are a low-energy alternative to memristive synapses for neuromorphic computation.
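The spike-timing-dependent plasticity (STDP) mentioned here is, in its standard pair-based textbook form, a weight update that depends exponentially on the time difference between pre- and post-synaptic spikes. The abstract does not give the paper's device equations, so the amplitudes and time constants below are illustrative assumptions:

```python
import math

# Pair-based STDP rule (textbook form); parameters are assumptions,
# not the memcapacitive device model from the paper.
A_plus, A_minus = 0.01, 0.012     # potentiation / depression amplitudes
tau_plus, tau_minus = 20.0, 20.0  # time constants (ms)

def stdp_dw(dt):
    """Weight change for spike-time difference dt = t_post - t_pre (ms)."""
    if dt >= 0:  # pre fires before post -> potentiation
        return A_plus * math.exp(-dt / tau_plus)
    # post fires before pre -> depression
    return -A_minus * math.exp(dt / tau_minus)

print(stdp_dw(10.0))   # positive weight change
print(stdp_dw(-10.0))  # negative weight change
```

A device "realizes STDP" when its intrinsic response to overlapping pre/post pulses traces out a curve of this shape, so no separate learning circuitry is needed.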
May 22, 2024
Pushing the frontiers of time-series information processing on ever-growing edge devices with stringent resource budgets has been impeded by the limited ability of such systems to process information and learn locally on the device. Local processing and learning typically demand intensive computation and massive storage, as the process involves retrieving information and tuning hundreds of parameters back in time. In this work, we developed a memristor-based echo state network accelerator that fea...
February 27, 2025
Nanodevices that exhibit non-linear transformation of electrical signals and various forms of memory can be successfully used in new computational paradigms, such as neuromorphic or reservoir computing (RC). Dedicated hardware implementations based on functional neuromorphic structures significantly reduce energy consumption and/or increase the computational capabilities of a given artificial neural network system. Concepts of RC, which as a flexible computational ...
September 12, 2017
Emerging nanodevices such as resistive memories are being considered for hardware realizations of a variety of artificial neural networks (ANNs), including highly promising online variants of the learning approaches known as reservoir computing (RC) and the extreme learning machine (ELM). We propose an RC/ELM inspired learning system built with nanosynapses that performs both on-chip projection and regression operations. To address time-dynamic tasks, the hidden neurons of ou...
September 28, 2022
Physical reservoir computing is a computational paradigm that enables spatio-temporal pattern recognition to be performed directly in matter. The use of physical matter leads the way towards energy-efficient devices capable of solving machine learning problems without having to build a system of millions of interconnected neurons. We propose a high performance "skyrmion mixture reservoir" that implements the reservoir computing model with multi-dimensional inputs. We show tha...
December 9, 2024
Recent progress in magnetoionics offers exciting potential to leverage its non-linearity, short-term memory, and energy efficiency to uniquely advance the field of physical reservoir computing. In this work, we experimentally demonstrate the classification of temporal data using a magneto-ionic (MI) heterostructure. The device was specifically engineered to induce non-linear ion migration dynamics, which in turn imparted non-linearity and short-term memory (STM) to the magn...
April 30, 2020
Machine learning, particularly in the form of deep learning, has driven most of the recent fundamental developments in artificial intelligence. Deep learning is based on computational models that are, to a certain extent, bio-inspired, as they rely on networks of connected simple computing units operating in parallel. Deep learning has been successfully applied in areas such as object/pattern recognition, speech and natural language processing, self-driving vehicles, intellig...
July 24, 2015
The advent of advanced neuronal interfaces offers great promise for linking brain functions to electronics. A major bottleneck in achieving this is real-time processing of big data that imposes excessive requirements on bandwidth, energy and computation capacity; limiting the overall number of bio-electronic links. Here, we present a novel monitoring system concept that exploits the intrinsic properties of memristors for processing neural information in real time. We demonstr...