ID: 1508.00144

Quantitative evaluation of the performance of discrete-time reservoir computers in the forecasting, filtering, and reconstruction of stochastic stationary signals

August 1, 2015


Similar papers (page 2)

Reservoir Computing Universality With Stochastic Inputs

July 7, 2018

89% Match
Lukas Gonon, Juan-Pablo Ortega
Emerging Technologies
Neural and Evolutionary Computing
Dynamical Systems
Probability

The universal approximation properties with respect to $L^p$-type criteria of three important families of reservoir computers with stochastic discrete-time semi-infinite inputs are shown. First, it is proved that linear reservoir systems with either polynomial or neural network readout maps are universal. More importantly, it is proved that the same property holds for two families with linear readouts, namely, trigonometric state-affine systems and echo state networks, which...


Signal-noise separation using unsupervised reservoir computing

April 7, 2024

89% Match
Jaesung Choi, Pilwon Kim
Machine Learning
Signal Processing
Chaotic Dynamics

Removing noise from a signal without knowing the characteristics of the noise is a challenging task. This paper introduces a signal-noise separation method based on time series prediction. We use Reservoir Computing (RC) to extract the maximum portion of "predictable information" from a given signal. Reproducing the deterministic component of the signal using RC, we estimate the noise distribution from the difference between the original signal and the reconstructed one. The meth...
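The predict-then-subtract idea described above can be sketched with a small echo state network. The network size, ridge parameter, noise level, and sine-wave test signal below are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative noisy signal: a deterministic sine plus Gaussian noise
# (a stand-in for the paper's test signals, not its actual data).
t = np.arange(2000)
clean = np.sin(0.05 * t)
noisy = clean + 0.1 * rng.standard_normal(t.size)

# Small echo state network: fixed random reservoir, linear readout trained
# by ridge regression to predict the next sample of the input.
n = 200
W_in = rng.uniform(-0.5, 0.5, size=n)
W = rng.standard_normal((n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9

def run_reservoir(u):
    x = np.zeros(n)
    states = np.empty((u.size, n))
    for k, uk in enumerate(u):
        x = np.tanh(W @ x + W_in * uk)
        states[k] = x
    return states

X = run_reservoir(noisy[:-1])   # states driven by u(0)..u(T-2)
y = noisy[1:]                   # one-step-ahead targets u(1)..u(T-1)
washout = 100                   # discard the initial transient
A, b = X[washout:], y[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n), A.T @ b)

pred = X @ W_out        # the "predictable" (deterministic) component
residual = y - pred     # leftover: an estimate of the noise component
print(np.std(residual[washout:]))  # roughly the injected noise level
```

The residual's empirical distribution then serves as the noise estimate; here its standard deviation comes out close to the 0.1 that was injected.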


Nonlinear Autoregression with Convergent Dynamics on Novel Computational Platforms

August 18, 2021

89% Match
J. Chen, H. I. Nurdin
Systems and Control
Machine Learning

Nonlinear stochastic modeling is useful for describing complex engineering systems. Meanwhile, neuromorphic (brain-inspired) computing paradigms are developing to tackle tasks that are challenging and resource intensive on digital computers. An emerging scheme is reservoir computing which exploits nonlinear dynamical systems for temporal information processing. This paper introduces reservoir computers with output feedback as stationary and ergodic infinite-order nonlinear au...


Optimizing Memory in Reservoir Computers

January 5, 2022

89% Match
Thomas L. Carroll
Neural and Evolutionary Computing

A reservoir computer is a way of using a high dimensional dynamical system for computation. One way to construct a reservoir computer is by connecting a set of nonlinear nodes into a network. Because the network creates feedback between nodes, the reservoir computer has memory. If the reservoir computer is to respond to an input signal in a consistent way (a necessary condition for computation), the memory must be fading; that is, the influence of the initial conditions fades...
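The fading-memory condition mentioned above is often checked empirically by driving the same reservoir from two different initial states. A minimal sketch, using a random network scaled to spectral radius below 1 (a common heuristic, not this paper's specific construction):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random reservoir network; scaling the spectral radius below 1 is a common
# heuristic for obtaining fading memory (the echo state property).
n = 100
W = rng.standard_normal((n, n))
W *= 0.8 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, size=n)

def run(u, x0):
    x = x0.copy()
    for uk in u:
        x = np.tanh(W @ x + W_in * uk)
    return x

# Drive the same input from two different initial conditions: with fading
# memory, the influence of the initial state decays and the runs converge.
u = rng.standard_normal(500)
xa = run(u, np.zeros(n))
xb = run(u, rng.standard_normal(n))
print(np.linalg.norm(xa - xb))  # essentially zero after 500 steps
```

Convergence of the two trajectories is exactly the consistency property the abstract refers to: the response depends on the input history, not on where the reservoir started.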


Reservoir Computing with Noise

February 28, 2023

89% Match
Chad Nathe, Chandra Pappu, Nicholas A. Mecholsky, Joseph D. Hart, ... , Francesco Sorrentino
Neural and Evolutionary Computing

This paper investigates in detail the effects of noise on the performance of reservoir computing. We focus on an application in which reservoir computers are used to learn the relationship between different state variables of a chaotic system. We recognize that noise can affect the training and testing phases differently. We find that the best performance of the reservoir is achieved when the strength of the noise that affects the input signal in the training phase equals the...


Benchmarking Learning Efficiency in Deep Reservoir Computing

September 29, 2022

89% Match
Hugo Cisneros, Josef Sivic, Tomas Mikolov
Machine Learning

It is common to evaluate the performance of a machine learning model by measuring its predictive power on a test dataset. This approach favors complicated models that can smoothly fit complex functions and generalize well from training data points. Although essential components of intelligence, speed and data efficiency of this learning process are rarely reported or compared between different candidate models. In this paper, we introduce a benchmark of increasingly difficult...


Reservoir Computing Benchmarks: a review, a taxonomy, some best practices

May 10, 2024

89% Match
Chester Wringe, Martin Trefzer, Susan Stepney
Emerging Technologies
Machine Learning
Neural and Evolutionary Computing

Reservoir Computing is an Unconventional Computation model for performing computation on a variety of substrates, such as RNNs or physical materials. The method takes a "black-box" approach, training only the outputs of the system it is built on. As such, evaluating the computational capacity of these systems can be challenging. We review and critique the evaluation methods used in the field of Reservoir Computing. We introduce a categorisation of benchmark tasks. We review ...


Memory and Information Processing in Recurrent Neural Networks

April 23, 2016

88% Match
Alireza Goudarzi, Sarah Marzen, Peter Banda, Guy Feldman, ... , Darko Stefanovic
Neural and Evolutionary Computing

Recurrent neural networks (RNNs) are simple dynamical systems whose computational power has been attributed to their short-term memory. The short-term memory of RNNs has previously been studied analytically only for orthogonal networks, under the annealed approximation, and with uncorrelated input. Here, for the first time, we present an exact solution for the memory capacity and the task-solving performance as a function of the structure of a given network instance, ena...
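For context, the memory capacity studied above is commonly estimated empirically by training linear readouts to reconstruct delayed inputs and summing the squared correlations. The sketch below is such an empirical estimate on an assumed random echo state network, not the paper's exact analytical solution:

```python
import numpy as np

rng = np.random.default_rng(2)

# Fixed random echo state network (illustrative assumption).
n = 50
W = rng.standard_normal((n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1, 1, size=n)

# Drive it with i.i.d. uniform input and record the states.
T = 5000
u = rng.uniform(-1, 1, size=T)
x = np.zeros(n)
X = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    X[t] = x

# Memory capacity: train a linear readout to reconstruct u(t - k) for each
# delay k and sum the squared correlations; the total is bounded by n.
washout = 100
Xs = X[washout:]
mc = 0.0
for k in range(1, 2 * n):
    target = u[washout - k : T - k]           # u(t - k), aligned with Xs
    w = np.linalg.lstsq(Xs, target, rcond=None)[0]
    r = np.corrcoef(Xs @ w, target)[0, 1]
    mc += r ** 2
print(mc)  # total linear memory capacity, between 0 and n
```

The decay of the per-delay capacities with k is the quantitative face of short-term memory: recent inputs are recoverable almost perfectly, older ones progressively less so.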


Next Generation Reservoir Computing

June 14, 2021

88% Match
Daniel J. Gauthier, Erik Bollt, ... , Wendson A. S. Barbosa
Machine Learning
Adaptation and Self-Organizing Systems

Reservoir computing is a best-in-class machine learning algorithm for processing information generated by dynamical systems using observed time-series data. Importantly, it requires very small training data sets, uses linear optimization, and thus requires minimal computing resources. However, the algorithm uses randomly sampled matrices to define the underlying recurrent neural network and has a multitude of metaparameters that must be optimized. Recent results demonstrate t...
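The "next generation" scheme replaces the randomly sampled reservoir with explicit time-delay and polynomial features followed by a linear readout. A minimal sketch under assumed settings (logistic-map data, two delay taps, quadratic monomials), not the paper's benchmark tasks:

```python
import numpy as np

# Scalar time series from the logistic map (an assumed stand-in system).
T = 1000
u = np.empty(T)
u[0] = 0.4
for t in range(T - 1):
    u[t + 1] = 3.8 * u[t] * (1.0 - u[t])

# Next-generation RC feature vector: a constant, k delayed inputs, and their
# unique quadratic monomials -- no random matrices involved.
k = 2
ts = np.arange(k - 1, T - 1)
rows = []
for t in ts:
    lin = u[t - k + 1 : t + 1][::-1]               # [u(t), u(t-1)]
    quad = np.outer(lin, lin)[np.triu_indices(k)]  # u(t)^2, u(t)u(t-1), u(t-1)^2
    rows.append(np.concatenate(([1.0], lin, quad)))
Phi = np.array(rows)
y = u[ts + 1]  # one-step-ahead targets

# Training is a single linear least-squares solve for the readout weights.
W = np.linalg.lstsq(Phi, y, rcond=None)[0]
pred = Phi @ W
print(np.max(np.abs(pred - y)))  # tiny: the quadratic features span the map
```

Because the logistic map is itself a quadratic polynomial of the previous value, it lies exactly in the feature span, which is why a single linear solve recovers it to machine precision; for general systems the same recipe gives an approximation.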


Risk bounds for reservoir computing

October 30, 2019

88% Match
Lukas Gonon, Lyudmila Grigoryeva, Juan-Pablo Ortega
Machine Learning

We analyze the practices of reservoir computing in the framework of statistical learning theory. In particular, we derive finite sample upper bounds for the generalization error committed by specific families of reservoir computing systems when processing discrete-time inputs under various hypotheses on their dependence structure. Non-asymptotic bounds are explicitly written down in terms of the multivariate Rademacher complexities of the reservoir systems and the weak depend...
