Re-visiting the echo state property
Research output: Contribution to journal › Research article › Contributed › peer-review
Abstract
An echo state network (ESN) consists of a large, randomly connected neural network, the reservoir, which is driven by an input signal and projects to output units. During training, only the connections from the reservoir to these output units are learned. A key prerequisite for output-only training is the echo state property (ESP): the effect of initial conditions must vanish as time passes. In this paper, we use analytical examples to show that a widely used criterion for the ESP, namely that the spectral radius of the weight matrix be smaller than unity, does not guarantee the echo state property. We obtain these examples by investigating local bifurcation properties of standard ESNs. Moreover, we provide new sufficient conditions for the echo state property of standard sigmoid and leaky-integrator ESNs. We furthermore suggest an improved technical definition of the echo state property, and discuss what practitioners should (and should not) observe when optimizing their reservoirs for specific tasks.
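The distinction the abstract draws can be illustrated numerically. The sketch below (an illustrative toy, not the paper's construction; reservoir size, scaling factor, and input sequence are arbitrary choices) drives a standard tanh ESN from two different initial states with the same input. Here the reservoir matrix is rescaled so its largest singular value is below one, which is the classic sufficient condition for the ESP; the weaker spectral-radius criterion discussed in the paper is necessary but, as the authors show, not sufficient.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # reservoir size (illustrative choice)

# Random reservoir, rescaled so its largest singular value is 0.9.
# sigma_max(W) < 1 is the classic sufficient condition for the ESP;
# the spectral-radius criterion rho(W) < 1 alone does not guarantee it.
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)  # ord=2 gives the largest singular value
W_in = rng.standard_normal((n, 1))

def run(x, inputs):
    """Drive a standard tanh ESN and return the final reservoir state."""
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
    return x

inputs = rng.standard_normal((200, 1, 1))

# Same input sequence, two different initial states.
x_a = run(np.zeros((n, 1)), inputs)
x_b = run(rng.standard_normal((n, 1)), inputs)

# tanh is 1-Lipschitz and sigma_max(W) < 1, so each update is a
# contraction: the initial-condition difference decays geometrically.
print(np.linalg.norm(x_a - x_b))  # effectively zero after 200 steps
```

Because each update contracts distances by at least the factor 0.9, the residual difference after 200 steps is below 1e-6, consistent with the echoing of initial conditions that the ESP demands. Such a single run is of course evidence for one input sequence only, not a proof.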
Details
| Original language | English |
| --- | --- |
| Pages (from-to) | 1-9 |
| Number of pages | 9 |
| Journal | Neural Networks |
| Volume | 35 |
| Publication status | Published - Nov 2012 |
| Peer-reviewed | Yes |
| Externally published | Yes |
External IDs
| PubMed | 22885243 |
| --- | --- |
Keywords
- Bifurcation
- Diagonally Schur stable
- Echo state network
- Lyapunov
- Spectral radius