Abstract
Echo state networks (ESNs), a special class of recurrent neural networks (RNNs), have attracted extensive attention in time series prediction problems. However, the memory capacity of an ESN conflicts with its nonlinear mapping ability, which limits prediction performance on complex time series. To balance memory and nonlinear mapping, an improved ESN model named the memory augmented echo state network (MA-ESN) is proposed. In MA-ESN, linear memory modules and nonlinear mapping modules are combined in the reservoir in a new way: the linear memory modules improve the memory ability of the network, while the nonlinear mapping modules retain its nonlinear mapping ability. The echo state property (ESP) of MA-ESN is also analyzed theoretically. Finally, the memory capacity and prediction performance of MA-ESN are evaluated on benchmark time series data sets. The experimental results demonstrate that MA-ESN outperforms similar ESN models equipped with special memory mechanisms.
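For intuition, the following is a minimal Python/NumPy sketch of how a reservoir update might interleave a linear memory module with a nonlinear mapping module. The module sizes, weight names (W_ih, W_hm, W_mh, W_mm), tanh activation, and scaling values are illustrative assumptions consistent with the condition in Theorem 1, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch of an MA-ESN-style reservoir update (illustrative only;
# the paper's exact equations, scaling, and readout training may differ).
rng = np.random.default_rng(0)

n_in, n_h, n_m = 1, 100, 100                 # input, nonlinear-module, memory-module sizes
W_ih = rng.uniform(-0.1, 0.1, (n_h, n_in))   # input -> nonlinear mapping module
W_hm = rng.uniform(-0.5, 0.5, (n_h, n_m))    # memory module -> nonlinear mapping module
W_mh = rng.uniform(-0.5, 0.5, (n_m, n_h))    # nonlinear mapping module -> memory module
W_mm = rng.uniform(-0.5, 0.5, (n_m, n_m))    # linear memory recurrence

def spectral_rescale(W, target):
    """Rescale W so its largest singular value equals `target`."""
    return W * (target / np.linalg.norm(W, 2))

# Enforce the sufficient ESP condition analyzed in Appendix A:
# sigma_max(W_hm) * sigma_max(W_mh) + sigma_max(W_mm) < 1
W_hm = spectral_rescale(W_hm, 0.6)
W_mh = spectral_rescale(W_mh, 0.6)
W_mm = spectral_rescale(W_mm, 0.5)           # 0.6 * 0.6 + 0.5 = 0.86 < 1

def step(m_prev, u):
    """One reservoir update: nonlinear mapping, then linear memory."""
    h = np.tanh(W_hm @ m_prev + W_ih @ u)    # nonlinear mapping module
    m = W_mm @ m_prev + W_mh @ h             # linear memory module (no squashing)
    return h, m

# Drive the reservoir with a toy input sequence and collect states
m = np.zeros(n_m)
states = []
for u_t in np.sin(0.1 * np.arange(200)):
    h, m = step(m, np.array([u_t]))
    states.append(np.concatenate([h, m]))    # concatenated state for a linear readout
```

Rescaling the three module matrices so that their largest singular values satisfy the inequality above mirrors the sufficient condition for the echo state property established in Appendix A.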
Data availability
The data that support the findings of this study are available from the corresponding author (F. J. Li) upon reasonable request.
References
Zhou L, Wang HW (2022) Multihorizons transfer strategy for continuous online prediction of time-series data in complex systems. Int J Intell Syst 37(10):7706–7735
Schäfer AM, Zimmermann HG (2007) Recurrent neural networks are universal approximators. Int J Neural Syst 17(4):253–263
Jaeger H (2001) The "echo state" approach to analysing and training recurrent neural networks, with an erratum note. GMD Technical Report 148, German National Research Center for Information Technology
Li Y, Li FJ (2019) PSO-based growing echo state network. Appl Soft Comput 85:105774
Chen Q, Jin YC, Song YD (2022) Fault-tolerant adaptive tracking control of Euler-Lagrange systems—An echo state network approach driven by reinforcement learning. Neurocomputing 484:109–116
Ibrahim H, Loo CK, Alnajjar F (2022) Bidirectional parallel echo state network for speech emotion recognition. Neural Comput Appl 34(20):17581–17599
Li L, Pu YF, Luo ZY (2022) Distributed functional link adaptive filtering for nonlinear graph signal processing. Digital Signal Process 128:103558
Zhang L, Ye F, Xie KY et al (2022) An integrated intelligent modeling and simulation language for model-based systems engineering. J Ind Inf Integr 28:100347
Jaeger H (2002) Short term memory in echo state networks. GMD Technical Report 152, German National Research Center for Information Technology
Holzmann G, Hauser H (2010) Echo state networks with filter neurons and a delay & sum readout. Neural Netw 23(2):244–256
Lun SX, Yao XS, Hu HF (2016) A new echo state network with variable memory length. Inf Sci 370:103–119
Dong L, Zhang HJ, Yang K, Zhou DL, Shi JY, Ma JH (2022) Crowd counting by using Top-k relations: a mixed ground-truth CNN framework. IEEE Trans Consumer Electron 68(3):307–316
Jaeger H, Lukosevicius M, Popovici D, Siewert U (2007) Optimization and applications of echo state networks with leaky-integrator neurons. Neural Netw 20(3):335–352
Zheng KH, Qian B, Li S, Xiao Y, Zhuang WQ, Ma QL (2020) Long-short term echo state network for time series prediction. IEEE Access 8:91961–91974
Marzen S (2017) Difference between memory and prediction in linear recurrent networks. Phys Rev E 96(3):032308
Verstraeten D, Dambre J, Dutoit X, Schrauwen B (2010) Memory versus non-linearity in reservoirs. In: The 2010 International Joint Conference on Neural Networks (IJCNN), pp 1–8
Bacciu D, Carta A, Sperduti A (2019) Linear memory networks. In: ICANN 2019: Theoretical Neural Computation, pp 513–525
Butcher JB, Verstraeten D, Schrauwen B, Day CR, Haycock PW (2013) Reservoir computing and extreme learning machines for non-linear time-series data analysis. Neural Netw 38:76–89
Inubushi M, Yoshimura K (2017) Reservoir computing beyond memory-nonlinearity trade-off. Sci Rep 7(1):1–10
Gil-Alana LA (2004) Long memory behaviour in the daily maximum and minimum temperatures in Melbourne, Australia. Meteorol Appl 11(4):319–328
Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780
Wu Z, Li Q, Zhang H (2021) Chain-structure echo state network with stochastic optimization: methodology and application. IEEE Trans Neural Netw Learn Syst 33(5):1974–1985
Wu Z, Jiang RQ (2023) Time-series benchmarks based on frequency features for fair comparative evaluation. Neural Comput Appl 35(23):17029–17041
Acknowledgements
This work was supported by the National Natural Science Foundation of China under Grant 62073153.
Ethics declarations
Conflict of interest
All the authors have approved the manuscript and agree with its submission to this journal. There are no conflicts of interest to declare.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A
Theorem 1 can be proved similarly to [3], as follows.
Recall that,
and the activation function f satisfies the Lipschitz condition with Lipschitz constant \(L \le 1\).
So we have
Thus we have
Therefore, if \(\sigma_{\max } (W^{hm} )\sigma_{\max } (W^{mh} ) + \sigma_{\max } (W^{mm} ) < 1\) holds, then \(\mathop {\lim }\nolimits_{t \to \infty } \left\| {y^{t} } \right\|_{2} = 0\) for all right-infinite input sequences \(u^{ + \infty } \in U^{ + \infty }\). That is, the MA-ESN model has the ESP.
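Because the intermediate display equations are not reproduced above, the following is a hedged sketch of the contraction step, assuming state updates of the form \(h^{t} = f(W^{hm} m^{t-1} + W^{ih} u^{t})\) and \(m^{t} = W^{mm} m^{t-1} + W^{mh} h^{t}\), with \(y^{t}\) the difference between two memory states driven by the same input; the superscript naming follows the condition in Theorem 1, and the exact update equations are those of the main text.

```latex
% Hedged reconstruction of the contraction bound (assumed update equations,
% not verbatim from the paper). With f Lipschitz, L <= 1:
\begin{aligned}
\|y^{t}\|_{2}
 &= \big\| W^{mm} y^{t-1} + W^{mh}\big( f(W^{hm} m_{1}^{t-1} + W^{ih}u^{t})
      - f(W^{hm} m_{2}^{t-1} + W^{ih}u^{t}) \big) \big\|_{2} \\
 &\le \sigma_{\max }(W^{mm})\,\|y^{t-1}\|_{2}
      + \sigma_{\max }(W^{mh})\, L \,\sigma_{\max }(W^{hm})\,\|y^{t-1}\|_{2} \\
 &\le \big( \sigma_{\max }(W^{hm})\sigma_{\max }(W^{mh})
      + \sigma_{\max }(W^{mm}) \big)\,\|y^{t-1}\|_{2}.
\end{aligned}
```

Iterating this bound gives \(\|y^{t}\|_{2} \le \rho^{t}\|y^{0}\|_{2}\) with \(\rho = \sigma_{\max }(W^{hm})\sigma_{\max }(W^{mh}) + \sigma_{\max }(W^{mm}) < 1\), which yields the limit above.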
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Liu, Q., Li, F. & Wang, W. Memory augmented echo state network for time series prediction. Neural Comput & Applic 36, 3761–3776 (2024). https://doi.org/10.1007/s00521-023-09276-4