Abstract:

Reservoir Computing is a paradigm for training Recurrent Neural Networks that treats the recurrent part (the so-called “reservoir”) differently from the readouts. This paradigm has recently become popular due to its computational efficiency and the fact that only a supervised readout needs to be trained. Meanwhile, Evolving Systems define a new approach focused on learning fuzzy systems whose parameters and structure both adapt on-line. In this paper an evolving reservoir neo-fuzzy network is built using time-delay elements and nonlinear neo-fuzzy synapses, which means that Reservoir Computing, Evolving Systems and Soft Computing are combined in a new computational system.
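The core idea of the Reservoir Computing paradigm mentioned above can be illustrated with a minimal echo state network sketch: the recurrent reservoir weights are generated randomly and left untrained, and only a linear readout is fitted by supervised learning. This is a generic illustration under assumed settings (reservoir size, spectral radius, a toy sine-prediction task), not the specific evolving neo-fuzzy architecture proposed in the paper.

```python
import numpy as np

# Illustrative sketch of the Reservoir Computing paradigm: a small echo
# state network with a fixed random reservoir and a trained linear readout.
# Sizes, the spectral-radius value and the toy task are assumptions.

rng = np.random.default_rng(0)

n_in, n_res = 1, 50
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))     # fixed random reservoir weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # scale spectral radius below 1

u = np.sin(np.arange(300) * 0.2).reshape(-1, 1)  # toy input signal
x = np.zeros(n_res)
states = []
for t in range(len(u) - 1):
    x = np.tanh(W_in @ u[t] + W @ x)           # reservoir update (never trained)
    states.append(x.copy())

# Only the readout is trained, here by ridge regression on collected states.
X = np.array(states[50:])                      # discard initial washout states
y = u[51:]                                     # one-step-ahead targets
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = float(np.mean((pred - y) ** 2))
print(mse)                                     # small one-step prediction error
```

Because training reduces to a single linear least-squares problem, this setup avoids backpropagation through time, which is the source of the computational efficiency the abstract refers to.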
