Optimizing simple deterministically constructed cycle reservoir network with a Redundant Unit Pruning Auto-Encoder algorithm

Abstract Echo State Network (ESN) is a specific form of recurrent neural network that displays very rich dynamics owing to its reservoir of hidden neurons. For this reason, ESN is viewed as a powerful approach to modeling real-valued time series. Nevertheless, ESN has been criticized because its parameters, such as the initial input weights and reservoir weights, are typically set by hand-tuned experience or brute-force search; that is, a conventional randomly generated ESN is unlikely to be optimal because its reservoir and input weights are created randomly. The Simple Cycle Reservoir Network (SCRN), whose input and internal weights are constructed deterministically, can yield performance comparable with that of a conventional ESN. A Redundant Unit Pruning Auto-Encoder (RUP-AE) algorithm is proposed to optimize the input weights of SCRN and to resolve the problem of an ill-conditioned output weight matrix in SCRN through an unsupervised pre-training process. First, the output weight matrix of SCRN is pre-trained on the training data with the pseudo-inverse algorithm. Then, the pre-trained output weight matrix is pruned by a Redundant Unit Pruning (RUP) algorithm. Finally, the pruned output weight matrix is injected into the input weight matrix to preserve the auto-encoder property. Three tasks, namely a nonlinear time series system identification task, a real-valued time series benchmark, and a standard chaotic time series benchmark, are used to demonstrate the advantages of RUP-AE. Extensive experimental results show that RUP-AE is effective in improving the performance of SCRN and that it resolves the problem of an ill-conditioned output weight matrix in SCRN.
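
The following is a minimal sketch of the three-step procedure described above (pseudo-inverse pre-training of the readout, pruning of redundant units, and injection of the pruned weights into the input layer). It is written from scratch for illustration: the ring reservoir parameters, the toy prediction task, the magnitude-based pruning criterion, and the scaling used when copying the output weights back into the input weights are all assumptions, not the authors' exact method.

```python
# Illustrative sketch of an RUP-AE-style workflow on a Simple Cycle Reservoir.
# Pruning criterion and hyperparameters are assumed for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

# --- Simple Cycle Reservoir (SCR): unidirectional ring with a single weight r ---
n_in, n_res, washout = 1, 100, 100
r, v = 0.9, 0.5                                          # cycle and input weight magnitudes (assumed)
W_res = np.zeros((n_res, n_res))
W_res[np.arange(1, n_res), np.arange(n_res - 1)] = r     # connect unit i-1 -> i
W_res[0, -1] = r                                         # close the cycle
W_in = v * np.sign(rng.standard_normal((n_res, n_in)))   # +/- v input pattern

def run_reservoir(u, W_in, W_res):
    """Collect reservoir states for a 1-D input sequence u."""
    x = np.zeros(n_res)
    states = []
    for t in range(len(u)):
        x = np.tanh(W_res @ x + W_in @ u[t:t + 1])
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a noisy sine wave (placeholder data)
u = np.sin(0.2 * np.arange(1200)) + 0.01 * rng.standard_normal(1200)
y = u[1:]                                                # targets shifted by one step
X = run_reservoir(u[:-1], W_in, W_res)[washout:]
Y = y[washout:].reshape(-1, 1)

# Step 1: pre-train the output weights with the pseudo-inverse algorithm
W_out = np.linalg.pinv(X) @ Y                            # shape (n_res, 1)

# Step 2: prune redundant units -- here, zero out units whose output weights
# are smallest in magnitude (an assumed stand-in for the paper's RUP criterion)
keep = np.abs(W_out[:, 0]) > np.quantile(np.abs(W_out[:, 0]), 0.2)
W_out_pruned = np.where(keep[:, None], W_out, 0.0)

# Step 3: auto-encoder-style transfer -- inject the pruned output weights back
# into the input weight matrix (rescaled to magnitude v; mapping is an assumption)
W_in_new = W_out_pruned[:, :n_in] / (np.abs(W_out_pruned).max() + 1e-12) * v

# Retrain the readout on states produced by the optimized input weights
X_new = run_reservoir(u[:-1], W_in_new, W_res)[washout:]
W_out_new = np.linalg.pinv(X_new) @ Y
mse = np.mean((X_new @ W_out_new - Y) ** 2)
print(f"MSE after RUP-AE-style retraining: {mse:.6f}")
```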
