Evolving predictors for chaotic time series

Neural networks are a popular representation for inducing single-step predictors for chaotic time series. For complex series, a large number of hidden units is often needed to reliably acquire appropriate predictors. This paper describes an evolutionary method that evolves a class of dynamic systems similar in form to neural networks but requiring fewer computational units. Results of experiments on two popular chaotic time series are described, and the method's performance is shown to compare favorably with that of larger neural networks.
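The abstract does not specify the evolutionary algorithm or the benchmark series, so the following is only an illustrative sketch of the general idea: a simple (1+1) evolution strategy tuning the three weights of a quadratic one-step predictor on the logistic map (a standard chaotic toy series). The model form, mutation scheme, and all parameter values here are assumptions for illustration, not the paper's method.

```python
import random

def logistic_series(n, x0=0.2, r=4.0):
    # Chaotic logistic map: x_{t+1} = r * x_t * (1 - x_t).
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def mse(w, series):
    # One-step-ahead error of the predictor x_hat = w0 + w1*x + w2*x^2.
    err = 0.0
    for t in range(len(series) - 1):
        x = series[t]
        pred = w[0] + w[1] * x + w[2] * x * x
        err += (pred - series[t + 1]) ** 2
    return err / (len(series) - 1)

def evolve(series, generations=2000, sigma=0.1, decay=0.999, seed=0):
    # (1+1) evolution strategy: mutate all weights with Gaussian noise,
    # keep the child only if it predicts at least as well as the parent.
    rng = random.Random(seed)
    parent = [rng.uniform(-1.0, 1.0) for _ in range(3)]
    parent_fit = mse(parent, series)
    for _ in range(generations):
        child = [w + rng.gauss(0.0, sigma) for w in parent]
        child_fit = mse(child, series)
        if child_fit <= parent_fit:
            parent, parent_fit = child, child_fit
        sigma *= decay  # slowly shrink the mutation step for fine convergence
    return parent, parent_fit

if __name__ == "__main__":
    series = logistic_series(200)
    weights, fitness = evolve(series)
    print(weights, fitness)
```

Because the true map is itself quadratic (x_{t+1} = 4x - 4x^2), the evolved weights should approach (0, 4, -4); the point of the sketch is only the evolutionary loop, which applies unchanged to richer dynamic-system representations like those the paper evolves.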
