A neural network that embeds its own meta-levels

A recurrent neural network is presented which (in principle) can, besides learning to solve problems posed by the environment, also use its own weights as input data and learn new (arbitrarily complex) algorithms for modifying its own weights in response to environmental inputs and evaluations. The network uses subsets of its input and output units to observe its own errors and to explicitly analyze and manipulate all of its own weights, including those weights responsible for analyzing and manipulating weights. This effectively embeds a chain of meta-networks and meta-meta-...-networks into the network itself.
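The mechanism described above can be illustrated with a minimal toy sketch: a recurrent net whose input vector includes a channel reporting the value of one of its own weights, and whose output vector includes units that address a weight and emit a modification for it. This is a drastically simplified illustration, not the architecture of the paper; the class name, the integer addressing scheme, and the 0.01 modification scale are all invented for this sketch.

```python
import numpy as np

class SelfReferentialNet:
    """Toy sketch of a net that reads and writes its own weights.

    One extra input unit reports the value of an "inspected" recurrent
    weight; three extra output units name a recurrent weight (two
    address units) and a modification for it (one delta unit). The
    addressing and update scheme here is a hypothetical simplification.
    """

    def __init__(self, n_hidden=8, n_env_in=2, n_env_out=1, seed=0):
        rng = np.random.default_rng(seed)
        self.n_env_out = n_env_out
        n_in = n_env_in + 1          # environment inputs + weight-reading unit
        n_out = n_env_out + 3        # environment outputs + 2 address + 1 delta
        self.W_in = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.W_rec = rng.normal(0.0, 0.1, (n_hidden, n_hidden))
        self.W_out = rng.normal(0.0, 0.1, (n_out, n_hidden))
        self.h = np.zeros(n_hidden)

    def _address(self, a, b):
        # Map two real-valued address outputs to indices into W_rec
        # (an invented discretization, just to make the sketch concrete).
        i = int(abs(a) * 1000) % self.W_rec.shape[0]
        j = int(abs(b) * 1000) % self.W_rec.shape[1]
        return i, j

    def step(self, env_in, inspect_addr=(0, 0)):
        # Reading channel: the net observes one of its own weights as input.
        read_val = self.W_rec[inspect_addr]
        x = np.concatenate([env_in, [read_val]])
        self.h = np.tanh(self.W_in @ x + self.W_rec @ self.h)
        y = self.W_out @ self.h
        env_out = y[:self.n_env_out]
        # Writing channel: the net addresses a weight and modifies it,
        # so the weights doing the modifying can themselves be modified.
        i, j = self._address(y[-3], y[-2])
        self.W_rec[i, j] += 0.01 * np.tanh(y[-1])
        return env_out, (i, j)

net = SelfReferentialNet()
before = net.W_rec.copy()
addr = (0, 0)
for t in range(5):
    out, addr = net.step(np.array([1.0, -1.0]), inspect_addr=addr)
print("weights changed:", not np.allclose(before, net.W_rec))
```

Because `W_rec` both drives the dynamics and is the object being read and written, the same weight matrix plays the role of the network and of every meta-level at once, which is the point of the construction.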