The N-2-N Encoder: A Matter of Representation

Kruglyak [4] demonstrated that weights exist to implement the N-2-N encoder for all finite N, but it is known that Backpropagation (BP) is unusually poor at learning such solutions for N > 8 [6]. We show that the learning problem lies not with BP, but with the pattern representation typically used in the encoder task. With an appropriate representation, we demonstrate that BP can learn encoders in approximately linear time with N as large as 100. This underlines yet again the crucial importance of pattern representation in neural network learning.
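For concreteness, the N-2-N encoder task trains a network with N inputs, a 2-unit hidden bottleneck, and N outputs to reproduce its input. The following is a minimal sketch (our illustration, not the construction from [4] or the experimental setup of this paper) of the task in its conventional form, using one-hot patterns and plain gradient-descent BP on squared error; the function and parameter names are ours.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_encoder(N, hidden=2, epochs=5000, lr=1.0, seed=0):
    """Train an N-hidden-N autoencoder on one-hot patterns with plain
    backpropagation; returns the final mean squared reconstruction error.
    This is an illustrative sketch, not the paper's experimental code."""
    rng = np.random.default_rng(seed)
    X = np.eye(N)                        # conventional one-hot (local) representation
    W1 = rng.normal(0.0, 0.5, (N, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, N))
    b2 = np.zeros(N)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)         # 2-unit bottleneck activations
        Y = sigmoid(H @ W2 + b2)         # reconstruction of the input
        dY = (Y - X) * Y * (1 - Y)       # squared-error delta at the output layer
        dH = (dY @ W2.T) * H * (1 - H)   # backpropagated delta at the hidden layer
        W2 -= lr * (H.T @ dY)
        b2 -= lr * dY.sum(axis=0)
        W1 -= lr * (X.T @ dH)
        b1 -= lr * dH.sum(axis=0)
    Y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((Y - X) ** 2))
```

With the local one-hot representation shown here, small N (e.g. 4) is typically learnable, while the paper's point is that this representation, not BP itself, is what makes larger N difficult.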