Classification with neural networks: a performance analysis

A performance analysis is presented for the most popular neural network classifier, the multilayer perceptron (MLP). The analysis is performed for a specific class of pattern recognition problems called one-class classifier problems. The criteria used to measure performance are classification error, computational complexity (measured in terms of network size), sensitivity to network size selection, and the number of training samples required. With regard to network size, it is shown that networks with one hidden layer perform better than those with two hidden layers. Further, a lower bound on the number of nodes in the hidden layer is derived and found to be d+1, where d is the dimension of the data patterns. The optimal number of nodes is shown to be somewhat larger than this (approximately 3d). In addition, network performance is shown to be relatively insensitive to overspecification of the network size. Finally, it is shown that near-optimal performance requires approximately 60d(d+1) training samples.
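A minimal sketch of the paper's sizing guidelines, using scikit-learn's MLPClassifier rather than the authors' own implementation. The dataset, the choice of d, and the make_classification parameters are illustrative assumptions; only the heuristics themselves (one hidden layer, at least d+1 and roughly 3d hidden nodes, about 60d(d+1) training samples) come from the abstract.

```python
# Illustrative only: applies the abstract's sizing heuristics to a
# synthetic two-class problem. Not the authors' experimental setup.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

d = 4                          # dimension of the data patterns (assumed)
hidden_nodes = 3 * d           # near-optimal size per the abstract (>= d + 1)
n_train = 60 * d * (d + 1)     # training-sample guideline: 60d(d+1)

# Synthetic d-dimensional patterns; parameters are arbitrary assumptions.
X, y = make_classification(n_samples=2 * n_train, n_features=d,
                           n_informative=d, n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=n_train, random_state=0)

# One hidden layer, as the analysis favors over two hidden layers.
clf = MLPClassifier(hidden_layer_sizes=(hidden_nodes,),
                    max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

print(f"hidden nodes: {hidden_nodes}, training samples: {n_train}")
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```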
