Neural trees: a new tool for classification

The authors propose a new classifier based on neural-network techniques. The ‘network’ consists of a set of perceptrons functionally organized in a binary tree (a ‘neural tree’). The learning algorithm is inspired by a growth algorithm, the tiling algorithm, recently introduced for feedforward neural networks. As in that case, it is a constructive algorithm for which convergence is guaranteed. In a neural tree one distinguishes the structural organization from the functional organization: each neuron receives inputs from, and only from, the input layer; its output does not feed into any other neuron, but is used to propagate a decision down the tree. The main advantage of this approach lies in the local processing of restricted portions of input space, during both learning and classification. Moreover, only a small subset of the neurons has to be updated during the classification stage. Finally, the approach extends easily and efficiently to classification in a multiclass problem.
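The classification stage described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tree, weights, and class labels are hypothetical and hand-set, and the tiling-style learning procedure that would grow the tree is not shown. It only demonstrates the structural point made in the abstract: every perceptron reads the raw input, and an input visits only the nodes on a single root-to-leaf path.

```python
import numpy as np

class Node:
    """A neural-tree node (hypothetical sketch): an internal node holds a
    perceptron (weights w, bias b); a leaf holds only a class label."""
    def __init__(self, w=None, b=0.0, label=None, left=None, right=None):
        self.w, self.b = w, b
        self.label = label                  # set only on leaves
        self.left, self.right = left, right

def classify(node, x):
    """Propagate x down the tree: each perceptron sees the full input
    vector, and the sign of its output selects the next branch, so only
    the nodes on one root-to-leaf path are evaluated."""
    while node.label is None:
        node = node.left if np.dot(node.w, x) + node.b <= 0 else node.right
    return node.label

# Hand-built example tree separating three regions of the plane
leaf_a, leaf_b, leaf_c = Node(label="A"), Node(label="B"), Node(label="C")
root = Node(w=np.array([1.0, 0.0]), b=0.0,              # split on sign of x
            left=leaf_a,
            right=Node(w=np.array([0.0, 1.0]), b=-1.0,  # then on y > 1
                       left=leaf_b, right=leaf_c))

print(classify(root, np.array([-2.0, 0.5])))  # x < 0        -> "A"
print(classify(root, np.array([3.0, 2.0])))   # x > 0, y > 1 -> "C"
```

Note that a depth-d tree evaluates at most d perceptrons per input, which is the local-processing advantage the abstract points to.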
