A new scheme for incremental learning

We present a new incremental procedure for supervised learning from noisy data. At each step, a new unit is added to the current network and trained to learn the error of that network. The incremental step is repeated until the error of the current network can be regarded as noise. The stopping criterion is very simple and follows directly from a statistical test on the estimated parameters of the new unit. Initial experimental results demonstrate the efficacy of this new incremental scheme. Ongoing work addresses the theoretical analysis and practical refinement of the algorithm.
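
The abstract does not spell out the unit type, the fitting procedure, or the exact statistical test, so the following is only a minimal sketch of the scheme it describes: each new unit is trained on the residual error of the current network, and the procedure stops when the new unit's estimated output weight is not statistically significant. The random-weight tanh units, the least-squares fit, and the two-sided t-test threshold are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def incremental_fit(X, y, max_units=50, t_crit=1.96, rng=None):
    """Incrementally add units, each trained on the current residual error.

    Stops when the estimated output weight of the new candidate unit is not
    statistically significant (|t| < t_crit), i.e. the remaining error looks
    like noise.  Unit type and test are illustrative choices.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    units = []                       # list of (input_weights, bias, output_weight)
    pred = np.full(n, y.mean())      # start from the constant predictor
    for _ in range(max_units):
        r = y - pred                 # error of the current network
        w = rng.normal(size=d)       # random input weights for the candidate unit
        b = rng.normal()
        h = np.tanh(X @ w + b)       # candidate unit's activations
        beta = (h @ r) / (h @ h)     # least-squares output weight fitted on the error
        resid = r - beta * h
        sigma2 = resid @ resid / (n - 1)
        t_stat = beta / np.sqrt(sigma2 / (h @ h))
        if abs(t_stat) < t_crit:     # new unit not significant: residual treated as noise
            break
        pred += beta * h             # absorb the new unit into the network
        units.append((w, b, beta))
    return units, y.mean()
```

The appeal of a test of this kind is that the stopping decision is local to the newly added unit: no separate validation set or global complexity penalty is needed, only the estimated parameters of the candidate unit and their standard errors.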