Prediction error and consistent parameter area in neural learning
As the number of training examples increases, a learning machine behaves better; an important problem is to know how fast and how well this behavior improves. The average prediction error is one of the most popular criteria for measuring it. From a geometrical point of view, each training example delimits a region in the parameter space that must contain the true parameter. The set of parameters is called the consistent area when every machine in the set can explain the input-output relations of the given examples. It is dual to the convex hull of the examples in the input signal space. We study the stochastic geometrical features of this convex hull and derive upper and lower bounds on the average prediction error of the simple perceptron network.
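The consistent area described above can be illustrated numerically. The following is a minimal sketch (not the paper's method): for a simple perceptron with a hypothetical teacher parameter `w_true`, a parameter vector is consistent when it reproduces the teacher's labels on all training examples, and a Monte Carlo estimate shows how the consistent area's share of the parameter sphere shrinks as examples accumulate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical teacher: a simple perceptron labels each input by the
# sign of its inner product with the (unit-norm) true parameter.
d = 3
w_true = rng.standard_normal(d)
w_true /= np.linalg.norm(w_true)

# Training examples: random inputs with teacher-assigned labels.
n = 50
X = rng.standard_normal((n, d))
y = np.sign(X @ w_true)

def consistent(w, X, y):
    """w lies in the consistent area (version space) iff it
    reproduces the teacher's labels on every training example."""
    return bool(np.all(np.sign(X @ w) == y))

# Monte Carlo estimate of the consistent area's fraction of the unit
# sphere in parameter space; this fraction shrinks as n grows.
samples = rng.standard_normal((100_000, d))
samples /= np.linalg.norm(samples, axis=1, keepdims=True)
frac = np.mean([consistent(w, X, y) for w in samples])
print(f"consistent fraction of parameter sphere: {frac:.4f}")
```

The true parameter always lies inside the consistent area by construction, and any parameter drawn from that area classifies the training set perfectly, which is why the area's size governs the average prediction error.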