Generalization in partially connected layered neural networks

We study learning from examples in a partially connected single-layer perceptron and a two-layer network, where partially connected student networks learn from fully connected teacher networks. Generalization is studied within the annealed approximation. We first consider a single-layer perceptron with binary weights. When the student is weakly diluted, there is a first-order phase transition from a poor-learning to a good-learning state, similar to that of the fully connected perceptron. With strong dilution, the first-order phase transition disappears and the generalization error decreases continuously. We also study learning in a two-layer committee machine with binary weights. In contrast to perceptron learning, a first-order transition always exists, irrespective of dilution. The permutation symmetry is broken at the transition point, and the generalization error drops to a non-zero minimum value.
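As background for the quantities discussed above (a standard relation in the statistical mechanics of perceptron learning, not an equation taken from this paper), the generalization error of a perceptron is commonly written in terms of the teacher-student overlap $R$ as
\begin{equation}
\epsilon_g \;=\; \frac{1}{\pi}\,\arccos R ,
\end{equation}
so that $R=0$ corresponds to chance-level performance ($\epsilon_g = 1/2$) and $R=1$ to perfect generalization. How $R$ is defined for a diluted student matching a fully connected teacher is an assumption here and follows from the paper's own order parameters.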