Using the symmetries of a multi-layered network to reduce the weight space
This paper presents a theoretical study of multilayered neural networks, carried out using a general formalism describing such networks, the forward pass, backpropagation, and coherent transformations. A method for reducing the weight space using sign and permutation transformations is developed. After observing that certain network modifications (notably permuting two units within the same layer) have no effect on the global transfer function, the authors formalize and generalize this observation. They then demonstrate that this result can be used to restrict the search for solutions to a reduced part of the weight space. Finally, a learning algorithm inspired by simulated annealing is used to test the method.
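The two symmetries the abstract mentions can be demonstrated on a toy network. The sketch below (an illustration of the general idea, not the authors' formalism; the network sizes and the use of `tanh` are assumptions) permutes the hidden units of a small two-layer network and flips the signs of some of them, and checks that the global transfer function is unchanged in both cases. Sign symmetry relies on the activation being an odd function, which holds for `tanh`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network: x -> h = tanh(W1 x + b1) -> y = W2 h + b2
W1 = rng.standard_normal((4, 3))
b1 = rng.standard_normal(4)
W2 = rng.standard_normal((2, 4))
b2 = rng.standard_normal(2)

def forward(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

x = rng.standard_normal(3)
y = forward(x, W1, b1, W2, b2)

# Permutation symmetry: reorder the hidden units, applying the same
# permutation to the rows of W1/b1 and the columns of W2.
perm = np.array([2, 0, 3, 1])
y_perm = forward(x, W1[perm], b1[perm], W2[:, perm], b2)

# Sign symmetry: tanh(-z) = -tanh(z), so negating a hidden unit's
# incoming weights (and bias) and its outgoing weights cancels out.
s = np.array([1.0, -1.0, 1.0, -1.0])
y_sign = forward(x, W1 * s[:, None], b1 * s, W2 * s[None, :], b2)

print(np.allclose(y, y_perm), np.allclose(y, y_sign))
```

Each of the `4! * 2^4 = 384` combinations of hidden-unit permutations and sign flips maps a weight vector to an equivalent one, which is why the search can be confined to a single representative region of the weight space.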