Statistical Neurodynamics of Deep Networks: Geometry of Signal Spaces

Statistical neurodynamics studies the macroscopic behavior of randomly connected neural networks. We consider a deep layered feedforward network in which input signals are processed layer by layer. Provided the number of neurons in a layer exceeds the number of inputs, the manifold of input signals is embedded as a curved submanifold in the higher-dimensional manifold of the next layer. We derive geometrical features of the embedded manifold, proving that it enlarges or shrinks locally isotropically, so that the embedding is always conformal. We then study the curvature of the embedded manifold: the scalar curvature either converges to a constant or diverges slowly to infinity. The distance between two signals also changes from layer to layer, eventually converging to a stable fixed value as both the number of neurons per layer and the number of layers tend to infinity. This raises a paradox: a curve in the input space is mapped to a continuous curve of fractal nature, yet our theory suggests that the curve eventually converges to a discrete set of equally spaced points. In reality, the numbers of neurons and layers are finite, so finite-size effects are expected to account for the discrepancy between theory and reality. Further study of these discrepancies is needed to understand their implications for information processing.
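
The layer-to-layer distance dynamics described above can be probed numerically. The following is a minimal sketch, not the paper's code: it propagates two input signals through the same deep random tanh network and records the per-neuron distance after each layer. The width n = 1000, depth 50, the tanh activation, and the gains sigma_w = 1.5 and sigma_b = 0.1 are illustrative assumptions, chosen to place the network in the regime where the distance settles at a nonzero fixed value.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000        # neurons per layer (theory assumes n -> infinity)
depth = 50      # number of layers (theory assumes depth -> infinity)
sigma_w = 1.5   # weight scale; for tanh, values above ~1 give distance expansion
sigma_b = 0.1   # bias scale

# Two distinct input signals, normalized to the sphere of radius sqrt(n)
x_a = rng.standard_normal(n)
x_b = rng.standard_normal(n)
x_a *= np.sqrt(n) / np.linalg.norm(x_a)
x_b *= np.sqrt(n) / np.linalg.norm(x_b)

# Track the per-neuron distance between the two signals across layers
distances = [np.linalg.norm(x_a - x_b) / np.sqrt(n)]
for _ in range(depth):
    # Fresh i.i.d. Gaussian weights with variance sigma_w^2 / n per layer;
    # both signals are passed through the SAME random weights and biases.
    W = rng.standard_normal((n, n)) * sigma_w / np.sqrt(n)
    b = rng.standard_normal(n) * sigma_b
    x_a = np.tanh(W @ x_a + b)
    x_b = np.tanh(W @ x_b + b)
    distances.append(np.linalg.norm(x_a - x_b) / np.sqrt(n))

print("first layers:", np.round(distances[:4], 4))
print("last layers: ", np.round(distances[-3:], 4))
# In this regime the per-neuron distance approaches a stable fixed value,
# consistent with the convergence claim in the abstract.
```

Lowering sigma_w below the critical value (around 1 for tanh with small bias) makes the same script show the distance contracting toward zero instead, which is one way to explore the ordered and chaotic regimes on which the fixed-point behavior depends.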
