Correction to Lower Bounds on VC-Dimension of Smoothly Parameterized Function Classes
The earlier article gives lower bounds on the VC-dimension of various smoothly parameterized function classes. The results were proved by showing a relationship between the uniqueness of decision boundaries and the VC-dimension of smoothly parameterized function classes. The proof is incorrect; there is no such relationship under the conditions stated in the article. For the case of neural networks with tanh activation functions, we give an alternative proof of a lower bound for the VC-dimension proportional to the number of parameters, which holds even when the magnitude of the parameters is restricted to be arbitrarily small.
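The key intuition behind a parameter-magnitude-independent lower bound is that thresholding a tanh network's output makes the realized classification scale-invariant: since tanh is strictly increasing and sign-preserving, sign(tanh(εz)) = sign(z) for any ε > 0, so shrinking every weight by a common factor leaves the decision boundary unchanged. The sketch below (an illustrative toy, not the paper's actual construction; the point set, grid of candidate directions, and the bound EPS are all assumptions chosen for the demo) checks that a single thresholded tanh unit shatters three points in general position in R² even when every parameter is restricted to magnitude at most 10⁻³.

```python
import itertools
import math

# Bound on the magnitude of every parameter (weights and bias).
EPS = 1e-3

def tanh_unit(w, b, x):
    """Output of a one-hidden-unit tanh 'network' on input x in R^2."""
    return math.tanh(w[0] * x[0] + w[1] * x[1] + b)

# Three points in general position; a linear threshold in R^2 shatters them.
points = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]

# Candidate parameter directions from a small grid, each rescaled so that
# every parameter has magnitude <= EPS.  Rescaling cannot change the sign
# of w.x + b, hence cannot change the thresholded classification.
candidates = []
for w1, w2, b in itertools.product([-1.0, -0.5, 0.0, 0.5, 1.0], repeat=3):
    scale = max(abs(w1), abs(w2), abs(b), 1.0)
    candidates.append(((EPS * w1 / scale, EPS * w2 / scale), EPS * b / scale))

# A set is shattered if every one of the 2^3 labelings is realized by
# thresholding some candidate network at zero.
shattered = all(
    any(
        all((tanh_unit(w, b, x) > 0) == lab for x, lab in zip(points, labels))
        for w, b in candidates
    )
    for labels in itertools.product([False, True], repeat=len(points))
)

print("all 8 labelings realized with |params| <= 1e-3:", shattered)  # -> True
```

This only illustrates the scale-invariance of thresholded decisions; the article's actual result constructs, for networks with W parameters, a shattered set of size proportional to W under the same arbitrarily-small-magnitude restriction.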
[1] Peter L. Bartlett et al., "Lower bounds on the VC-dimension of smoothly parametrized function classes," COLT '94, 1994.
[2] P. Bartlett et al., "Exponential convergence of a gradient descent algorithm for a class of recurrent neural networks," 38th Midwest Symposium on Circuits and Systems, Proceedings, 1995.
[3] Eduardo D. Sontag et al., "Uniqueness of weights for neural networks," 1993.
[4] Scott Petrack et al., "Lower bound on VC-dimension by local shattering," Neural Computation, 1997.