Publication | Open Access
Towards Understanding the Role of Over-Parametrization in Generalization of Neural Networks
2018 · 213 citations · 14 references
Keywords: Artificial Intelligence, Convolutional Neural Network, Engineering, Machine Learning, Computational Complexity, Sparse Neural Network, Neural Scaling Law, Computational Learning Theory, Lower Bound, Computer Engineering, Computer Science, Neural Networks, Deep Learning, Neural Architecture Search, Model Compression, Evolving Neural Network, Computational Neuroscience, Parameter Tuning, Neuronal Network, Tighter Generalization Bound
Despite existing work on ensuring generalization of neural networks in terms of scale-sensitive complexity measures, such as norms, margin, and sharpness, these complexity measures do not explain why neural networks generalize better with over-parametrization. In this work we suggest a novel complexity measure based on unit-wise capacities, resulting in a tighter generalization bound for two-layer ReLU networks. Our capacity bound correlates with the behavior of test error as network size increases, and could potentially explain the improvement in generalization with over-parametrization. We further present a matching lower bound for the Rademacher complexity that improves over previous capacity lower bounds for neural networks.
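To make the idea of a unit-wise capacity measure concrete, the sketch below computes a simple illustrative proxy for a two-layer ReLU network f(x) = vᵀ ReLU(Ux): the sum over hidden units of the input-weight norm times the output-weight magnitude. This is an assumption-laden stand-in for intuition only, not the paper's exact bound; the function name and the 1/√width weight scaling in the comparison are our own choices.

```python
import numpy as np

def unit_wise_capacity(U, v):
    """Illustrative per-unit capacity proxy for f(x) = v^T relu(U @ x):
    sum over hidden units i of ||U_i||_2 * |v_i|.
    (A hedged sketch of the unit-wise idea, not the paper's exact measure.)"""
    row_norms = np.linalg.norm(U, axis=1)  # ||U_i|| for each hidden unit i
    return float(np.sum(row_norms * np.abs(v)))

# Compare a narrow and a wide network. When the wider network's weights
# shrink proportionally (a 1/sqrt(width) scaling, assumed here for
# illustration), the per-unit measure can stay comparable even though the
# raw parameter count grows, hinting at why such measures need not blow up
# with over-parametrization.
rng = np.random.default_rng(0)
d, h_small, h_large = 10, 16, 256
U_s = rng.normal(size=(h_small, d))
v_s = rng.normal(size=h_small)
scale = np.sqrt(h_large / h_small)
U_l = rng.normal(size=(h_large, d)) / scale
v_l = rng.normal(size=h_large) / scale
print("narrow:", unit_wise_capacity(U_s, v_s))
print("wide:  ", unit_wise_capacity(U_l, v_l))
```

The point of the comparison is qualitative: the wide network has 16× as many parameters, yet under the assumed weight scaling its unit-wise sum stays on the same order as the narrow one, whereas a raw parameter count would grow linearly with width.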