Regularization of Neural Networks using DropConnect
2013 · 1.9K Citations · 10 References
Dropout randomly zeroes activations in each layer during training. This paper proposes DropConnect, a generalization of Dropout for regularizing large fully-connected layers in neural networks. DropConnect instead zeroes randomly selected weights, so each unit receives input from a random subset of units in the previous layer. The authors derive a generalization bound covering both Dropout and DropConnect, and DropConnect achieves state-of-the-art performance on several image-recognition benchmarks, outperforming Dropout when multiple trained models are aggregated.
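In the paper's notation (input $v$, weight matrix $W$, activation $a(\cdot)$, Bernoulli($p$) masks), the two schemes differ only in where the mask is applied:

$$r = m \star a(Wv) \ \ \text{(Dropout)}, \qquad r = a\big((M \star W)\,v\big) \ \ \text{(DropConnect)},$$

where $m$ is a binary mask over output activations and $M$ is a binary mask over individual weights.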
We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations are set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.
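As a concrete illustration, here is a minimal NumPy sketch of one training-time forward pass under each scheme. The function names, the ReLU activation, and the default keep probability are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dropconnect_forward(v, W, p=0.5, rng=None):
    """Training-time DropConnect pass: mask individual weights.

    v: (d_in,) input, W: (d_out, d_in) weights, p: keep probability.
    """
    rng = rng or np.random.default_rng()
    M = rng.random(W.shape) < p          # Bernoulli(p) mask, one bit per weight
    return np.maximum((M * W) @ v, 0.0)  # each unit sees a random subset of inputs

def dropout_forward(v, W, p=0.5, rng=None):
    """Training-time Dropout pass, for contrast: mask whole activations."""
    rng = rng or np.random.default_rng()
    m = rng.random(W.shape[0]) < p       # Bernoulli(p) mask, one bit per unit
    return m * np.maximum(W @ v, 0.0)
```

Note that this sketch covers training only: at test time the paper approximates the expectation over masks with a Gaussian moment-matching procedure rather than a simple rescaling, and that inference step is omitted here.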