Regularization of Neural Networks Using DropConnect (Video + Paper)

UCF Computer Vision Guest Speaker 2013
Guest Speaker: Dr. Rob Fergus
Instructor: Dr. Mubarak Shah (http://vision.eecs.ucf.edu/faculty/shah)

(Source: YouTube | UCF CRCV)

 

Regularization of Neural Networks using DropConnect

AUTHORS

Li Wan (wanli@cs.nyu.edu)
Matthew Zeiler (zeiler@cs.nyu.edu)
Sixin Zhang (zsx@cs.nyu.edu)
Yann LeCun (yann@cs.nyu.edu)
Rob Fergus (fergus@cs.nyu.edu)
Dept. of Computer Science, Courant Institute of Mathematical Sciences, New York University

ABSTRACT

We introduce DropConnect, a generalization of Dropout (Hinton et al., 2012), for regularizing large fully-connected layers within neural networks. When training with Dropout, a randomly selected subset of activations is set to zero within each layer. DropConnect instead sets a randomly selected subset of weights within the network to zero. Each unit thus receives input from a random subset of units in the previous layer. We derive a bound on the generalization performance of both Dropout and DropConnect. We then evaluate DropConnect on a range of datasets, comparing it to Dropout, and show state-of-the-art results on several image recognition benchmarks by aggregating multiple DropConnect-trained models.
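To make the distinction in the abstract concrete, here is a minimal NumPy sketch (not the authors' code) contrasting the two forward passes: Dropout masks output activations, while DropConnect masks the weight matrix itself. The 1/p rescaling is the common "inverted" convention assumed here for simplicity; the paper's actual inference procedure instead uses a Gaussian moment-matching approximation, and it draws a fresh weight mask per training example rather than one per batch.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_layer(x, W, b, p=0.5):
    """Dropout: zero a random subset of output activations.

    Each activation is kept with probability p and rescaled by 1/p
    so that its expected value matches the unmasked layer.
    """
    a = np.maximum(0.0, x @ W + b)      # ReLU activations
    mask = rng.random(a.shape) < p      # per-activation Bernoulli mask
    return a * mask / p

def dropconnect_layer(x, W, b, p=0.5):
    """DropConnect: zero a random subset of weights instead.

    Each output unit then receives input from a random subset of
    units in the previous layer, as described in the abstract.
    NOTE: a single weight mask is shared across the batch here for
    simplicity; the paper samples a mask per example.
    """
    mask = rng.random(W.shape) < p      # per-weight Bernoulli mask
    return np.maximum(0.0, x @ (W * mask / p) + b)

# Toy usage: a batch of 4 inputs through a 10 -> 5 fully connected layer.
x = rng.standard_normal((4, 10))
W = rng.standard_normal((10, 5)) * 0.1
b = np.zeros(5)
print(dropout_layer(x, W, b).shape)      # (4, 5)
print(dropconnect_layer(x, W, b).shape)  # (4, 5)
```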

FULL TEXT

(Source: New York University | Computer Science)
