Concepedia

TLDR

Restricted Boltzmann machines were originally built with binary stochastic hidden units. The authors generalize each binary unit to an infinite set of copies with identical weights but progressively more negative biases; the learning and inference rules for these Stepped Sigmoid Units remain unchanged, and the units can be efficiently approximated by noisy rectified linear units. These rectified linear units outperform binary units, learning features that improve object recognition on NORB and face verification on Labeled Faces in the Wild, and they preserve relative intensity information across multiple layers of feature detectors.

Abstract

Restricted Boltzmann machines were developed using binary stochastic hidden units. These can be generalized by replacing each binary unit by an infinite number of copies that all have the same weights but have progressively more negative biases. The learning and inference rules for these Stepped Sigmoid Units are unchanged. They can be approximated efficiently by noisy, rectified linear units. Compared with binary units, these units learn features that are better for object recognition on the NORB dataset and face verification on the Labeled Faces in the Wild dataset. Unlike binary units, rectified linear units preserve information about relative intensities as information travels through multiple layers of feature detectors.
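The generalization described above can be sketched numerically: the expected total activity of the infinite set of binary copies, whose biases are shifted by -0.5, -1.5, -2.5, ..., is closely approximated by the softplus function log(1 + e^x), and a stochastic sample from it by a rectified Gaussian whose noise variance is sigmoid(x). A minimal NumPy sketch, with illustrative function names not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def stepped_sigmoid_mean(x, n_copies=1000):
    """Expected activity of n binary copies sharing weights, with biases
    shifted by -0.5, -1.5, -2.5, ... (truncating the infinite sum)."""
    offsets = np.arange(n_copies) + 0.5
    return sigmoid(x - offsets).sum()

def softplus(x):
    """Smooth approximation to the stepped-sigmoid sum."""
    return np.log1p(np.exp(x))

def noisy_relu_sample(x, rng):
    """Approximate sample from the unit: rectify Gaussian noise whose
    variance equals sigmoid(x)."""
    return np.maximum(0.0, x + rng.normal(0.0, np.sqrt(sigmoid(x))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = 2.0
    print(stepped_sigmoid_mean(x))   # close to softplus(x)
    print(softplus(x))
    print(noisy_relu_sample(x, rng))
```

Because the summed copies share one set of weights, this approximation adds no new parameters; only the hidden units' activation (and its sampling noise) changes relative to a binary RBM.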
