Abstract

The authors investigate the training of a layered perceptron with jittered data. They study the effect of generating additional training data by adding noise to the inputs and show that it introduces convolutional smoothing of the target function. Training with such jittered data is shown, under a small-variance assumption, to be equivalent to Lagrangian regularization with a derivative regularizer. Training with jitter thus allows regularization within the conventional layered perceptron architecture.
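The jittering scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: each gradient step perturbs the training inputs with small Gaussian noise (standard deviation `sigma`, an assumed value), which under the paper's small-variance argument approximates adding a derivative (smoothness) regularizer. The toy regression target and network sizes are likewise illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem: fit y = sin(3x) on [-1, 1].
X = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
y = np.sin(3.0 * X)

# One-hidden-layer perceptron: 1 -> 8 -> 1 with tanh hidden units.
W1 = rng.normal(0.0, 0.5, (1, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1))
b2 = np.zeros(1)

sigma = 0.05   # jitter standard deviation (assumed small)
lr = 0.05      # learning rate (illustrative)

def forward(Xin):
    h = np.tanh(Xin @ W1 + b1)
    return h, h @ W2 + b2

for step in range(3000):
    # Jitter: add fresh Gaussian noise to the inputs at every step.
    Xj = X + sigma * rng.normal(size=X.shape)
    h, out = forward(Xj)
    err = out - y
    # Full-batch backpropagation for mean squared error.
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)   # tanh derivative
    gW1 = Xj.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate on the clean (unjittered) inputs.
_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
print(round(mse, 4))
```

Because the noise is resampled each step, the network never sees the same training set twice; the effect is a smoothed version of the target function rather than memorization of the finite sample.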
