
Publication | Closed Access

First-order transition to perfect generalization in a neural network with binary synapses

Year: 1990
Citations: 113
References: 12

Abstract

Learning from examples by a perceptron with binary synaptic parameters is studied. The examples are generated by a reference (teacher) perceptron. It is shown that as the number of examples increases, the network undergoes a first-order transition, where it freezes into the state of the reference perceptron. As the transition point is approached from below, the generalization error tends to a minimal positive value, while above that point the error is identically zero. The transition occurs at $\alpha_{\mathrm{GD}}=1.245$ examples per coupling [E. Gardner and B. Derrida, J. Phys. A 22, 1983 (1989)].
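The freezing scenario described in the abstract can be illustrated with a brute-force sketch (an illustration only, not the paper's replica calculation): for a small odd $N$ we enumerate every binary-weight student consistent with a teacher's labels on $P=\alpha N$ random examples, and average the generalization error over the surviving students using the standard overlap formula $\epsilon=\arccos(R)/\pi$. All function names and parameter choices below are hypothetical.

```python
import itertools
import math
import random

def gen_error(R):
    """Generalization error of a student whose overlap with the teacher is
    R = w.w*/N (standard perceptron formula: epsilon = arccos(R)/pi)."""
    return math.acos(max(-1.0, min(1.0, R))) / math.pi

def consistent_students(N, alpha, seed=0):
    """Exhaustively enumerate binary students consistent with P = alpha*N
    teacher-labelled random +-1 examples; keep N small and odd."""
    rng = random.Random(seed)
    teacher = [rng.choice((-1, 1)) for _ in range(N)]
    P = int(alpha * N)
    examples = [[rng.choice((-1, 1)) for _ in range(N)] for _ in range(P)]
    # N odd => a dot product of two +-1 vectors is never zero,
    # so the sign (the label) is always well defined.
    labels = [1 if sum(t * x for t, x in zip(teacher, ex)) > 0 else -1
              for ex in examples]
    survivors = []
    for w in itertools.product((-1, 1), repeat=N):
        if all(lbl * sum(wi * xi for wi, xi in zip(w, ex)) > 0
               for ex, lbl in zip(examples, labels)):
            survivors.append(w)
    overlaps = [sum(wi * ti for wi, ti in zip(w, teacher)) / N
                for w in survivors]
    avg_err = sum(gen_error(R) for R in overlaps) / len(overlaps)
    return len(survivors), avg_err

# Few examples: many consistent students remain, with a finite average error.
few, err_few = consistent_students(N=13, alpha=0.5)
# Well past the transition: the consistent set collapses toward the teacher.
many, err_many = consistent_students(N=13, alpha=2.5)
print(few, err_few, many, err_many)
```

Because the examples for the larger $\alpha$ extend those for the smaller one (same seed), the consistent set can only shrink as $\alpha$ grows; at small $N$ the shrinkage is gradual, and the sharp first-order freezing at $\alpha_{\mathrm{GD}}=1.245$ emerges only in the thermodynamic limit treated in the paper.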

