Publication | Closed Access
First-order transition to perfect generalization in a neural network with binary synapses
113 Citations · 12 References · Year: 1990
Keywords: Engineering, Machine Learning, Neural Network, Binary Synapses, Generalization Error, Social Sciences, Sparse Neural Network, Connectionism, Neural Scaling Law, Perturbation Method, Physics, Reference Perceptron, Computer Science, Neural Architecture Search, Binary Synaptic Parameters, Computational Science, Evolving Neural Network, Computational Neuroscience, Neuronal Network, Neuroscience, First-order Transition, Brain-like Computing
Learning from examples by a perceptron with binary synaptic parameters is studied. The examples are given by a reference (teacher) perceptron. It is shown that as the number of examples increases, the network undergoes a first-order transition, at which it freezes into the state of the reference perceptron. As the transition point is approached from below, the generalization error reaches a minimal positive value; above that point the error is identically zero. The transition is found to occur at $\alpha_{\mathrm{GD}} = 1.245$ examples per coupling [E. Gardner and B. Derrida, J. Phys. A 22, 1983 (1989)].
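The teacher-student setup described in the abstract can be sketched numerically. The toy simulation below is an illustration, not code from the paper: the system size `N`, the load `alpha`, and the brute-force enumeration of all binary students are assumptions chosen for tractability. It draws random examples labeled by a binary teacher, finds every binary student consistent with them, and evaluates each student's generalization error from its overlap with the teacher.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

N = 13                       # number of binary couplings; odd, so +-1 dot products never tie at zero
alpha = 1.5                  # examples per coupling (illustrative value above alpha_GD = 1.245)
P = int(alpha * N)

teacher = rng.choice([-1, 1], size=N)      # hypothetical reference (teacher) perceptron
X = rng.choice([-1, 1], size=(P, N))       # random +-1 example inputs
y = np.sign(X @ teacher)                   # labels supplied by the teacher

# Exhaustive search over all 2^N binary students (feasible only at this tiny N):
# keep every weight vector that reproduces the teacher's labels on all P examples.
students = np.array(list(product([-1, 1], repeat=N)))
consistent = students[(np.sign(students @ X.T) == y).all(axis=1)]

# Generalization error of a binary student: eps = arccos(R) / pi,
# where R = (w . teacher) / N is the student-teacher overlap.
overlaps = consistent @ teacher / N
eps = np.arccos(np.clip(overlaps, -1.0, 1.0)) / np.pi

print(f"{len(consistent)} consistent students, mean generalization error {eps.mean():.3f}")
```

At loads well above the transition, the surviving set of consistent students shrinks toward the teacher alone (zero error), while at small `alpha` many imperfect students remain, which is the finite-size analogue of the first-order freezing described in the abstract.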