Concepedia

Publication | Closed Access

Deterministic Convergence of an Online Gradient Method for BP Neural Networks

Citations: 157
References: 21
Year: 2005

Abstract

Online gradient methods are widely used for training feedforward neural networks. In this paper we prove a convergence theorem for an online gradient method with variable step size for back-propagation (BP) neural networks with a hidden layer. Unlike most existing convergence results, which are probabilistic and nonmonotone in nature, the convergence result established here is deterministic and monotone.
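The abstract does not spell out the training procedure itself. As a rough illustration only, the sketch below shows an online (per-sample) gradient method on a one-hidden-layer sigmoid network with a diminishing step size. The network shape, the squared-error loss, and the schedule eta_n = eta0 / (1 + n) are illustrative assumptions, not the conditions or algorithm stated in the paper.

```python
# Illustrative sketch (not the paper's exact algorithm or step-size
# conditions): online gradient descent on a one-hidden-layer sigmoid
# network, updating the weights after every single training sample.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_online(X, y, n_hidden=8, eta0=0.5, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    V = rng.normal(scale=0.1, size=(n_hidden, n_in))  # input -> hidden weights
    w = rng.normal(scale=0.1, size=n_hidden)          # hidden -> output weights
    step = 0
    for _ in range(epochs):
        for x, t in zip(X, y):
            step += 1
            # Variable (diminishing) step size; an assumed schedule, chosen
            # only to illustrate a step size that changes across updates.
            eta = eta0 / (1.0 + step)
            h = sigmoid(V @ x)           # hidden-layer activations
            out = sigmoid(w @ h)         # network output
            # Gradient of the per-sample squared error 0.5 * (out - t)^2.
            delta_out = (out - t) * out * (1.0 - out)
            grad_w = delta_out * h
            grad_V = np.outer(delta_out * w * h * (1.0 - h), x)
            # Online update: weights change after each sample, not per epoch.
            w -= eta * grad_w
            V -= eta * grad_V
    return V, w
```

A diminishing schedule of this kind is one common way to realize a variable step size; the theorem in the paper imposes its own conditions on the step-size sequence, which this sketch does not reproduce.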
