Publication | Closed Access
Global optimization for neural network training
Citations: 176
References: 7
Year: 1996
Keywords: Artificial Intelligence, Model Optimization, Large-scale Global Optimization, Engineering, Machine Learning, Data Science, Machine Learning Model, Neural Network, Large Scale Optimization, Computer Science, Nonlinear Optimization, External Lead, Neural Architecture Search, Neural Network Training
We propose a novel global minimization method, called NOVEL (Nonlinear Optimization via External Lead), and demonstrate its performance on neural network learning problems. The goal is improved learning of application problems, achieving either smaller networks or less error-prone networks of the same size. The training method combines global and local searches to find a good local minimum. In benchmark comparisons against the best global optimization algorithms, NOVEL demonstrates superior performance.
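The abstract's central idea of combining a global search with local refinement can be illustrated with a minimal multi-start sketch: sample starting points across the whole search range (global phase), run gradient descent from each (local phase), and keep the best local minimum found. This is a generic illustration of the global-plus-local strategy, not NOVEL's trace-based external-lead method; the objective function, step sizes, and function names below are hypothetical choices for the example.

```python
import math
import random

def f(x):
    """Illustrative 1-D multimodal objective with global minimum f(0) = 0."""
    return x * x + 10 * math.sin(x) ** 2

def grad(x):
    """Analytic derivative of f: d/dx [x^2 + 10 sin^2 x] = 2x + 10 sin(2x)."""
    return 2 * x + 10 * math.sin(2 * x)

def local_descent(x, lr=0.04, steps=300):
    """Local search component: plain gradient descent from one start point."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def global_plus_local(bounds=(-5.0, 5.0), n_starts=20, seed=0):
    """Global phase samples start points over the whole range; the local
    phase refines each one; the best local minimum found is returned."""
    rng = random.Random(seed)
    best_x, best_f = None, math.inf
    for _ in range(n_starts):
        x = local_descent(rng.uniform(*bounds))
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

best_x, best_f = global_plus_local()
```

Pure local descent from a random start often stalls in one of the side basins near x = ±2.8; sampling many starts globally is what recovers the basin of the global minimum at x = 0.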