Publication | Closed Access
Neural Networks and the Bias/Variance Dilemma
Citations: 3.5K
References: 83
Year: 1992
Topics: Artificial Intelligence, Engineering, Machine Learning, AI Foundation, Feedforward Neural Networks, Recurrent Neural Network, Data Science, Pattern Recognition, Sparse Neural Network, Supervised Learning, Cognitive Science, Machine Vision, Computational Learning Theory, Machine Learning Model, Computer Science, Neural Networks, Deep Learning, Medical Image Computing, Nonparametric Inference, Evolving Neural Network, Nonparametric Regression Estimators
Feedforward neural networks trained by error backpropagation are examples of nonparametric regression estimators. We present a tutorial on nonparametric inference and its relation to neural networks, and we use the statistical viewpoint to highlight strengths and weaknesses of neural models. We illustrate the main points with some recognition experiments involving artificial data as well as handwritten numerals. In way of conclusion, we suggest that current-generation feedforward neural networks are largely inadequate for difficult problems in machine perception and machine learning, regardless of parallel-versus-serial hardware or other implementation issues. Furthermore, we suggest that the fundamental challenges in neural modeling are about representation rather than learning per se. This last point is supported by additional experiments with handwritten numerals.
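The bias/variance dilemma named in the title can be illustrated with a small simulation. The sketch below (an assumption for illustration, not code from the paper) fits polynomials of increasing degree to many noisy training sets drawn from the same `sin` target, then estimates squared bias and variance of the resulting regression estimator: low-capacity fits are biased but stable, while high-capacity fits track the data closely but vary wildly from sample to sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # Underlying regression function the estimator tries to recover.
    return np.sin(2 * np.pi * x)

def bias_variance(degree, n_train=30, n_datasets=200, noise_sd=0.3):
    """Estimate squared bias and variance of a degree-`degree` polynomial
    regression at a fixed test grid, averaged over many training sets."""
    x_test = np.linspace(0.0, 1.0, 50)
    preds = np.empty((n_datasets, x_test.size))
    for i in range(n_datasets):
        x = rng.uniform(0.0, 1.0, n_train)
        y = true_f(x) + rng.normal(0.0, noise_sd, n_train)
        coef = np.polyfit(x, y, degree)          # least-squares fit
        preds[i] = np.polyval(coef, x_test)
    mean_pred = preds.mean(axis=0)               # average estimator
    bias_sq = np.mean((mean_pred - true_f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias_sq, variance

for d in (1, 3, 12):
    b, v = bias_variance(d)
    print(f"degree {d:2d}: bias^2 = {b:.3f}, variance = {v:.3f}")
```

Running this shows the trade-off the paper analyzes for nonparametric estimators generally: the degree-1 model has large bias and small variance, while the degree-12 model drives bias toward zero at the cost of much larger variance.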