Publication | Closed Access
The computational intractability of training sigmoidal neural networks
Year: 1997 | Citations: 39 | References: 9
Keywords: Mathematical Programming, Artificial Intelligence, Target Function, Engineering, Machine Learning, Neural Network, Interpolation Training Problem, Nonlinear Optimization, Nonlinear Programming, Approximation Theory, Computational Intractability, Computational Learning Theory, Large Scale Optimization, Computer Science, Model Optimization, Evolving Neural Network, Computational Neuroscience, Neuronal Network, Brain-like Computing
We demonstrate that the problem of approximately interpolating a target function by a neural network is computationally intractable. In particular, the interpolation training problem for a neural network with two monotone Lipschitzian sigmoidal internal activation functions and one linear output node is shown to be NP-hard, and NP-complete if the internal nodes are, in addition, piecewise ratios of polynomials. This partially answers a question of Blum and Rivest (1992) concerning the NP-completeness of training a logistic sigmoidal 3-node network. An extension of the result is then given for networks with n monotone sigmoidal internal nodes and one convex output node. This indicates that many multivariate nonlinear regression problems may be computationally infeasible.
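As an illustrative sketch (not taken from the paper), the 3-node architecture at issue, two monotone sigmoidal internal nodes feeding one linear output node, can be written out as below; the logistic activation and all parameter names here are assumptions chosen for concreteness:

```python
import math

def logistic(t):
    # A monotone Lipschitzian sigmoidal activation (illustrative choice).
    return 1.0 / (1.0 + math.exp(-t))

def three_node_net(x, a1, b1, a2, b2, w1, w2, w0=0.0):
    """Two sigmoidal internal nodes and one linear output node.

    x: input vector; a1, a2: internal weight vectors; b1, b2: biases;
    w0, w1, w2: linear output parameters. The interpolation training
    problem -- choosing all these parameters so the network
    approximately matches given target values on given inputs -- is
    the problem shown to be NP-hard.
    """
    h1 = logistic(sum(ai * xi for ai, xi in zip(a1, x)) + b1)
    h2 = logistic(sum(ai * xi for ai, xi in zip(a2, x)) + b2)
    return w0 + w1 * h1 + w2 * h2

# Example evaluation on a 2-dimensional input (arbitrary parameters).
y = three_node_net([1.0, -2.0], a1=[0.5, 0.3], b1=0.1,
                   a2=[-0.2, 0.4], b2=0.0, w1=1.0, w2=-1.0)
```

Note that the hardness result concerns fitting the parameters to data, not evaluating the network, which is trivial as shown.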