Publication | Closed Access
Multilayer perceptron and neural networks
Year: 2009 · Citations: 590 · References: 2
Topics: Artificial Intelligence, Incremental Learning, Cognitive Science, Engineering, Machine Learning, Linearly Inseparable Problems, Computer Engineering, Neuronal Network, Generalized Delta Algorithm, Neural Architecture Search, Computer Science, Robot Learning, Multilayer Perceptron, Deep Learning, Learning Control, Brain-like Computing, Recurrent Neural Network
Attempts to solve linearly inseparable problems have led to variations in the number of neuron layers and the activation functions used. The backpropagation algorithm is the best-known and most widely used supervised learning algorithm. Also called the generalized delta algorithm, because it extends the training method of the Adaline network, it minimizes the difference between the desired output and the actual output by gradient descent (the gradient indicates how a function varies in different directions). Training a multilayer perceptron is often quite slow, requiring thousands or tens of thousands of epochs for complex problems. The best-known methods to accelerate learning are the momentum method and a variable learning rate. The paper presents the possibility of controlling induction drives using neural systems.
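The ideas in the abstract can be illustrated with a minimal sketch (not the paper's implementation): a two-layer perceptron trained by backpropagation with momentum on XOR, the classic linearly inseparable problem. The network size, learning rate, and momentum coefficient below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset: not separable by a single linear unit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 sigmoid units, one sigmoid output unit.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

lr, beta = 0.5, 0.9  # illustrative learning rate and momentum coefficient
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

losses = []
for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(((out - y) ** 2).mean())

    # Backward pass: error deltas of the generalized delta rule,
    # i.e. the gradient of the squared error through the sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Momentum update: the velocity accumulates past gradients,
    # smoothing and accelerating plain gradient descent.
    vW2 = beta * vW2 - lr * (h.T @ d_out); W2 += vW2
    vb2 = beta * vb2 - lr * d_out.sum(0);  b2 += vb2
    vW1 = beta * vW1 - lr * (X.T @ d_h);   W1 += vW1
    vb1 = beta * vb1 - lr * d_h.sum(0);    b1 += vb1

pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print("initial loss:", losses[0], "final loss:", losses[-1])
print("predictions:", pred.ravel())
```

Replacing the fixed `lr` with a schedule that decays over epochs would sketch the variable-learning-rate acceleration the abstract mentions.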