Publication | Closed Access
A probabilistic approach to the understanding and training of neural network classifiers
Citations: 173
References: 5
Year: 2002
Keywords: Engineering · Machine Learning · Neural Networks (Machine Learning) · Neural Network · Network Analysis · Classification Method · Data Science · Data Mining · Pattern Recognition · Management · Neural Network Classifiers · Supervised Learning · Information Theory · Automatic Classification · Networks · Network Estimation · Knowledge Discovery · Bayesian Network · Intelligent Classification · Computer Science · Neural Networks (Computational Neuroscience) · Posterior Class Probabilities · Statistical Learning Theory · Bayesian Networks · Data Classification · Network Science · Probabilistic Approach · Classification · Classifier System · Maximum Mutual Information
It is shown that training a neural network using a mean-square-error criterion gives network outputs that approximate posterior class probabilities. Based on this probabilistic interpretation of the network's operation, information-theoretic training criteria such as maximum mutual information and the Kullback-Leibler measure are investigated. It is shown that both of these criteria are equivalent to maximum-likelihood estimation (MLE) of the network parameters. MLE of a network in turn allows network models to be compared using the Akaike information criterion and the minimum description length criterion.
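The abstract's first claim can be checked numerically. The sketch below is illustrative only (it is not code from the paper, which is closed access): it trains a single sigmoid unit under a mean-square-error criterion on a synthetic two-Gaussian problem whose Bayes posterior is known in closed form, and compares the trained output against that posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class problem with equal priors:
#   x | class 0 ~ N(-1, 1),  x | class 1 ~ N(+1, 1).
# For this mixture the exact Bayes posterior is P(1 | x) = sigmoid(2x).
n = 5000
y = rng.integers(0, 2, n).astype(float)   # class labels 0/1
x = rng.normal(2.0 * y - 1.0, 1.0)        # class-conditional samples

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A minimal "network" (one sigmoid unit, parameters w and b) trained by
# gradient descent on the mean-square-error criterion mean((p - y)^2).
w, b, lr = 0.0, 0.0, 2.0
for _ in range(4000):
    p = sigmoid(w * x + b)
    grad_z = (p - y) * p * (1.0 - p)      # d(MSE)/dz through the sigmoid
    w -= lr * np.mean(grad_z * x)
    b -= lr * np.mean(grad_z)

# Compare the trained output with the analytic posterior on a grid;
# the gap should be small, i.e. the MSE-trained output approximates
# the posterior class probability.
xs = np.linspace(-3.0, 3.0, 13)
net = sigmoid(w * xs + b)
bayes = sigmoid(2.0 * xs)
print("max |network output - Bayes posterior|:", np.max(np.abs(net - bayes)))
```

The key point the sketch exercises is that the MSE minimizer over functions is the conditional expectation E[y | x], which for 0/1 labels equals P(class 1 | x); the training loop recovers this because the true posterior lies inside the sigmoid model family chosen here.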