Publication | Closed Access
A new error criterion for posterior probability estimation with neural nets
Year: 1990
Venue: Unknown
Citations: 81
References: 5
Keywords: Artificial Intelligence, Bayesian Statistics, Engineering, Machine Learning, New Error Criterion, Bayesian Inference, Error Criterion, Data Science, Uncertainty Quantification, Pattern Recognition, Posterior Probability Estimation, Statistics, Supervised Learning, Computational Learning Theory, Machine Learning Model, Bayesian Network, Computer Science, Statistical Learning Theory, Deep Learning, Least Squares, Neural Nets, Statistical Inference, Classifier System
The authors introduce a training error criterion that improves the performance of neural nets as posterior probability estimators compared with least squares. The proposed criterion is similar to the Kullback-Leibler information measure and is simple to use. A straightforward iterative algorithm for minimizing the criterion, which has been shown to have good convergence properties, is described. The authors applied the proposed technique to several classification examples and showed that it produces better posterior probability estimates than least squares, especially for low probabilities.
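The abstract does not give the criterion's exact form, only that it resembles the Kullback-Leibler information measure. As an illustrative sketch, the snippet below uses cross-entropy, the standard KL-like loss for posterior estimation, as a stand-in, and contrasts it with least squares on a low-probability target; the function names and the example values are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def least_squares_loss(y, p):
    """Mean squared error between target posteriors y and network outputs p."""
    return np.mean((y - p) ** 2)

def kl_like_loss(y, p, eps=1e-12):
    """Cross-entropy, a KL-like criterion; outputs are clipped to avoid log(0)."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# For a rare event (true posterior 0.01), an estimate of 0.10 is off by a
# factor of ten, yet its squared error is tiny; the KL-like criterion
# penalizes the relative error, which matches the paper's observation that
# the gain over least squares is largest at low probabilities.
y = np.array([0.01])
p = np.array([0.10])
print(least_squares_loss(y, p))
print(kl_like_loss(y, p))
```

Here least squares reports a near-zero loss for a badly misestimated low probability, while the KL-like criterion assigns it a much larger penalty.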