
Publication | Closed Access

A new error criterion for posterior probability estimation with neural nets

Citations: 81
References: 5
Year: 1990

Abstract

The authors introduce a training error criterion that improves the performance of neural nets as posterior probability estimators, as compared to least squares. The proposed criterion is similar to the Kullback-Leibler information measure and is simple to use. A straightforward iterative algorithm for minimizing the error criterion, which has been shown to have good convergence properties, is described. The authors applied the proposed technique to some classification examples and showed that it produces better posterior probability estimates than least squares, especially for low probabilities.
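The contrast the abstract draws can be illustrated on a toy problem. The sketch below, a loose illustration and not the paper's algorithm, trains a single sigmoid unit as a posterior estimator under (a) a KL-style (cross-entropy) criterion and (b) least squares; the data, model, and learning rate are assumptions for illustration only.

```python
import numpy as np

# Illustrative assumption: 1-D inputs whose true posterior P(class=1 | x)
# is a steep sigmoid; labels are sampled from that posterior.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=200)
p_true = 1.0 / (1.0 + np.exp(-3.0 * x))       # true posterior P(class=1 | x)
y = (rng.random(200) < p_true).astype(float)  # sampled class labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(loss, steps=2000, lr=0.5):
    """Gradient descent on a single sigmoid unit p = sigmoid(w*x + b)."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = sigmoid(w * x + b)
        if loss == "kl":
            # Cross-entropy / KL-style criterion: the sigmoid derivative
            # cancels, so the pre-activation gradient is simply (p - y).
            gz = p - y
        else:
            # Least squares: the gradient keeps the p*(1-p) factor, which
            # shrinks toward zero at extreme probabilities and slows
            # learning exactly where low probabilities must be estimated.
            gz = (p - y) * p * (1.0 - p)
        w -= lr * np.mean(gz * x)
        b -= lr * np.mean(gz)
    return w, b

w_kl, b_kl = train("kl")
w_ls, b_ls = train("ls")

# Mean absolute error of each estimated posterior against the true one.
err_kl = float(np.mean(np.abs(sigmoid(w_kl * x + b_kl) - p_true)))
err_ls = float(np.mean(np.abs(sigmoid(w_ls * x + b_ls) - p_true)))
print(err_kl, err_ls)
```

The `p * (1 - p)` factor in the least-squares gradient is one intuition for the abstract's observation that the KL-like criterion helps most for low probabilities: near 0 or 1 that factor vanishes, while the cross-entropy gradient does not.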
