Publication | Closed Access
Instance-Dependent Positive and Unlabeled Learning With Labeling Bias Estimation
34 Citations | 31 References | Year: 2021
Keywords: Artificial Intelligence, Multiple Instance Learning, Engineering, Machine Learning, Adam Optimization Techniques, Text Mining, Natural Language Processing, Data Science, Pattern Recognition, Instance-Dependent PU Algorithms, Semi-supervised Learning, Statistics, Supervised Learning, Instance-based Learning, Computational Learning Theory, Positive Example, Knowledge Discovery, Computer Science, Deep Learning, Labeling Bias Estimation
This paper studies instance-dependent Positive and Unlabeled (PU) classification, where whether a positive example is labeled (indicated by s) depends not only on the class label y but also on the observation x. Consequently, the labeling probability on positive examples is not uniform, as previous works assumed, but is biased toward certain simple or critical data points. To capture this dependency, the paper builds a graphical model, which leads to a maximization problem on the induced likelihood function regarding P(s, y | x). By utilizing the well-known EM and Adam optimization techniques, both the labeling probability of any positive example, P(s=1 | y=1, x), and the classifier induced by P(y | x) can be acquired. Theoretically, we prove that a critical solution always exists and is locally unique for the linear model if certain sufficient conditions are met. Moreover, we upper bound the generalization error for both the linear logistic and the non-linear network instantiations of our algorithm. Empirically, we compare our method with state-of-the-art instance-independent and instance-dependent PU algorithms on a wide range of synthetic, benchmark, and real-world datasets, and the experimental results firmly demonstrate the advantage of the proposed method over existing PU approaches.
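The EM scheme described in the abstract can be illustrated with a toy logistic instantiation. Everything below is an illustrative sketch, not the paper's actual implementation: the synthetic data, the linear-logistic forms for P(y=1|x) and the propensity P(s=1|y=1,x), and the plain gradient-ascent M-step (standing in for Adam) are all assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Numerically clipped logistic function.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

# Hypothetical synthetic data: two well-separated Gaussian classes.
n = 2000
y = rng.integers(0, 2, n)
X = rng.normal(0, 1, (n, 2)) + np.where(y[:, None] == 1, 2.0, -2.0)
X = np.hstack([X, np.ones((n, 1))])  # append a bias column

# Instance-dependent labeling bias: positives with larger first
# feature are more likely to be labeled (s = 1).
e_true = sigmoid(1.5 * X[:, 0] - 1.0)
s = (y == 1) & (rng.random(n) < e_true)

w = np.zeros(3)  # classifier:   P(y=1 | x)      = sigmoid(X @ w)
v = np.zeros(3)  # propensity:   P(s=1 | y=1, x) = sigmoid(X @ v)
lr = 0.05
for _ in range(300):
    p = sigmoid(X @ w)
    e = sigmoid(X @ v)
    # E-step: posterior P(y=1 | x, s). Labeled examples are surely
    # positive; for unlabeled ones, gamma = p(1-e) / (p(1-e) + 1-p).
    gamma = np.where(s, 1.0, p * (1 - e) / (p * (1 - e) + (1 - p) + 1e-12))
    # M-step: one gradient-ascent step on the expected complete
    # log-likelihood (the paper uses Adam here; plain GD for brevity).
    grad_w = X.T @ (gamma - p) / n
    grad_v = X.T @ np.where(s, 1 - e, -gamma * e) / n
    w += lr * grad_w
    v += lr * grad_v

acc = np.mean((sigmoid(X @ w) > 0.5) == (y == 1))
print(f"accuracy on the true (hidden) labels: {acc:.3f}")
```

On this easy synthetic problem the recovered classifier separates the two classes well even though only biased positive labels are observed; the learned v additionally yields per-example estimates of the labeling propensity.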