Publication | Open Access
Generalized Information-Theoretic Criterion for Multi-Label Feature Selection
Citations: 16 | References: 31 | Year: 2019
Keywords: Multiple Instance Learning, Engineering, Machine Learning, Feature Selection, Text Mining, Classification Method, Data Science, Data Mining, Pattern Recognition, Multi-label Feature Selection, Biostatistics, Statistics, Feature Engineering, Predictive Analytics, Knowledge Discovery, Statistical Learning Theory, Feature Construction, Certain Approximation Method, Joint Entropy, Statistical Inference
Multi-label feature selection, which identifies important features in the original feature set of a multi-labeled dataset, has been attracting considerable attention owing to its generality compared with conventional single-label feature selection. Unimportant features are filtered out by scoring the dependency of each feature on the labels. In conventional multi-label feature filter studies, the score function is obtained by approximating a dependency measure such as joint entropy, because direct calculation is often impractical when multiple labels must be estimated from a limited number of training patterns. Although the efficacy of an approximation can differ depending on the characteristics of the multi-label dataset, conventional methods presume a fixed approximation, leading to a degraded feature subset when the presumed approximation is inappropriate for the given dataset. In this study, we propose a strategy for selecting an approximation from a series of approximations depending on the number of involved variables, and we instantiate a score function based on the chosen approximation. The experimental results demonstrate that the proposed strategy and score function outperform conventional multi-label feature selection methods.
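The abstract describes scoring features by approximating an information-theoretic dependency measure (e.g., joint entropy) whose direct estimation over many labels is impractical. As a minimal illustrative sketch (not the paper's actual method), the snippet below estimates empirical joint entropy from discrete data and scores each feature by a low-order approximation: summing mutual-information terms over label subsets of a chosen size `order`, where a larger `order` involves more variables per term. All function names and the toy data are hypothetical.

```python
import numpy as np
from itertools import combinations

def entropy(*cols):
    """Empirical joint entropy H(X1, ..., Xk) of discrete columns (in bits)."""
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def score_feature(f, labels, order=1):
    """Approximate the dependency of feature f on the label set by summing
    mutual-information terms over label subsets of size `order`.
    This is a low-order surrogate for the full joint dependency, which is
    impractical to estimate directly with limited training patterns."""
    score = 0.0
    for subset in combinations(range(labels.shape[1]), order):
        ls = [labels[:, j] for j in subset]
        # I(f; L_subset) = H(f) + H(L_subset) - H(f, L_subset)
        score += entropy(f) + entropy(*ls) - entropy(f, *ls)
    return score

# Toy usage with hypothetical binary data: rank features by dependency score.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(100, 5))   # 100 patterns, 5 discrete features
Y = rng.integers(0, 2, size=(100, 2))   # 2 labels
scores = [score_feature(X[:, i], Y, order=1) for i in range(X.shape[1])]
ranking = np.argsort(scores)[::-1]      # most dependent features first
```

The `order` parameter plays the role the abstract attributes to "the number of involved variables": choosing it per dataset, rather than presuming a fixed approximation, is the adaptive idea the paper advocates.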