Publication | Closed Access
Robust Transfer Metric Learning for Image Classification
Citations: 174 · References: 41 · Year: 2016
Keywords: Few-shot Learning, Engineering, Machine Learning, Metric Learning, Conventional Metric Learning, Robust Feature, Image Classification, Image Analysis, Data Science, Pattern Recognition, Semi-supervised Learning, Supervised Learning, Machine Vision, Knowledge Transfer, Knowledge Discovery, Feature Transformation, Computer Science, Deep Learning, Computer Vision, Domain Adaptation, Transfer Learning
Metric learning has attracted increasing attention due to its critical role in image analysis and classification. Conventional metric learning typically assumes that the training and test data are sampled from the same or similar distributions. Moreover, building an effective distance metric requires abundant supervised knowledge (i.e., side/label information), which is often inaccessible in practice because of the expensive labeling cost. In this paper, we develop a robust transfer metric learning (RTML) framework that effectively assists unlabeled target learning by transferring knowledge from a well-labeled source domain. Specifically, RTML exploits knowledge transfer to mitigate the domain shift in two directions, i.e., the sample space and the feature space. In the sample space, domain-wise and class-wise adaptation schemes are adopted to bridge the marginal and conditional distribution disparities across the two domains. In the feature space, our metric is built in a marginalized denoising fashion under a low-rank constraint, which makes it more robust to noisy real-world data. Furthermore, we design an explicit rank-constraint regularizer to replace the NP-hard rank minimization problem and guide the low-rank metric learning. Experimental results on several standard benchmarks demonstrate the effectiveness of our proposed RTML in comparison with state-of-the-art transfer learning and metric learning algorithms.
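The abstract's key computational idea is to sidestep NP-hard rank minimization with an explicit, computable rank-constraint surrogate. A common surrogate of this kind penalizes the sum of the smallest singular values of the metric matrix, which vanishes exactly when the rank constraint is met. The sketch below is an illustration of that general technique, not the paper's actual formulation; the function names and the target rank `r` are hypothetical.

```python
import numpy as np

def low_rank_regularizer(M, r):
    """Illustrative rank-constraint surrogate (not necessarily RTML's):
    the sum of the (d - r) smallest singular values of M.
    It equals zero iff rank(M) <= r, and is cheap to evaluate,
    unlike minimizing rank(M) directly, which is NP-hard."""
    s = np.linalg.svd(M, compute_uv=False)  # sorted in descending order
    return float(s[r:].sum())

def mahalanobis_sq(x, y, M):
    """Squared distance under the learned metric M (PSD)."""
    d = x - y
    return float(d @ M @ d)

# Toy check: a rank-2 PSD matrix incurs (numerically) zero penalty for r = 2,
# while a full-rank identity matrix is penalized.
rng = np.random.default_rng(0)
L = rng.standard_normal((5, 2))
M = L @ L.T                                   # rank-2 PSD metric
print(round(low_rank_regularizer(M, 2), 6))   # -> 0.0
print(low_rank_regularizer(np.eye(5), 2))     # -> 3.0
```

In a full learner, a weighted version of this term would be added to the empirical metric-learning loss so that gradient-based optimization drives the metric toward the desired low rank.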