Publication | Closed Access
A scalable dual approach to semidefinite metric learning
Citations: 26 | References: 19 | Year: 2011 | Venue: Unknown
Keywords: Mathematical Programming, Engineering, Machine Learning, Scalable Dual Approach, Other SDP Problems, Semidefinite Programming, Functional Analysis, Data Science, Pattern Recognition, Semi-supervised Learning, Supervised Learning, Linear Optimization, Machine Vision, Knowledge Discovery, Computer Science, Dimensionality Reduction, Computer Vision, Convex Optimization, Maximum Variance Unfolding, Distance Metric Learning
Distance metric learning plays an important role in many vision problems. Previous work on quadratic Mahalanobis metric learning usually needs to solve a semidefinite programming (SDP) problem. A standard interior-point SDP solver has a complexity of O(D^6.5) (with D the dimension of the input data) and can only solve problems with up to a few thousand variables. Since the number of variables is D(D + 1)/2, this corresponds to a limit of around D < 100. This high complexity hampers the application of metric learning to high-dimensional problems. In this work, we propose a very efficient approach to this metric learning problem. We formulate a Lagrange dual approach which is much simpler to optimize, allowing us to solve much larger Mahalanobis metric learning problems. Roughly, the proposed approach has a time complexity of O(t · D^3), with t ≈ 20–30 for most problems in our experiments. The proposed algorithm is scalable and easy to implement. Experiments on various datasets show accuracy comparable to the state of the art. We also demonstrate that this idea may be applicable to other SDP problems, such as maximum variance unfolding.
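The O(t · D^3) figure quoted in the abstract is consistent with an iterative dual method whose dominant per-iteration cost is a D × D eigendecomposition, used to project a symmetric matrix onto the positive semidefinite (PSD) cone when recovering the Mahalanobis matrix from the dual variables. The sketch below illustrates only that standard PSD-projection step (the function name `project_psd` is ours, not from the paper), not the paper's full algorithm:

```python
import numpy as np

def project_psd(M):
    """Project a symmetric matrix onto the PSD cone.

    The eigendecomposition here costs O(D^3) for a D x D matrix;
    repeating it over t iterations of a dual method gives the
    O(t * D^3) total complexity cited in the abstract.
    """
    # Symmetrize to guard against numerical asymmetry.
    M = (M + M.T) / 2.0
    eigvals, eigvecs = np.linalg.eigh(M)
    # Discard the negative part of the spectrum.
    eigvals = np.clip(eigvals, 0.0, None)
    return (eigvecs * eigvals) @ eigvecs.T

# Example: project an indefinite matrix (eigenvalues 3 and -1).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
X = project_psd(A)  # -> [[1.5, 1.5], [1.5, 1.5]], which is PSD
```

Compared with interior-point solvers, which must factor a Hessian over all D(D + 1)/2 matrix variables, a first-order dual method that only ever pays this eigendecomposition per iteration scales to much larger D.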