Publication | Closed Access
Tighter and convex maximum margin clustering
Citations: 112
References: 16
Year: 2009
Venue: Unknown
The maximum margin principle has been successfully applied to many supervised and semi-supervised problems in machine learning. Recently, this principle was extended to clustering, referred to as Maximum Margin Clustering (MMC), and achieved promising performance in recent studies. To avoid the problem of local minima, MMC can be solved globally via convex semi-definite programming (SDP) relaxation. Although many efficient approaches have been proposed to alleviate the computational burden of SDP, convex MMCs are still not scalable to medium-sized data sets. In this paper, we propose a novel convex optimization method, LG-MMC, which maximizes the margin between opposite clusters via “Label Generation”. It can be shown that LG-MMC is much more scalable than existing convex approaches. Moreover, we show that our convex relaxation is tighter than state-of-the-art convex MMCs. Experiments on seventeen UCI datasets and the MNIST dataset show significant improvement over existing MMC algorithms.
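The MMC objective described above can be made concrete with a small sketch. This is a hypothetical 1-D toy, not the paper's LG-MMC algorithm: in one dimension every linearly separable labeling is a threshold split, so we can enumerate candidate labelings (a miniature version of "label generation") and keep the one whose separating margin is largest, subject to a class-balance constraint that rules out trivial all-one-cluster solutions. The function name `mmc_1d` and the balance parameter are illustrative assumptions.

```python
# Toy illustration of the Maximum Margin Clustering (MMC) objective in 1-D.
# Not the paper's LG-MMC: we brute-force threshold labelings and keep the
# one with the widest separating margin, under a class-balance constraint.

def mmc_1d(points, balance=1):
    """Return (labels, margin) for the balanced split with the largest margin."""
    xs = sorted(points)
    n = len(xs)
    best = None
    for k in range(1, n):  # candidate labeling: first k points -> -1, rest -> +1
        if abs((n - k) - k) > balance:  # enforce near-balanced clusters
            continue
        margin = (xs[k] - xs[k - 1]) / 2.0  # half the gap at the split point
        labels = [-1] * k + [1] * (n - k)
        if best is None or margin > best[1]:
            best = (labels, margin)
    return best

labels, margin = mmc_1d([0.0, 0.2, 2.0, 2.2])
print(labels, margin)  # -> [-1, -1, 1, 1] 0.9
```

The balance constraint plays the same role as in the MMC literature: without it, putting every point in one cluster yields an unbounded margin, so some cluster-size restriction is required for the problem to be well-posed.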