Publication | Closed Access
Kernel K-Means Sampling for Nyström Approximation
Citations: 63
References: 22
Year: 2018
Engineering · Machine Learning · Nyström-based Kernel · Image Analysis · Data Science · Pattern Recognition · Kernel K-means Sampling · Approximation Theory · Statistics · Low-rank Approximation · Machine Vision · Manifold Learning · Sampling Theory · Computer Science · Dimensionality Reduction · Polynomial Kernel · Computer Vision · Reproducing Kernel Method · Statistical Inference · Matrix Approximation Error · Kernel Method
A fundamental problem in Nyström-based kernel matrix approximation is the sampling method by which the training set is built. In this paper, we propose using kernel k-means sampling, which we show minimizes the upper bound of the matrix approximation error. We first propose a unified kernel matrix approximation framework that describes most existing Nyström approximations under many popular kernels, including the Gaussian kernel and the polynomial kernel. We then show that the matrix approximation error upper bound, in terms of the Frobenius norm, is equal to the k-means error of the data points in kernel space plus a constant. Thus, the k-means centers of the data in kernel space, i.e., the kernel k-means centers, are the optimal representative points with respect to the Frobenius-norm error upper bound. Experimental results with both the Gaussian kernel and the polynomial kernel, on real-world data sets and image segmentation tasks, show the superiority of the proposed method over state-of-the-art methods.
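The abstract's core idea can be illustrated with a minimal NumPy sketch, not the paper's implementation: build a Nyström approximation K ≈ C W⁺ Cᵀ from a set of landmark points, using input-space k-means centers as landmarks (a common proxy for kernel k-means when the kernel is Gaussian). All function names and parameters here (`rbf`, `gamma`, the landmark count) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(A, B, gamma=0.1):
    # Gaussian (RBF) kernel matrix between the rows of A and B
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def kmeans_centers(X, k, iters=50):
    # Plain Lloyd's k-means in input space; for the Gaussian kernel this is
    # a common stand-in for the kernel k-means centers used in the paper.
    C = X[rng.choice(len(X), k, replace=False)]  # fancy indexing -> copy
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                C[j] = X[labels == j].mean(0)
    return C

def nystrom(X, Z, gamma=0.1):
    # Nystrom approximation K ~= C W^+ C^T with landmark set Z
    C = rbf(X, Z, gamma)
    W = rbf(Z, Z, gamma)
    return C @ np.linalg.pinv(W) @ C.T

# Toy comparison: k-means landmarks vs. uniformly sampled landmarks
X = rng.standard_normal((200, 5))
K = rbf(X, X)
err_km = np.linalg.norm(K - nystrom(X, kmeans_centers(X, 20)), "fro")
err_uni = np.linalg.norm(K - nystrom(X, X[rng.choice(200, 20, replace=False)]), "fro")
```

On data like this, k-means landmarks typically give a lower Frobenius error than uniform sampling, which is the behavior the paper's error-bound analysis predicts; with the full data set as landmarks the approximation recovers K exactly.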
| Year | Citations |
|---|---|
| 2000 | 15.5K |
| 1998 | 8K |
| 2002 | 7.9K |
| 1985 | 7.4K |
| 1997 | 908 |
| 2000 | 596 |
| 2004 | 561 |
| 2010 | 478 |
| 2005 | 241 |
| 2010 | 222 |