Publication | Closed Access
Fast and scalable polynomial kernels via explicit feature maps
Citations: 317
References: 29
Year: 2013
Venue: KDD 2013
Topics: Geometric Learning, Engineering, Machine Learning, Tensor Sketching, Robust Feature, Data Science, Data Mining, Pattern Recognition, Random Mapping, Explicit Feature Maps, Non-linear Kernels, Approximation Theory, Supervised Learning, Machine Vision, Feature Learning, Knowledge Discovery, Computer Science, Deep Learning, Random Feature Mapping, Reproducing Kernel Method, Kernel Method
Approximation of non-linear kernels using random feature mapping has been successfully employed in large-scale data analysis applications, accelerating the training of kernel machines. While previous random feature mappings run in $O(ndD)$ time for $n$ training samples in $d$-dimensional space and $D$ random feature maps, we propose a novel randomized tensor product technique, called Tensor Sketching, for approximating any polynomial kernel in $O(n(d + D \log D))$ time. Also, we introduce both absolute and relative error bounds for our approximation to guarantee the reliability of our estimation algorithm. Empirically, Tensor Sketching achieves higher accuracy and often runs orders of magnitude faster than the state-of-the-art approach for large-scale real-world datasets.
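The core idea in the abstract, sketching each factor of the degree-$p$ tensor product with an independent Count Sketch and combining the sketches by FFT-based circular convolution, can be illustrated with a minimal NumPy sketch. This is our own illustrative reconstruction under standard Tensor Sketch conventions, not the authors' code; the function name and hash-table layout are assumptions.

```python
import numpy as np

def tensor_sketch(X, D, p, rng):
    """Illustrative Tensor Sketch for the polynomial kernel (x . y)^p.

    X: (n, d) data matrix; D: sketch dimension; p: polynomial degree.
    Returns Z of shape (n, D) with E[Z[i] @ Z[j]] = (X[i] @ X[j])**p.
    """
    n, d = X.shape
    # p independent Count Sketch hash pairs: bucket index h and random sign s.
    h = rng.integers(0, D, size=(p, d))
    s = rng.choice([-1.0, 1.0], size=(p, d))
    prod = np.ones((n, D), dtype=complex)
    for k in range(p):
        # Count Sketch of every row under the k-th hash pair.
        cs = np.zeros((n, D))
        for j in range(d):
            cs[:, h[k, j]] += s[k, j] * X[:, j]
        # Circular convolution of the p sketches equals the
        # componentwise product of their FFTs (the O(D log D) step).
        prod *= np.fft.fft(cs, axis=1)
    return np.real(np.fft.ifft(prod, axis=1))
```

Since the estimator is unbiased, averaging the inner products of a few independent sketches should land close to the exact kernel value $(x \cdot y)^p$, with accuracy improving as $D$ grows.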