Publication | Open Access
An Equivalence Between Sparse Approximation and Support Vector Machines
Year: 1998 | Citations: 505 | References: 31
Keywords: SVM Technique, Support Vector Machine, Sparse Representation, Engineering, Machine Learning, Data Science, Pattern Recognition, Compressive Sensing, Signal Reconstruction, Atomic Decomposition, Inverse Problems, Computer Science, Support Vector Machines, Approximation Theory, Signal Processing, Kernel Method, Basis Pursuit Denoising, Basis Functions
This article shows a relationship between two different approximation techniques: the support vector machine (SVM), proposed by V. Vapnik (1995), and a sparse approximation scheme that resembles the basis pursuit denoising algorithm (Chen, 1995; Chen, Donoho, and Saunders, 1995). SVM is a technique that can be derived from the structural risk minimization principle (Vapnik, 1982) and can be used to estimate the parameters of several different approximation schemes, including radial basis functions, algebraic and trigonometric polynomials, B-splines, and some forms of multilayer perceptrons. Basis pursuit denoising is a sparse approximation technique in which a function is reconstructed using a small number of basis functions chosen from a large set (the dictionary). We show that if the data are noiseless, the modified version of basis pursuit denoising proposed in this article is equivalent to SVM in the following sense: applied to the same data set, the two techniques give the same solution, obtained by solving the same quadratic programming problem. In the appendix, we present a derivation of the SVM technique within the framework of regularization theory, rather than statistical learning theory, establishing a connection between SVM, sparse approximation, and regularization theory.
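To make the sparse-approximation side of this equivalence concrete, the following is a minimal, illustrative sketch (not the article's derivation or notation): basis pursuit denoising solved by iterative soft-thresholding, with a dictionary of Gaussian kernel functions centred at the data points, which is the setting in which the article relates sparse approximation to SVM regression. All function names and parameter values below are assumptions chosen for illustration.

```python
import numpy as np

def gaussian_dictionary(x, centers, sigma=0.5):
    """Dictionary matrix Phi[i, j] = exp(-(x_i - c_j)^2 / (2 sigma^2))."""
    d = x[:, None] - centers[None, :]
    return np.exp(-(d ** 2) / (2.0 * sigma ** 2))

def soft_threshold(v, t):
    """Proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def basis_pursuit_denoise(Phi, y, lam, n_iter=2000):
    """Minimise 0.5 * ||y - Phi c||^2 + lam * ||c||_1 by
    iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(Phi, 2) ** 2   # Lipschitz constant of the smooth term
    c = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ c - y)
        c = soft_threshold(c - grad / L, lam / L)
    return c

# Illustrative data: noiseless samples of a target function.
x = np.linspace(-3.0, 3.0, 40)
y = np.sinc(x)
Phi = gaussian_dictionary(x, x)     # one basis function per data point
c = basis_pursuit_denoise(Phi, y, lam=0.3)

# The L1 penalty drives many coefficients to exactly zero; the surviving
# atoms play a role analogous to the support vectors in SVM regression.
nonzero = int(np.sum(np.abs(c) > 1e-8))
print(f"{nonzero} of {c.size} coefficients are nonzero")
```

The sparsity pattern of the coefficient vector is the point of contact with SVM: in the noiseless setting the article studies, the data points whose coefficients survive correspond to the support vectors of the equivalent SVM solution.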