Publication | Closed Access
Sparse approximation using least squares support vector machines
Citations: 314
References: 11
Year: 2002
Unknown Venue
Keywords: Function Estimation, Vapnik, Support Vector Machine, Engineering, Machine Learning, Data Science, Sparse Representation, Pattern Recognition, Support Values, Computer Science, Ridge Regression, Statistical Learning Theory, Kernel Method, Approximation Theory, Low-rank Approximation, Sparse Approximation
In least squares support vector machines (LS-SVMs) for function estimation, Vapnik's ε-insensitive loss function has been replaced by a cost function which corresponds to a form of ridge regression. In this way, nonlinear function estimation is done by solving a linear set of equations instead of solving a quadratic programming problem. The LS-SVM formulation also involves fewer tuning parameters. However, a drawback is that sparseness is lost in the LS-SVM case. In this paper we investigate imposing sparseness by pruning support values from the sorted support value spectrum which results from the solution to the linear system.
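The procedure the abstract describes can be sketched in NumPy: train an LS-SVM by solving its linear system, sort the resulting support values |α_i|, drop the smallest ones, and retrain on the remaining points. This is a minimal illustrative sketch, not the paper's implementation; the RBF kernel, the hyperparameters `gamma` and `sigma`, and the pruning schedule (`drop`, `steps`) are all assumptions chosen for the toy example.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=0.5):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma=100.0, sigma=0.5):
    """Solve the LS-SVM linear system
        [ 0   1^T         ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma ] [ alpha ] = [ y ]
    and return the bias b and the support values alpha."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvm_predict(Xtr, alpha, b, Xte, sigma=0.5):
    """Evaluate the fitted model at the rows of Xte."""
    return rbf_kernel(Xte, Xtr, sigma) @ alpha + b

def prune_lssvm(X, y, drop=0.25, steps=2, gamma=100.0, sigma=0.5):
    """Impose sparseness by iterative pruning: retrain, sort the
    support value spectrum |alpha_i|, and remove the fraction `drop`
    of points with the smallest support values at each step."""
    idx = np.arange(len(y))
    for _ in range(steps):
        b, alpha = lssvm_fit(X[idx], y[idx], gamma, sigma)
        order = np.argsort(np.abs(alpha))            # smallest |alpha| first
        keep = order[int(np.ceil(drop * len(idx))):]
        idx = idx[np.sort(keep)]
    b, alpha = lssvm_fit(X[idx], y[idx], gamma, sigma)
    return idx, b, alpha

# Toy usage: approximate sinc(x) on [-3, 3] with a pruned LS-SVM.
X = np.linspace(-3, 3, 60)[:, None]
y = np.sinc(X).ravel()
idx, b, alpha = prune_lssvm(X, y, drop=0.25, steps=2)
yhat = lssvm_predict(X[idx], alpha, b, X)
```

Pruning the smallest |α_i| is cheap because each retraining step is again just a linear solve; the trade-off is that accuracy degrades gradually as more support values are removed.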
| Year | Citations |
|---|---|
| 1999 | 26.9K |
| 1999 | 9.3K |
| 1988 | 5.9K |
| 1990 | 3.3K |
| 1989 | 2.6K |
| 1993 | 1.9K |
| 1998 | 802 |
| 1998 | 622 |
| 1998 | 505 |
| 1999 | 221 |