Publication | Closed Access
Multi-class kernel logistic regression: a fixed-size implementation
2007 · 53 citations · 15 references
Keywords: Mathematical Programming, Alternating Descent Version, Engineering, Machine Learning, Support Vector Machine, Data Science, Data Mining, Pattern Recognition, Fixed-size LS-SVM, Statistics, Supervised Learning, Predictive Analytics, Computer Science, Statistical Learning Theory, Fixed-size Implementation, Practical Iterative Algorithm, Reproducing Kernel Method, Statistical Inference, Kernel Method
This research studies a practical iterative algorithm for multi-class kernel logistic regression (KLR). Starting from the negative penalized log-likelihood criterion, we show that the optimization problem in each iteration can be solved by a weighted version of least squares support vector machines (LS-SVMs). The derivation shows that the global regularization term appears as the usual regularization term in each separate step. Within the LS-SVM framework, fixed-size LS-SVM is known to perform well on large data sets. We therefore use this model to solve large-scale multi-class KLR problems, with estimation in the primal space. To reduce the size of the Hessian, an alternating descent version of Newton's method is used, which has the additional advantage that it can easily be deployed in a distributed computing environment. We also investigate how a multi-class kernel logistic regression model compares with a one-versus-all coding scheme.
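The approach described in the abstract can be illustrated with a minimal NumPy sketch: an approximate feature map is built from a small subset of prototype points (the fixed-size idea, via a Nyström-style decomposition), and a penalized multinomial logistic model is then fit in the primal space with a block-coordinate Newton update that solves for one class's weights at a time, so each Hessian is only m×m. This is not the authors' implementation; the function names (`nystrom_features`, `fit_klr`), the prototype-selection step, and the hyperparameters `sigma` and `lam` are illustrative assumptions.

```python
import numpy as np

def nystrom_features(X, prototypes, sigma):
    """Approximate RBF feature map from a fixed-size prototype subset."""
    d = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d / (2 * sigma ** 2))                      # n x m cross-kernel
    dm = ((prototypes[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    Km = np.exp(-dm / (2 * sigma ** 2))                    # m x m prototype kernel
    vals, vecs = np.linalg.eigh(Km)
    keep = vals > 1e-8                                     # drop near-null directions
    return K @ vecs[:, keep] / np.sqrt(vals[keep])

def fit_klr(Phi, y, n_classes, lam=1e-2, n_iter=25):
    """Penalized multi-class KLR in the primal, solved per class.

    Each inner step is a Newton update on one class's weight vector with the
    other classes held fixed (an alternating-descent flavour of Newton's
    method, keeping each Hessian small)."""
    n, m = Phi.shape
    W = np.zeros((m, n_classes))
    Y = np.eye(n_classes)[y]                               # one-hot targets
    for _ in range(n_iter):
        for c in range(n_classes):
            Z = Phi @ W
            P = np.exp(Z - Z.max(axis=1, keepdims=True))   # stable softmax
            P /= P.sum(axis=1, keepdims=True)
            g = Phi.T @ (P[:, c] - Y[:, c]) + lam * W[:, c]
            w = P[:, c] * (1.0 - P[:, c])                  # IRLS weights
            H = Phi.T @ (Phi * w[:, None]) + lam * np.eye(m)
            W[:, c] -= np.linalg.solve(H, g)
    return W

# Toy demo (assumed setup): three Gaussian blobs, 20 prototypes.
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
X = np.vstack([c + 0.5 * rng.standard_normal((60, 2)) for c in centers])
y = np.repeat(np.arange(3), 60)
prototypes = X[rng.choice(len(X), 20, replace=False)]
Phi = nystrom_features(X, prototypes, sigma=1.0)
W = fit_klr(Phi, y, n_classes=3)
acc = (np.argmax(Phi @ W, axis=1) == y).mean()
```

Note that each class update only requires `Phi`, the current class probabilities, and that class's weight vector, which is what makes the scheme amenable to distributed computation as the abstract suggests.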