Publication | Closed Access
Distributed Generalized Cross-Validation for Divide-and-Conquer Kernel Ridge Regression and Its Asymptotic Optimality
Citations: 21
References: 18
Year: 2019
Keywords: Support Vector Machine, Asymptotic Optimality, Engineering, Machine Learning, Data Science, Distributed Generalized Cross-validation, High-dimensional Method, Hyperparameter Estimation, Predictive Analytics, Reproducing Kernel Method, Feature Selection, Statistical Inference, Parameter Selection, Statistical Learning Theory, Kernel Ridge Regression, Statistics, Generalized Cross-validation, Kernel Method
Tuning parameter selection is of critical importance for kernel ridge regression. To date, a data-driven tuning method for divide-and-conquer kernel ridge regression (d-KRR) has been lacking in the literature, which limits the applicability of d-KRR to large datasets. In this article, by modifying the generalized cross-validation (GCV) score, we propose a distributed generalized cross-validation (dGCV) score as a data-driven tool for selecting the tuning parameters in d-KRR. Not only is the proposed dGCV computationally scalable for massive datasets, it is also shown, under mild conditions, to be asymptotically optimal in the sense that minimizing the dGCV score is equivalent to minimizing the true global conditional empirical loss of the averaged function estimator, extending the existing optimality results of GCV to the divide-and-conquer framework. Supplemental materials for this article are available online.
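The abstract describes the recipe at a high level: fit a KRR smoother on each data partition, then select the shared regularization parameter by minimizing a single GCV-type criterion built from all partitions. Below is a minimal sketch of that idea, assuming (since the abstract does not state the exact formula) a dGCV that pools per-partition residual sums of squares and smoother traces into one GCV-style ratio; the Gaussian kernel, the names `gaussian_kernel` and `dgcv_score`, and the grid search are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * bandwidth ** 2))

def dgcv_score(partitions, lam, bandwidth=1.0):
    """Hypothetical dGCV: a GCV-style score aggregated over partitions.

    On each partition (X_j, y_j) of size n_j, form the KRR smoother
    A_j = K_j (K_j + n_j * lam * I)^{-1}, then combine the per-partition
    residual sums of squares and traces into one global ratio.
    """
    rss, trace, n_total = 0.0, 0.0, 0
    for X_j, y_j in partitions:
        n_j = len(y_j)
        K_j = gaussian_kernel(X_j, X_j, bandwidth)
        A_j = K_j @ np.linalg.solve(K_j + n_j * lam * np.eye(n_j), np.eye(n_j))
        resid = y_j - A_j @ y_j
        rss += resid @ resid
        trace += np.trace(A_j)
        n_total += n_j
    return (rss / n_total) / (1.0 - trace / n_total) ** 2

# Toy usage: select lambda on 4 disjoint partitions of a synthetic dataset.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(400, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * rng.standard_normal(400)
parts = [(X[i::4], y[i::4]) for i in range(4)]
lams = np.logspace(-6, 0, 13)
best = min(lams, key=lambda l: dgcv_score(parts, l))
print("selected lambda:", best)
```

In the divide-and-conquer setting the final predictor is the average of the local KRR estimates; the paper's optimality claim is that, under mild conditions, the tuning parameter minimizing the dGCV score asymptotically also minimizes the true global conditional empirical loss of that averaged estimator.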