Publication | Open Access
Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization
29 Citations | 17 References | Year: 2017 | Venue: Unknown
Topics: Mathematical Programming, ADMM, Model Optimization, com-SVR-ADMM, Engineering, Machine Learning, Data Science, Stochastic Optimization, Convex Optimization, Large-Scale Optimization, Stochastic Composition Optimization, Inverse Problems, Computer Science, Convergence Rate, Approximation Theory, Adaptive Optimization
We consider the stochastic composition optimization problem proposed in \cite{wang2017stochastic}, which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm, named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives, and has a convergence rate of $O(\log S / S)$, which improves upon the $O(S^{-4/9})$ rate in \cite{wang2016accelerating} when the objective is convex and Lipschitz smooth. Moreover, com-SVR-ADMM attains a rate of $O(1/\sqrt{S})$ when the objective is convex but not Lipschitz smooth. Experiments demonstrate that com-SVR-ADMM outperforms existing algorithms.
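To make the problem class concrete, the following is a minimal sketch (not the paper's com-SVR-ADMM itself): an SVRG-style variance-reduced gradient estimator for the stochastic composition problem $\min_x f(\mathbb{E}_w[g_w(x)])$, with the ADMM splitting omitted. The toy instance and all names below are our own assumptions: inner maps $g_w(x) = A_w x$ with random linear $A_w$, and outer function $f(y) = \tfrac{1}{2}\|y - b\|^2$.

```python
import numpy as np

# Toy stochastic composition problem: min_x f(E_w[g_w(x)]) with
# g_w(x) = A_w x (random linear inner maps) and f(y) = 0.5 ||y - b||^2.
# This is an illustrative assumption, not the paper's experimental setup.
rng = np.random.default_rng(0)
d, n = 5, 50
A = np.stack([np.eye(d) + 0.1 * rng.normal(size=(d, d)) for _ in range(n)])
b = rng.normal(size=d)
A_bar = A.mean(axis=0)                      # Jacobian (and value map) of E_w[g_w]

def f_grad(y):
    """Outer gradient f'(y) for f(y) = 0.5 ||y - b||^2."""
    return y - b

x = np.zeros(d)
lr, epochs, inner = 0.2, 30, n
for _ in range(epochs):
    x_snap = x.copy()
    G_snap = A_bar @ x_snap                 # exact inner value at the snapshot
    full_grad = A_bar.T @ f_grad(G_snap)    # exact composition gradient at the snapshot
    for _ in range(inner):
        i, j = rng.integers(n), rng.integers(n)
        # Control-variate estimate of the inner value E_w[g_w(x)]:
        g_hat = A[i] @ (x - x_snap) + G_snap
        # Variance-reduced composition gradient (Jacobians A_j are constant here,
        # so the Jacobian correction terms cancel):
        v = A[j].T @ (f_grad(g_hat) - f_grad(G_snap)) + full_grad
        x -= lr * v

# The objective 0.5 ||A_bar x - b||^2 is driven toward its minimum.
obj = 0.5 * np.linalg.norm(A_bar @ x - b) ** 2
```

Because both the inner-value estimate and the gradient estimate are recentered at a periodically refreshed snapshot, the estimator's variance shrinks as the iterate approaches the optimum, which is what enables the fast rates the abstract describes; the full method additionally handles linear constraints via ADMM.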