An Accelerated Stochastic Trust Region Method for Stochastic Optimization
DOI:
https://doi.org/10.4208/jcm.2504-m2023-0228
Keywords:
Stochastic optimization, Stochastic variance reduced gradient, Trust region, Gradient descent method, Machine learning.
Abstract
In this paper, we propose an accelerated stochastic variance-reduced gradient method with a trust-region-like framework, referred to as the NMSVRG-TR method. Building on NMSVRG, we incorporate a Katyusha-like acceleration step into the stochastic trust region scheme, which improves the convergence rate of SVRG-type methods. Under appropriate assumptions, we establish linear convergence of the algorithm for strongly convex objective functions. Numerical results show that our algorithm is generally superior to several existing stochastic gradient methods.
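The following is a minimal sketch of the kind of update the abstract describes: an SVRG inner loop coupled with a Katyusha-like momentum step, applied to a toy least-squares problem. It is not the paper's NMSVRG-TR method; the trust-region safeguard and the NMSVRG step-size rule are omitted, and the step size eta and momentum weights tau1, tau2 are illustrative assumptions.

```python
# A minimal sketch of SVRG with a Katyusha-like momentum step on a toy
# least-squares problem. NOT the paper's NMSVRG-TR algorithm: the
# trust-region safeguard is omitted, and all parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(w, i):
    # Stochastic gradient of the i-th least-squares component.
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    # Full gradient of (1/2n) * ||A w - b||^2.
    return A.T @ (A @ w - b) / n

eta, tau1, tau2 = 1e-2, 0.4, 0.3   # assumed step size and momentum weights
w_tilde = np.zeros(d)              # snapshot point
y = w_tilde.copy()
z = w_tilde.copy()

for epoch in range(30):
    mu = full_grad(w_tilde)        # full gradient at the snapshot
    for _ in range(n):
        # Katyusha-like coupling of the momentum iterate z,
        # the snapshot w_tilde, and the gradient iterate y.
        x = tau1 * z + tau2 * w_tilde + (1 - tau1 - tau2) * y
        i = rng.integers(n)
        # Variance-reduced stochastic gradient at x.
        g = grad_i(x, i) - grad_i(w_tilde, i) + mu
        y = x - eta * g            # gradient-descent-style step
        z = z - (eta / tau1) * g   # mirror-descent-style step
    w_tilde = y                    # update the snapshot

print("final objective:", 0.5 * np.mean((A @ w_tilde - b) ** 2))
```

In this sketch the variance-reduced gradient is the standard SVRG estimator, and the momentum step follows the usual Katyusha coupling; the paper's trust-region mechanism would additionally control the step length taken from x.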
Published
2025-09-28
How to Cite
An Accelerated Stochastic Trust Region Method for Stochastic Optimization. (2025). Journal of Computational Mathematics, 43(5), 1169-1193. https://doi.org/10.4208/jcm.2504-m2023-0228