Volume 42, Issue 2
Modified Stochastic Extragradient Methods for Stochastic Variational Inequality

Ling Zhang & Lingling Xu

J. Comp. Math., 42 (2024), pp. 390-414.

Published online: 2024-01

  • Abstract

In this paper, we consider two kinds of extragradient methods for solving the pseudo-monotone stochastic variational inequality problem. First, we present the modified stochastic extragradient method with constant step-size (MSEGMC) and prove its convergence. Under strong pseudo-monotonicity of the operator and exponentially growing sample sequences, we establish an $R$-linear convergence rate in terms of the mean natural residual and an oracle complexity of $O(1/\epsilon)$. Second, we propose a modified stochastic extragradient method with adaptive step-size (MSEGMA), whose step-size depends neither on the Lipschitz constant nor on any line-search procedure. Finally, we present numerical experiments to verify the effectiveness of the two algorithms.
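
For orientation, the stochastic variational inequality problem is to find $x^*\in C$ such that $F(x^*)^{T}(x-x^*)\ge 0$ for all $x\in C$, where $C$ is a closed convex set and $F(x)=\mathbb{E}[F(x,\xi)]$. As a rough sketch only (the notation $\hat{F}_k$, $\alpha_k$, $\Pi_C$ is introduced here for illustration; the precise MSEGMC and MSEGMA updates are those given in the paper), the classical stochastic extragradient template on which such modified methods build replaces $F$ by a mini-batch average $\hat{F}_k$ and iterates
$$y_k=\Pi_C\big(x_k-\alpha_k\hat{F}_k(x_k)\big),\qquad x_{k+1}=\Pi_C\big(x_k-\alpha_k\hat{F}_k(y_k)\big),$$
where $\Pi_C$ denotes the Euclidean projection onto $C$. The natural residual $\|x-\Pi_C(x-F(x))\|$ appearing in the stated $R$-linear rate vanishes exactly at solutions of the problem.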

  • AMS Subject Headings

47J20, 90C33, 90C25

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
  • RIS
  • TXT
@Article{JCM-42-390,
  author  = {Zhang, Ling and Xu, Lingling},
  title   = {Modified Stochastic Extragradient Methods for Stochastic Variational Inequality},
  journal = {Journal of Computational Mathematics},
  year    = {2024},
  volume  = {42},
  number  = {2},
  pages   = {390--414},
  issn    = {1991-7139},
  doi     = {10.4208/jcm.2206-m2021-0195},
  url     = {http://global-sci.org/intro/article_detail/jcm/22886.html}
}
TY  - JOUR
T1  - Modified Stochastic Extragradient Methods for Stochastic Variational Inequality
AU  - Zhang, Ling
AU  - Xu, Lingling
JO  - Journal of Computational Mathematics
VL  - 42
IS  - 2
SP  - 390
EP  - 414
PY  - 2024
DA  - 2024/01
SN  - 1991-7139
DO  - 10.4208/jcm.2206-m2021-0195
UR  - https://global-sci.org/intro/article_detail/jcm/22886.html
KW  - Stochastic variational inequality
KW  - Pseudo-monotone
KW  - Modified stochastic extragradient methods
KW  - Adaptive step-size
ER  -

Ling Zhang & Lingling Xu. (2024). Modified Stochastic Extragradient Methods for Stochastic Variational Inequality. Journal of Computational Mathematics. 42 (2). 390-414. doi:10.4208/jcm.2206-m2021-0195