Volume 36, Issue 6
A General Two-Level Subspace Method for Nonlinear Optimization

Cheng Chen, Zaiwen Wen & Yaxiang Yuan

J. Comp. Math., 36 (2018), pp. 881-902.

Published online: 2018-08

  • Abstract

A new two-level subspace method is proposed for solving general unconstrained minimization formulations discretized from infinite-dimensional optimization problems. At each iteration, the algorithm executes either a direct step on the current level or a coarse subspace correction step. In the coarse subspace correction step, we augment the traditional coarse grid space by a two-dimensional subspace spanned by the coordinate direction and the gradient direction at the current point. Global convergence is proved and the convergence rate is studied under mild conditions on the discretized functions. Preliminary numerical experiments on a few variational problems show that our two-level subspace method is promising.
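The iteration described above can be sketched in a few lines of Python. This is only an illustrative toy, not the authors' algorithm: the averaging restriction operator, the inner subspace solver (plain gradient descent on the low-dimensional coefficients), and the alternation between direct and coarse steps are all assumptions made for the sketch.

```python
import numpy as np

def averaging_restriction(n_fine, n_coarse):
    # Simple averaging restriction (an assumption; the paper's
    # grid-transfer operators may differ).
    R = np.zeros((n_coarse, n_fine))
    r = n_fine // n_coarse
    for i in range(n_coarse):
        R[i, i * r:(i + 1) * r] = 1.0 / r
    return R

def coarse_correction(grad, x, R, inner_steps=20, inner_lr=0.5):
    # Coarse subspace correction: the prolonged coarse-grid basis
    # (prolongation taken as R^T, a common choice) augmented by the
    # two-dimensional subspace spanned by the current point x and the
    # gradient grad(x), as described in the abstract.
    B = np.column_stack([R.T, x, grad(x)])
    Q, _ = np.linalg.qr(B)          # orthonormal basis of the subspace
    a = np.zeros(Q.shape[1])
    for _ in range(inner_steps):    # inner solve on f(x + Q a), by the
        a -= inner_lr * (Q.T @ grad(x + Q @ a))  # chain rule in a
    return x + Q @ a

def two_level_minimize(grad, x0, R, iters=10, lr=0.2):
    # Alternate a direct (gradient) step on the current level with a
    # coarse subspace correction step.
    x = x0.copy()
    for k in range(iters):
        if k % 2 == 0:
            x = x - lr * grad(x)    # direct step on the current level
        else:
            x = coarse_correction(grad, x, R)
    return x
```

On a simple quadratic the gradient at the current point lies inside the augmented subspace, so the coarse correction alone already captures the exact minimizer; for genuinely nonlinear problems the direct and coarse steps would both contribute.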


  • AMS Subject Headings

65N06, 65B99

  • Copyright

COPYRIGHT: © Global Science Press

  • Email address

cchen@lsec.cc.ac.cn (Cheng Chen)

wenzw@pku.edu.cn (Zaiwen Wen)

yyx@lsec.cc.ac.cn (Yaxiang Yuan)

  • BibTex
  • RIS
  • TXT
@Article{JCM-36-881,
  author  = {Chen, Cheng and Wen, Zaiwen and Yuan, Yaxiang},
  title   = {A General Two-Level Subspace Method for Nonlinear Optimization},
  journal = {Journal of Computational Mathematics},
  year    = {2018},
  volume  = {36},
  number  = {6},
  pages   = {881--902},
  issn    = {1991-7139},
  doi     = {10.4208/jcm.1706-m2016-0721},
  url     = {http://global-sci.org/intro/article_detail/jcm/12607.html}
}
TY  - JOUR
T1  - A General Two-Level Subspace Method for Nonlinear Optimization
AU  - Chen, Cheng
AU  - Wen, Zaiwen
AU  - Yuan, Yaxiang
JO  - Journal of Computational Mathematics
VL  - 36
IS  - 6
SP  - 881
EP  - 902
PY  - 2018
DA  - 2018/08
SN  - 1991-7139
DO  - 10.4208/jcm.1706-m2016-0721
UR  - https://global-sci.org/intro/article_detail/jcm/12607.html
KW  - Nonlinear optimization
KW  - Convex and nonconvex problems
KW  - Subspace technique
KW  - Multigrid/multilevel method
KW  - Large-scale problems
ER  -


Cheng Chen, Zaiwen Wen & Yaxiang Yuan. (2018). A General Two-Level Subspace Method for Nonlinear Optimization. Journal of Computational Mathematics. 36 (6). 881-902. doi:10.4208/jcm.1706-m2016-0721