Weak convergence of iterative methods for solving quasimonotone variational inequalities

Published in: Computational Optimization and Applications

Abstract

In this work, we introduce self-adaptive methods for solving variational inequalities with a Lipschitz continuous and quasimonotone mapping (or a Lipschitz continuous mapping without monotonicity) in real Hilbert spaces. Under suitable assumptions, the convergence of the algorithms is established without knowledge of the Lipschitz constant of the mapping. The results obtained in this paper extend some recent results in the literature. Some preliminary numerical experiments and comparisons are reported.
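
To make the setting concrete, the sketch below runs a projection-based extragradient iteration of the general kind the abstract describes: the step size is updated from the observed ratio ||x_n - y_n|| / ||F(x_n) - F(y_n)||, so the Lipschitz constant of F never has to be known in advance. This is only an illustrative sketch, not the authors' algorithm; the mapping F, the box feasible set, and the parameters lam0, mu, and tol are assumptions made for this example.

# Illustrative sketch of a self-adaptive extragradient-type iteration for a
# variational inequality VI(F, C): find x* in C with <F(x*), x - x*> >= 0 for all x in C.
# This is NOT the paper's exact algorithm; the mapping F, the feasible set C
# (a box here), and the parameters mu and lam0 are illustrative choices.
import numpy as np

def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

def adaptive_extragradient(F, x0, lo, hi, lam0=1.0, mu=0.5, tol=1e-8, max_iter=10000):
    """Extragradient iteration with a self-adaptive step size:
    lambda_n is shrunk from the observed local Lipschitz behaviour of F,
    so the Lipschitz constant is never needed explicitly."""
    x, lam = x0.astype(float), lam0
    for _ in range(max_iter):
        Fx = F(x)
        y = project_box(x - lam * Fx, lo, hi)          # predictor step
        if np.linalg.norm(x - y) <= tol:               # x is (approximately) a solution
            return x
        Fy = F(y)
        x_new = project_box(x - lam * Fy, lo, hi)      # corrector step
        # self-adaptive update: keep lambda_n non-increasing and bounded by
        # mu * ||x_n - y_n|| / ||F(x_n) - F(y_n)||
        denom = np.linalg.norm(Fx - Fy)
        if denom > 0:
            lam = min(lam, mu * np.linalg.norm(x - y) / denom)
        x = x_new
    return x

# Example: an affine (monotone) mapping on the box [-2, 2]^2, used only to
# exercise the iteration; the quasimonotone test problems of the paper are not reproduced here.
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
b = np.array([1.0, -1.0])
sol = adaptive_extragradient(lambda x: A @ x + b, np.array([2.0, 2.0]), -2.0, 2.0)
print(sol)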

Author information

Corresponding author

Correspondence to Jun Yang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Liu, H., Yang, J. Weak convergence of iterative methods for solving quasimonotone variational inequalities. Comput Optim Appl 77, 491–508 (2020). https://doi.org/10.1007/s10589-020-00217-8
