
Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators


Abstract

This paper is devoted to adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Based on the recently proposed proximal version of the extragradient method for this class of problems, we study in detail the method with adaptively selected parameter values and estimate its rate of convergence. The result is generalized to the class of variational inequalities with relatively strongly monotone δ-generalized smooth operators. For the ridge regression problem and the variational inequality associated with box-simplex games, numerical experiments are carried out to demonstrate the effectiveness of the proposed technique for adaptive parameter selection during the execution of the algorithm.
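Since the full text is not included in this preview, the sketch below is only an illustration of the general idea, not the authors' algorithm: a proximal extragradient (mirror-prox) iteration for a monotone operator F with a backtracking-style adaptive estimate of the smoothness constant. It assumes the Euclidean prox on an unconstrained problem, whereas the paper works with Bregman divergences (relative smoothness and relative strong monotonicity); the function names, the halving/doubling rule, and the acceptance test are assumptions made for illustration.

```python
import numpy as np

def adaptive_extragradient(F, x0, L0=1.0, n_iters=200, tol=1e-8):
    """Illustrative adaptive proximal extragradient (mirror-prox) loop.

    F  : monotone operator, maps a numpy array to a numpy array
    x0 : starting point
    L0 : initial guess of the smoothness constant

    Euclidean, unconstrained simplification of the Bregman setting
    treated in the paper.
    """
    x = np.asarray(x0, dtype=float)
    L = L0
    for _ in range(n_iters):
        L = max(L / 2.0, 1e-12)              # optimistically shrink the estimate
        while True:
            y = x - F(x) / L                 # extrapolation step with operator at x
            x_next = x - F(y) / L            # correction step with operator at y
            # Accept if a descent-lemma-type inequality holds; otherwise
            # double L and retry (standard backtracking test for mirror-prox).
            lhs = float(np.dot(F(y) - F(x), y - x_next))
            rhs = 0.5 * L * (np.linalg.norm(y - x) ** 2
                             + np.linalg.norm(y - x_next) ** 2)
            if lhs <= rhs:
                break
            L *= 2.0
        if np.linalg.norm(x_next - x) <= tol:
            return x_next
        x = x_next
    return x

# Example: strongly monotone affine operator F(x) = A @ x - b with A positive
# definite; the solution of the corresponding variational inequality is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = adaptive_extragradient(lambda x: A @ x - b, np.zeros(2))
```

The backtracking test accepts a step as soon as the current estimate L dominates the true Lipschitz constant, so the inner loop always terminates; starting each iteration by halving L lets the method exploit local smoothness, which is the intuition behind adaptive parameter selection here.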




Funding

The work carried out in Section 2 was partially supported by the Priority 2030 strategic academic leadership program, agreement no. 075-02-2021-1316 (dated September 9, 2021). The work carried out in Section 4 was supported by the grant from the President of the Russian Federation for the state support of leading scientific schools, grant no. NSh775.2022.1.1.

Author information


Corresponding authors

Correspondence to S. S. Ablaev, F. S. Stonyakin, M. S. Alkousa or D. A. Pasechnyk.

Ethics declarations

The authors declare that they have no conflicts of interest.

Additional information

Translated by Yu. Kornienko


About this article


Cite this article

Ablaev, S.S., Stonyakin, F.S., Alkousa, M.S. et al. Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators. Program Comput Soft 49, 485–492 (2023). https://doi.org/10.1134/S0361768823060026

