Abstract
This paper is devoted to adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Building on the recently proposed proximal version of the extragradient method for this class of problems, we study in detail a variant with adaptively selected parameter values and estimate its rate of convergence. The result is then generalized to the class of variational inequalities with relatively strongly monotone δ-generalized smooth operators. Numerical experiments on the ridge regression problem and on the variational inequality associated with box-simplex games demonstrate the effectiveness of the proposed technique for adaptive parameter selection during the execution of the algorithm.
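To give a flavor of the kind of method discussed, the following is a minimal sketch of an extragradient scheme with adaptive (backtracking) selection of the inverse step-size parameter, written in the simplest Euclidean setting for the ridge regression operator mentioned in the abstract. The function names, the specific backtracking rule, and all parameter values are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def F(z, A, b, lam):
    """Operator of the VI: the gradient of the ridge objective
    0.5*||A z - b||^2 + 0.5*lam*||z||^2 (strongly monotone for lam > 0)."""
    return A.T @ (A @ z - b) + lam * z

def adaptive_extragradient(z0, A, b, lam, L0=1.0, iters=500):
    """Extragradient method (mirror prox with the Euclidean prox) using a
    simple doubling/halving rule for the local Lipschitz estimate L."""
    z, L = z0.copy(), L0
    for _ in range(iters):
        L = max(L0, L / 2.0)  # optimistic decrease before backtracking
        while True:
            w = z - F(z, A, b, lam) / L      # extrapolation step
            z_new = z - F(w, A, b, lam) / L  # main step
            # Accept L once it covers the observed operator variation
            # (a Lipschitz-type test; the small slack guards the w == z case).
            lhs = np.linalg.norm(F(w, A, b, lam) - F(z, A, b, lam))
            if lhs <= L * np.linalg.norm(w - z) + 1e-12:
                break
            L *= 2.0
        z = z_new
    return z
```

For this linear operator the backtracking loop terminates as soon as L exceeds the operator norm of AᵀA + λI, and the adaptive rule lets L shrink again on iterations where the local curvature is smaller, which is the practical benefit of adaptive parameter selection that the paper's experiments illustrate.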
Funding
The work carried out in Section 2 was partially supported by the Priority 2030 strategic academic leadership program, agreement no. 075-02-2021-1316 (dated September 9, 2021). The work carried out in Section 4 was supported by the grant from the President of the Russian Federation for the state support of leading scientific schools, grant no. NSh775.2022.1.1.
Author information
Authors and Affiliations
Corresponding authors
Ethics declarations
The authors declare that they have no conflicts of interest.
Additional information
Translated by Yu. Kornienko
Ablaev, S.S., Stonyakin, F.S., Alkousa, M.S. et al. Adaptive Methods for Variational Inequalities with Relatively Smooth and Relatively Strongly Monotone Operators. Program Comput Soft 49, 485–492 (2023). https://doi.org/10.1134/S0361768823060026