A family of the modified three-term Hestenes-Stiefel conjugate gradient method with sufficient descent and conjugacy conditions. (English) Zbl 1522.90266

Summary: We suggest a modified version of the three-term Hestenes-Stiefel conjugate gradient method proposed by Zhang et al. that strengthens it. Following the Dai-Liao approach, the third term of the Zhang et al. direction is multiplied by a positive parameter that can be determined adaptively. To make an appropriate choice of this parameter, we carry out a matrix analysis by which the sufficient descent property of the method is guaranteed. Convergence analyses are then given for both convex and nonconvex cost functions. Finally, numerical tests demonstrate the efficiency of the proposed method.
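To make the construction concrete, the following is a minimal sketch of a three-term Hestenes-Stiefel iteration in the spirit of Zhang et al. [33], with the third term scaled by a positive parameter t as the summary describes. This is not the authors' exact algorithm: the reviewed paper's adaptive rule for t is not reproduced here, and the test function, line search, and restart safeguard are illustrative choices; we simply fix t = 1.0, which recovers the Zhang et al. direction and yields g·d = -||g||² exactly.

```python
# Hedged sketch of a three-term Hestenes-Stiefel CG method:
#   d_{k+1} = -g_{k+1} + beta_k * d_k - t * theta_k * y_k,
# where beta_k = g_{k+1}.y_k / d_k.y_k (HS), theta_k = g_{k+1}.d_k / d_k.y_k,
# and t > 0 scales the third term (t = 1 gives the Zhang et al. direction).

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, u, v):
    # componentwise a*u + v
    return [a * ui + vi for ui, vi in zip(u, v)]

def f(x):
    # illustrative quadratic test function f(x) = 0.5*x1^2 + 5*x2^2
    return 0.5 * x[0] ** 2 + 5.0 * x[1] ** 2

def grad_f(x):
    return [x[0], 10.0 * x[1]]

def three_term_hs_cg(x, t=1.0, tol=1e-8, max_iter=200):
    g = grad_f(x)
    d = [-gi for gi in g]                          # initial steepest-descent step
    for _ in range(max_iter):
        if dot(g, g) <= tol ** 2:
            break
        # backtracking Armijo line search; terminates since g.d < 0 when t = 1
        alpha, fx, gd = 1.0, f(x), dot(g, d)
        while f(axpy(alpha, d, x)) > fx + 1e-4 * alpha * gd:
            alpha *= 0.5
        x_new = axpy(alpha, d, x)
        g_new = grad_f(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]  # gradient difference y_k
        dy = dot(d, y)
        if abs(dy) < 1e-16:
            d = [-gi for gi in g_new]              # restart on degenerate denominator
        else:
            beta = dot(g_new, y) / dy              # Hestenes-Stiefel parameter
            theta = dot(g_new, d) / dy
            d = [-gn + beta * di - t * theta * yi
                 for gn, di, yi in zip(g_new, d, y)]
        x, g = x_new, g_new
    return x

print(three_term_hs_cg([3.0, 1.0]))
```

With t = 1 the extra term cancels the beta-contribution in g_{k+1}·d_{k+1}, so the sufficient descent condition holds independently of the line search; the paper's contribution is choosing t adaptively while preserving this property.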

MSC:

90C53 Methods of quasi-Newton type
90C06 Large-scale problems in mathematical programming
90C26 Nonconvex programming, global optimization

Software:

SCALCG; CUTEr; CG_DESCENT
Full Text: DOI

References:

[1] Amini, K.; Faramarzi, P.; Pirfalah, N., A modified Hestenes-Stiefel conjugate gradient method with an optimal property, Optim. Methods Softw., 34, 4, 770-782 (2019) · Zbl 1461.65114 · doi:10.1080/10556788.2018.1457150
[2] Aminifard, Z.; Babaie-Kafaki, S., Dai-Liao extensions of a descent hybrid nonlinear conjugate gradient method with application in signal processing, Numer. Algorithms, 89, 3, 1369-1387 (2021) · Zbl 1484.65132 · doi:10.1007/s11075-021-01157-y
[3] Aminifard, Z.; Babaie-Kafaki, S., Matrix analyses on the Dai-Liao conjugate gradient method, ANZIAM J., 61, 2, 195-203 (2019) · Zbl 1415.90146 · doi:10.1017/S1446181119000063
[4] Andrei, N., An adaptive conjugate gradient algorithm for large-scale unconstrained optimization, J. Comput. Appl. Math., 292, 83-91 (2016) · Zbl 1321.90124 · doi:10.1016/j.cam.2015.07.003
[5] Andrei, N., A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues, Numer. Algorithms, 77, 4, 1273-1282 (2018) · Zbl 06860411 · doi:10.1007/s11075-017-0362-5
[6] Babaie-Kafaki, S.; Ghanbari, R., A descent family of Dai-Liao conjugate gradient methods, Optim. Methods Softw., 29, 3, 583-591 (2014) · Zbl 1285.90063 · doi:10.1080/10556788.2013.833199
[7] Babaie-Kafaki, S.; Ghanbari, R., The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices, Eur. J. Oper. Res., 234, 3, 625-630 (2014) · Zbl 1304.90216 · doi:10.1016/j.ejor.2013.11.012
[8] Babaie-Kafaki, S.; Ghanbari, R., A class of adaptive Dai-Liao conjugate gradient methods based on the scaled memoryless BFGS update, 4OR, 15, 1, 85-92 (2017) · Zbl 1360.90293 · doi:10.1007/s10288-016-0323-1
[9] Beale, EML, A derivation of conjugate gradients. In: Lootsma, F.A. (ed.) Numerical Methods for Nonlinear Optimization. Academic Press, London (1972)
[10] Bojari, S.; Eslahchi, MR, Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization, Numer. Algorithms, 83, 3, 901-933 (2020) · Zbl 1436.90109 · doi:10.1007/s11075-019-00709-7
[11] Bojari, S., Eslahchi, M.R.: A five-parameter class of derivative-free spectral conjugate gradient methods for systems of large-scale nonlinear monotone equations. Int. J. Comput. Methods (2022) · Zbl 07714912
[12] Cao, J.; Wu, J., A conjugate gradient algorithm and its applications in image restoration, Appl. Numer. Math., 152, 243-252 (2020) · Zbl 07173173 · doi:10.1016/j.apnum.2019.12.002
[13] Dai, YH; Liao, LZ, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim., 43, 1, 87-101 (2001) · Zbl 0973.65050 · doi:10.1007/s002450010019
[14] Dolan, ED; Moré, JJ, Benchmarking optimization software with performance profiles, Math. Program., 91, 2, 201-213 (2002) · Zbl 1049.90004 · doi:10.1007/s101070100263
[15] Dong, XL; Liu, HW; He, YB; Yang, XM, A modified Hestenes-Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition, J. Comput. Appl. Math., 281, 239-249 (2015) · Zbl 1309.65074 · doi:10.1016/j.cam.2014.11.058
[16] Eslahchi, MR; Bojari, S., Global convergence of a new sufficient descent spectral three-term conjugate gradient class for large-scale optimization, Optim. Methods Softw., 37, 3, 830-843 (2022) · Zbl 1502.90170 · doi:10.1080/10556788.2020.1843167
[17] Exl, L.; Fischbacher, J.; Kovacs, A.; Oezelt, H.; Gusenbauer, M.; Schrefl, T., Preconditioned nonlinear conjugate gradient method for micromagnetic energy minimization, Comput. Phys. Comm., 235, 179-186 (2019) · Zbl 07682898 · doi:10.1016/j.cpc.2018.09.004
[18] Gilbert, JC; Nocedal, J., Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2, 1, 21-42 (1992) · Zbl 0767.90082 · doi:10.1137/0802003
[19] Gould, NIM; Orban, D.; Toint, PhL, CUTEr: a constrained and unconstrained testing environment, revisited, ACM Trans. Math. Softw., 29, 4, 373-394 (2003) · Zbl 1068.90526 · doi:10.1145/962437.962439
[20] Hager, WW; Zhang, H., Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, ACM Trans. Math. Softw., 32, 1, 113-137 (2006) · Zbl 1346.90816 · doi:10.1145/1132973.1132979
[21] Hager, WW; Zhang, H., A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim., 16, 1, 170-192 (2005) · Zbl 1093.90085 · doi:10.1137/030601880
[22] Hestenes, MR; Stiefel, E., Methods of conjugate gradients for solving linear systems, J. Res. Natl. Bur. Stand., 49, 6, 409-436 (1952) · Zbl 0048.09901 · doi:10.6028/jres.049.044
[23] Khoshsimaye-Bargard, M.; Ashrafi, A., A new descent spectral Polak-Ribière-Polyak method based on the memoryless BFGS update, Comput. Appl. Math., 40, 8, 1-17 (2021) · Zbl 1476.65112 · doi:10.1007/s40314-021-01662-9
[24] Li, L.; Xie, X.; Gao, T.; Wang, J., A modified conjugate gradient-based Elman neural network, Cogn. Syst. Res., 68, 62-72 (2021) · doi:10.1016/j.cogsys.2021.02.001
[25] Nocedal, J.; Wright, SJ, Numerical optimization (2006), New York: Springer, New York · Zbl 1104.65059
[26] Perry, A., A modified conjugate gradient algorithm, Oper. Res., 26, 6, 1073-1078 (1978) · Zbl 0419.90074 · doi:10.1287/opre.26.6.1073
[27] Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, vol. 1066, pp. 122-141. Springer, Berlin (1984) · Zbl 0531.65035
[28] Powell, MJD, Convergence properties of algorithms for nonlinear optimization, SIAM Rev., 28, 4, 487-500 (1986) · Zbl 0624.90091 · doi:10.1137/1028154
[29] Sugiki, K.; Narushima, Y.; Yabe, H., Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, J. Optim. Theory Appl., 153, 3, 733-757 (2012) · Zbl 1262.90170 · doi:10.1007/s10957-011-9960-x
[30] Sun, W.; Yuan, YX, Optimization theory and methods: nonlinear programming (2006), New York: Springer, New York · Zbl 1129.90002
[31] Xue, W.; Wan, P.; Li, Q.; Zhong, P.; Yu, G.; Tao, T., An online conjugate gradient algorithm for large-scale data analysis in machine learning, AIMS Math., 6, 2, 1515-1537 (2021) · Zbl 1485.65062 · doi:10.3934/math.2021092
[32] Yao, S.; Feng, Q.; Li, L.; Xu, J., A class of globally convergent three-term Dai-Liao conjugate gradient methods, Appl. Numer. Math., 151, 354-366 (2020) · Zbl 1436.90139 · doi:10.1016/j.apnum.2019.12.026
[33] Zhang, L.; Zhou, W.; Li, D., Some descent three-term conjugate gradient methods and their global convergence, Optim. Methods Softw., 22, 4, 697-711 (2007) · Zbl 1220.90094 · doi:10.1080/10556780701223293
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.