
New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method. (English) Zbl 1445.90100

Summary: Three new procedures for computing the scaling parameter in the self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) search direction with a parameter are presented. The first two are based on clustering the eigenvalues of the self-scaling memoryless BFGS iteration matrix with a parameter, using either the determinant or the trace of this matrix. The third is based on minimizing the measure function of R. H. Byrd and J. Nocedal [SIAM J. Numer. Anal. 26, No. 3, 727–739 (1989; Zbl 0676.65061)]. For all three algorithms the sufficient descent condition is established; the stepsize is computed by the standard Wolfe line search, under which the global convergence of these algorithms is proved. Using 80 unconstrained optimization test problems with different structures and complexities, it is shown that the self-scaling memoryless algorithms based on the determinant, on the trace of the iteration matrix, or on minimizing the measure function perform better than CG_DESCENT (version 1.4) with Wolfe line search [W. W. Hager and H. Zhang, SIAM J. Optim. 16, No. 1, 170–192 (2005; Zbl 1093.90085)], the self-scaling memoryless BFGS algorithms with the scaling parameters proposed by S. S. Oren and E. Spedicato [Math. Program. 10, 70–90 (1976; Zbl 0342.90045)] and by S. S. Oren and D. G. Luenberger [Manage. Sci., Theory 20, 845–862 (1974; Zbl 0316.90064)], L-BFGS by D. C. Liu and J. Nocedal [Math. Program. 45, No. 3 (B), 503–528 (1989; Zbl 0696.90048)], and the standard BFGS. The self-scaling memoryless algorithm based on minimizing the measure function of Byrd and Nocedal [loc. cit.] slightly outperforms the variants based on the determinant or on the trace of the iteration matrix.
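
To make the construction concrete, the following is a minimal Python sketch (not the authors' code) of the memoryless self-scaling BFGS direction d = -H g built from the step s, the gradient difference y, and a scaling parameter tau, together with the measure function psi(A) = tr(A) - ln det(A) of Byrd and Nocedal that the third procedure minimizes. The function names and the fallback scaling tau = (y's)/(y'y), a common self-scaling choice, are illustrative assumptions; the paper's determinant-, trace-, and measure-based rules for tau are not reproduced here.

import numpy as np

def ssml_bfgs_direction(g, s, y, tau=None):
    """Direction d = -H g, where H is the memoryless BFGS update of the
    scaled identity tau*I from the pair (s, y). Hypothetical helper."""
    ys = float(y @ s)                  # curvature y's (> 0 under Wolfe line search)
    if tau is None:
        tau = ys / float(y @ y)        # illustrative fallback scaling (assumption)
    yg, sg = float(y @ g), float(s @ g)
    # H = tau*I - (tau/ys)*(s y' + y s') + (1/ys + tau*(y'y)/ys**2) * s s'
    return (-tau * g
            + (tau / ys) * (yg * s + sg * y)
            - (1.0 / ys + tau * float(y @ y) / ys ** 2) * sg * s)

def byrd_nocedal_measure(A):
    """psi(A) = trace(A) - ln det(A); the third procedure chooses tau to
    minimize this measure of the SSML-BFGS iteration matrix."""
    _, logdet = np.linalg.slogdet(A)   # numerically robust log-determinant
    return float(np.trace(A)) - logdet
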

MSC:

90C30 Nonlinear programming

References:

[1] Al-Baali, M., Numerical experience with a class of self-scaling quasi-Newton algorithms, J. Optim. Theory Appl., 96, 3, 533-553 (1998) · Zbl 0907.90240
[2] Andrei, N.: Critica Raţiunii Algoritmilor de Optimizare fără Restricţii. [Criticism of the Unconstrained Optimization Algorithms Reasoning]. Editura Academiei Române, Bucureşti (2009)
[3] Andrei, N., Acceleration of conjugate gradient algorithms for unconstrained optimization, Appl. Math. Comput., 213, 2, 361-369 (2009) · Zbl 1172.65027
[4] Andrei, N., A new three-term conjugate gradient algorithm for unconstrained optimization, Numer. Algorithms, 68, 305-321 (2015) · Zbl 1321.65092
[5] Andrei, N., An adaptive conjugate gradient algorithm for large-scale unconstrained optimization, J. Comput. Appl. Math., 292, 83-91 (2016) · Zbl 1321.90124
[6] Andrei, N., Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization, Optim. Methods Softw., 32, 3, 534-551 (2017) · Zbl 1368.49057
[7] Andrei, N.: UOP—a collection of 80 unconstrained optimization test problems. Technical Report No. 7/2018, November 17, Research Institute for Informatics, Bucharest, Romania. (2018)
[8] Axelsson, O.; Lindskog, G., On the rate of convergence of the preconditioned conjugate gradient methods, Numer. Math., 48, 499-523 (1986) · Zbl 0564.65017
[9] Babaie-Kafaki, S., On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae, J. Optim. Theory Appl., 167, 1, 91-101 (2015) · Zbl 1327.90394
[10] Babaie-Kafaki, S., A modified scaling parameter for the memoryless BFGS updating formula, Numer. Algorithms, 72, 2, 425-433 (2016) · Zbl 1342.90227
[11] Birgin, E.; Martínez, JM, A spectral conjugate gradient method for unconstrained optimization, Appl. Math. Optim., 43, 2, 117-128 (2001) · Zbl 0990.90134
[12] Bongartz, I.; Conn, AR; Gould, NIM; Toint, PL, CUTE: constrained and unconstrained testing environments, ACM Trans. Math. Softw., 21, 123-160 (1995) · Zbl 0886.65058
[13] Broyden, CG, The convergence of a class of double-rank minimization algorithms. I. General considerations, J. Inst. Math. Appl., 6, 76-90 (1970) · Zbl 0223.65023
[14] Byrd, R.; Nocedal, J., A tool for the analysis of quasi-Newton methods with application to unconstrained minimization, SIAM J. Numer. Anal., 26, 727-739 (1989) · Zbl 0676.65061
[15] Dai, YH, Convergence analysis of nonlinear conjugate gradient methods (Chapter 8), in: Wang, Y.; Yagola, AG; Yang, C. (eds.), Optimization and Regularization for Computational Inverse Problems and Applications, 157-181, Higher Education Press, Beijing and Springer, Berlin (2010) · Zbl 1218.65054
[16] Dai, YH; Kou, CX, A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search, SIAM J. Optim., 23, 1, 296-320 (2013) · Zbl 1266.49065
[17] Dai, YH; Liao, LZ, New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim., 43, 87-101 (2001) · Zbl 0973.65050
[18] Dennis, JE; Wolkowicz, H., Sizing and least-change secant methods, SIAM J. Numer. Anal., 30, 5, 1291-1314 (1993) · Zbl 0802.65081
[19] Dolan, ED; Moré, JJ, Benchmarking optimization software with performance profiles, Math. Program., 91, 201-213 (2002) · Zbl 1049.90004
[20] Fletcher, R., A new approach to variable metric algorithms, Comput. J., 13, 317-322 (1970) · Zbl 0207.17402
[21] Fletcher, R., A new variational result for quasi-Newton formulae, SIAM J. Optim., 1, 18-21 (1991) · Zbl 0752.90064
[22] Gilbert, JC; Nocedal, J., Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2, 21-42 (1992) · Zbl 0767.90082
[23] Goldfarb, D., A family of variable-metric methods derived by variational means, Math. Comput., 24, 23-26 (1970) · Zbl 0196.18002
[24] Hager, WW; Zhang, H., A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim., 16, 170-192 (2005) · Zbl 1093.90085
[25] Hager, WW; Zhang, H., Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, ACM Trans. Math. Softw., 32, 1, 113-137 (2006) · Zbl 1346.90816
[26] Hager, WW; Zhang, H., The limited memory conjugate gradient method, SIAM J. Optim., 23, 4, 2150-2168 (2013) · Zbl 1298.90129
[27] Hestenes, MR; Stiefel, E., Methods of conjugate gradients for solving linear systems, J. Res. Natl. Bur. Stand., 49, 409-436 (1952) · Zbl 0048.09901
[28] Kaporin, IE, New convergence results and preconditioning strategies for the conjugate gradient methods, Numer. Linear Algebra Appl., 1, 179-210 (1994) · Zbl 0837.65027
[29] Kratzer, D.; Parter, SV; Steuerwalt, M., Block splittings for the conjugate gradient method, Comput. Fluids, 11, 255-279 (1983) · Zbl 0526.76003
[30] Liu, DC; Nocedal, J., On the limited memory BFGS method for large scale optimization, Math. Program., 45, 503-528 (1989) · Zbl 0696.90048
[31] Luenberger, DG, Introduction to Linear and Nonlinear Programming, Addison-Wesley, Reading (1984) · Zbl 0571.90051
[32] Nocedal, J., Updating quasi-Newton matrices with limited storage, Math. Comput., 35, 773-782 (1980) · Zbl 0464.65037
[33] Nocedal, J.; Yuan, YX, Analysis of a self-scaling quasi-Newton method, Math. Program., 61, 19-37 (1993) · Zbl 0794.90067
[34] Oren, SS; Spedicato, E., Optimal conditioning of self-scaling variable metric algorithms, Math. Program., 10, 70-90 (1976) · Zbl 0342.90045
[35] Oren, SS, Self-scaling variable metric (SSVM) algorithms. Part II: implementation and experiments, Manag. Sci., 20, 5, 863-874 (1974) · Zbl 0316.90065
[36] Oren, SS; Luenberger, DG, Self-scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms, Manag. Sci., 20, 845-862 (1974) · Zbl 0316.90064
[37] Perry, A.: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University, IL, USA (1977)
[38] Powell, MJD, Nonconvex minimization calculations and the conjugate gradient method, in: Griffiths, DF (ed.), Numerical Analysis (Dundee, 1983), Lecture Notes in Mathematics, 122-141 (1984) · Zbl 0531.65035
[39] Shanno, DF, Conditioning of quasi-Newton methods for function minimization, Math. Comput., 24, 647-656 (1970) · Zbl 0225.65073
[40] Shanno, DF, Conjugate gradient methods with inexact searches, Math. Oper. Res., 3, 244-256 (1978) · Zbl 0399.90077
[41] Winther, R., Some superlinear convergence results for the conjugate gradient method, SIAM J. Numer. Anal., 17, 14-17 (1980) · Zbl 0447.65021
[42] Wolfe, P., Convergence conditions for ascent methods, SIAM Rev., 11, 226-235 (1969) · Zbl 0177.20603
[43] Wolfe, P., Convergence conditions for ascent methods. II: some corrections, SIAM Rev., 13, 185-188 (1971) · Zbl 0216.26901
[44] Zoutendijk, G., Nonlinear programming, computational methods, in: Abadie, J. (ed.), Integer and Nonlinear Programming, 38-86, North-Holland, Amsterdam (1970) · Zbl 0321.00011