
Scaled conjugate gradient algorithms for unconstrained optimization. (English) Zbl 1168.90608

Summary: We present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG by E. G. Birgin and J. M. Martínez [Appl. Math. Optimization 43, No. 2, 117–128 (2001; Zbl 0990.90134)], which is mainly a scaled variant of A. Perry's method [Northwestern University, Center for Mathematical Studies in Economics and Management Science, Evanston, IL, discussion paper 269 (1977)], is modified in such a manner as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded in the Beale-Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results, for a set of 500 unconstrained optimization test problems, show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient SCG algorithm.
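To make the ingredients mentioned above concrete, the following is a minimal Python sketch of one spectral-scaled conjugate gradient iteration in the spirit of the Birgin-Martínez SCG direction, combined with an inexact Wolfe line search and a simple restart safeguard. It is an illustrative sketch under stated assumptions, not the authors' SCALCG algorithm: the spectral scaling theta (a Barzilai-Borwein-type quotient), the direction and restart rules, and the helper name scaled_cg are assumptions introduced here, and the BFGS-based modification and anticipative scaling of the paper are not reproduced.

import numpy as np
from scipy.optimize import line_search  # inexact line search satisfying the Wolfe conditions

def scaled_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    # Spectral-scaled conjugate gradient sketch (illustrative; not the paper's SCALCG).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                    # Wolfe search failed: restart direction, take a small step
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # Spectral (Barzilai-Borwein-type) scaling of the gradient.
        theta = (s @ s) / sy if sy > 1e-12 else 1.0
        # Birgin-Martinez-type scaled direction: d = -theta*g + beta*s.
        beta = ((theta * y - s) @ g_new) / sy if sy > 1e-12 else 0.0
        d_new = -theta * g_new + beta * s
        # Restart with the scaled steepest descent direction if d_new is not a descent direction.
        if g_new @ d_new >= 0.0:
            d_new = -theta * g_new
        x, g, d = x_new, g_new, d_new
    return x

# Usage: minimize an ill-conditioned convex quadratic f(x) = 0.5 x^T A x.
A = np.diag([1.0, 10.0, 100.0])
x_star = scaled_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.array([1.0, 1.0, 1.0]))
print(x_star)  # approximately the zero vector

On such a strongly convex quadratic the spectral quotient (s'*s)/(s'*y) captures inverse curvature information along the latest step; in the paper the scaling parameter is instead chosen as the spectral gradient or anticipatively from the function values at two successive points.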

MSC:

90C30 Nonlinear programming
90C52 Methods of reduced gradient type

Citations:

Zbl 0990.90134
Full Text: DOI

References:

[1] Andrei, N.: A new gradient descent method for unconstrained optimization. ICI Technical report (March 2004)
[2] Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988) · Zbl 0638.65055 · doi:10.1093/imanum/8.1.141
[3] Birgin, E., Martínez, J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43, 117–128 (2001) · Zbl 0990.90134 · doi:10.1007/s00245-001-0003-0
[4] Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995) · Zbl 0886.65058 · doi:10.1145/200979.201043
[5] Cauchy, A.: Méthode générale pour la résolution des systèmes d’équations simultanées. C. R. Acad. Sci. Paris 25, 536–538 (1847)
[6] Dai, Y.H., Liao, L.Z.: New conjugate conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001) · Zbl 0973.65050 · doi:10.1007/s002450010019
[7] Fletcher, R.: On the Barzilai–Borwein method. Numerical analysis report NA/207 (2001) · Zbl 1118.90318
[8] Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964) · Zbl 0132.11701 · doi:10.1093/comjnl/7.2.149
[9] Golub, G.H., O’Leary, D.P.: Some history of the conjugate gradient and Lanczos algorithms: 1948–1976. SIAM Rev. 31, 50–102 (1989) · Zbl 0673.65017 · doi:10.1137/1031003
[10] Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005) · Zbl 1093.90085 · doi:10.1137/030601880
[11] Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006) · Zbl 1117.90048
[12] Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952) · Zbl 0048.09901
[13] Perry, A.: A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University (1977)
[14] Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Inform. Rech. Opér. 16, 35–43 (1969)
[15] Powell, M.J.D.: Restart procedures for the conjugate gradient method. Math. Program. 12, 241–254 (1977) · Zbl 0396.90072 · doi:10.1007/BF01593790
[16] Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997) · Zbl 0898.90119 · doi:10.1137/S1052623494266365
[17] Shanno, D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3, 244–256 (1978) · Zbl 0399.90077 · doi:10.1287/moor.3.3.244
[18] Shanno, D.F.: On the convergence of a new conjugate gradient algorithm. SIAM J. Numer. Anal. 15, 1247–1257 (1978) · Zbl 0408.90071 · doi:10.1137/0715085
[19] Shanno, D.F., Phua, K.H.: Algorithm 500. Minimization of unconstrained multivariate functions [E4]. ACM Trans. Math. Softw. 2, 87–94 (1976) · Zbl 0319.65042 · doi:10.1145/355666.355673
[20] Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969) · Zbl 0177.20603 · doi:10.1137/1011036
[21] Wolfe, P.: Convergence conditions for ascent methods II: some corrections. SIAM Rev. 13, 185–188 (1971) · Zbl 0216.26901 · doi:10.1137/1013035