
A new descent algorithm with curve search rule. (English) Zbl 1069.65065

A globally convergent curve search algorithm for solving unconstrained minimization problems is developed. The curves that underlie the step direction and step size procedure at each iteration are rational expressions in the curve parameter \(\alpha\); numerator and denominator depend linearly on \(\alpha\). The method bears some similarity to conjugate gradient methods, and Wolfe-type line search rules are incorporated into the stepping procedure. Numerical experiments compare the proposed method with some standard algorithms.
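The idea of searching along a curve that is rational in \(\alpha\), rather than along a straight ray, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the specific curve \(x(\alpha) = x + \alpha d/(1 + c\alpha)\), the steepest-descent choice of \(d\), the parameter \(c\), and the use of a simple Armijo-type acceptance test in place of the full Wolfe-type rules are all assumptions made for the sketch.

```python
import numpy as np

def curve_search_step(f, g, x, c=1.0, delta=1e-4, alpha=1.0,
                      shrink=0.5, max_tries=50):
    """One descent step along a curve that is rational in alpha (sketch).

    Trial points follow x(alpha) = x + alpha*d / (1 + c*alpha), whose
    numerator and denominator are both linear in alpha. Here d is the
    steepest-descent direction and acceptance uses an Armijo-type
    sufficient-decrease test, a simplified stand-in for Wolfe-type rules.
    """
    d = -g(x)                      # hypothetical choice of search direction
    fx = f(x)
    slope = np.dot(g(x), d)        # directional derivative at x along d (< 0)
    for _ in range(max_tries):
        step = alpha / (1.0 + c * alpha)   # effective step length on the curve
        x_new = x + step * d
        if f(x_new) <= fx + delta * step * slope:
            return x_new           # sufficient decrease achieved
        alpha *= shrink            # otherwise backtrack along the curve
    return x_new                   # fall back to the last trial point
```

Note that as \(\alpha \to \infty\) the curve point stays bounded (the step length tends to \(1/c\)), which is one practical appeal of rational curves over straight-line searches.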

MSC:

65K05 Numerical mathematical programming methods
90C30 Nonlinear programming
