
A new class of supermemory gradient methods. (English) Zbl 1111.65055

An unconstrained optimization problem, the minimization of a continuously differentiable function \(f:\mathbb R^n\to \mathbb R\), is considered. A new class of supermemory gradient methods for this problem is proposed, together with conditions that guarantee global convergence. In a sufficiently small neighborhood of the optimal solution the algorithms can be reduced to quasi-Newton methods. Unlike many line search methods, the proposed algorithms use not only the current iterative information but also information obtained at preceding iterations. Numerical results reported in the concluding part of the paper show the effectiveness of the proposed algorithms in practical computation.
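
To make the memory mechanism concrete, the following is a minimal sketch of a generic supermemory gradient iteration in the spirit of Cragg and Levy [5]: the search direction combines the current negative gradient with the m most recent directions, and the step size comes from a backtracking (Armijo) line search. The function name, the geometric weights, and the parameters m, beta, sigma are illustrative assumptions; this is not the specific class or step-size rule proposed in the reviewed paper.

# Minimal sketch of a generic supermemory gradient method (illustrative
# weights and Armijo parameters; not the paper's specific class).
import numpy as np

def supermemory_gradient(f, grad, x0, m=3, beta=0.4, sigma=1e-4,
                         tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    dirs = []                      # memory of up to m previous directions
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # direction: steepest descent plus a damped sum of stored directions
        d = -g + sum(0.5 ** (i + 1) * di
                     for i, di in enumerate(reversed(dirs)))
        if g @ d >= 0:             # safeguard: fall back to steepest descent
            d = -g
        # backtracking (Armijo) line search
        alpha = 1.0
        while f(x + alpha * d) > f(x) + sigma * alpha * (g @ d) and alpha > 1e-12:
            alpha *= beta
        x = x + alpha * d
        dirs.append(d)             # update the direction memory
        if len(dirs) > m:
            dirs.pop(0)
    return x

# example: minimize the Rosenbrock function
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
print(supermemory_gradient(rosen, rosen_grad, [-1.2, 1.0]))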

MSC:

65K05 Numerical mathematical programming methods
90C30 Nonlinear programming
90C53 Methods of quasi-Newton type

Software:

L-BFGS; minpack

References:

[1] Ariyawansa, K. A., Line search termination criteria for collinear scaling algorithms for minimizing a class of convex functions, Numer. Math., 80, 363-376 (1998) · Zbl 0916.65061
[2] Al-Baali, M., Improved Hessian approximations for the limited memory BFGS method, Numer. Algorithms, 22, 99-112 (1999) · Zbl 0949.65063
[3] Burdakov, O. P.; Martínez, J. M.; Pilotta, E. A., A limited-memory multipoint symmetric secant method for bound constrained optimization, Operations Research and Systems (CLAIO 2000), Part II (Mexico City), Ann. Oper. Res., 117, 51-70 (2002) · Zbl 1025.90038
[4] Cantrell, J. W., On the relation between the memory gradient and the Fletcher-Reeves method, J. Optim. Theory Appl., 4, 1, 67-71 (1969) · Zbl 0167.08804
[5] Cragg, E. E.; Levy, A. V., Study on a supermemory gradient method for the minimization of functions, J. Optim. Theory Appl., 4, 3, 191-205 (1969) · Zbl 0172.19002
[6] Dixon, L. C.W., Conjugate directions without line searches, J. Inst. Math. Appl., 11, 317-328 (1973) · Zbl 0259.65060
[7] Dai, Y. H.; Yuan, Y., A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim., 10, 177-182 (1999) · Zbl 0957.65061
[8] Dai, Y. H., New properties of a nonlinear conjugate gradient method, Numer. Math., 89, 83-98 (2001) · Zbl 1006.65063
[9] Dai, Y. H.; Yuan, Y., Global convergence of the method of shortest residuals, Numer. Math., 83, 581-598 (1999) · Zbl 0949.65065
[10] Dai, Y. H.; Yuan, Y. X., Convergence property of the conjugate descent method, Adv. Math., 25, 552-562 (1996) · Zbl 0871.49028
[11] Di Fiore, C.; Fanelli, S.; Lepore, F.; Zellini, P., Matrix algebras in quasi-Newton methods for unconstrained optimization, Numer. Math., 94, 479-500 (2003) · Zbl 1034.65045
[12] Dontchev, A. L.; Qi, H. D.; Qi, L., Convergence of Newton’s method for convex best interpolation, Numer. Math., 87, 435-456 (2001) · Zbl 0970.65009
[13] Fletcher, R.; Reeves, C., Function minimization by conjugate gradients, Comput. J., 7, 149-154 (1964) · Zbl 0132.11701
[14] Fletcher, R., Practical Methods of Optimization, vol. 1: Unconstrained Optimization (1987), John Wiley & Sons: John Wiley & Sons New York · Zbl 0905.65002
[15] Gilbert, J. C.; Nocedal, J., Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim., 2, 1, 21-42 (1992) · Zbl 0767.90082
[16] Grippo, L.; Lucidi, S., A globally convergent version of the Polak-Ribière conjugate gradient method, Math. Program., 78, 375-391 (1997) · Zbl 0887.90157
[17] Grippo, L.; Lampariello, F.; Lucidi, S., A class of nonmonotone stabilization methods in unconstrained optimization, Numer. Math., 62, 779-805 (1991) · Zbl 0724.90060
[18] Gill, P. E.; Leonard, M. W., Limited-memory reduced-Hessian methods for large-scale unconstrained optimization, SIAM J. Optim., 14, 380-401 (2003) · Zbl 1046.65045
[19] Gilbert, J. C.; Nocedal, J., Automatic differentiation and the step computation in the limited memory BFGS method, Appl. Math. Lett., 6, 47-50 (1993)
[20] Hu, Y. F.; Storey, C., Global convergence restart for conjugate gradient methods, J. Optim. Theory Appl., 71, 2, 399-405 (1991) · Zbl 0794.90063
[21] Huang, H. Y.; Chambliss, J. P., Quadratically convergent algorithms and one-dimensional search schemes, J. Optim. Theory Appl., 11, 2, 175-188 (1973) · Zbl 0238.65033
[22] Liu, D. C.; Nocedal, J., On the limited memory BFGS method for large scale optimization, Math. Program., 45, Ser. B, 503-528 (1989) · Zbl 0696.90048
[23] Miele, A.; Cantrell, J. W., Study on a memory gradient method for the minimization of functions, J. Optim. Theory Appl., 3, 6, 459-470 (1969) · Zbl 0165.22702
[24] Moré, J. J.; Garbow, B. S.; Hillstrom, K. E., Testing unconstrained optimization software, ACM Trans. Math. Software, 7, 1, 17-41 (1981) · Zbl 0454.65049
[25] Ni, Q., Limited memory quasi-Newton method for large-scale linear equality-constrained minimization, Acta Math. Appl. Sinica (English Ser.), 16, 3, 320-328 (2000) · Zbl 0985.90069
[26] Morales, J. L.; Nocedal, J., Automatic preconditioning by limited memory quasi-Newton updating, SIAM J. Optim., 10, 4, 1079-1096 (2000) · Zbl 1020.65019
[27] Ni, Q., QP-free, truncated hybrid methods for large-scale nonlinear constrained optimization, J. Comput. Math., 15, 36-54 (1997) · Zbl 0872.65050
[28] Ni, Q.; Yuan, Y., A subspace limited memory quasi-Newton algorithm for large-scale nonlinear bound constrained optimization, Math. Comput., 66, 1509-1520 (1997) · Zbl 0886.65065
[29] Nash, S. G.; Nocedal, J., A numerical study of the limited memory BFGS method and the truncated-Newton method for large scale optimization, SIAM J. Optim., 1, 358-372 (1991) · Zbl 0756.65091
[30] Ortega, J. M.; Rheinboldt, W. C., Iterative Solution of Nonlinear Equations in Several Variables (1970), Academic Press: Academic Press New York · Zbl 0241.65046
[31] Sun, L. P., Updating the self-scaling symmetric rank one algorithm with limited memory for large-scale unconstrained optimization, Comput. Optim. Appl., 27, 23-29 (2004) · Zbl 1045.90064
[32] Shi, Z. J., Restricted PR conjugate gradient method and its global convergence, Adv. Math., 31, 1, 47-55 (2002), (in Chinese) · Zbl 1264.90179
[33] Shi, Z. J., Modified HS conjugate gradient method and its global convergence, Math. Numer. Sinica, 23, 4, 393-406 (2001), (in Chinese) · Zbl 1495.90234
[34] Shi, Z. J.; Shen, J., A new descent algorithm with curve search rule, Appl. Math. Comput., 161, 753-768 (2005) · Zbl 1069.65065
[35] Shi, Z. J., Convergence of multi-step curve search method for unconstrained optimization, J. Numer. Math., 12, 4, 297-309 (2004) · Zbl 1068.65080
[36] Shi, Z. J.; Shen, J., A new super-memory gradient method with curve search rule, Appl. Math. Comput., 170, 1-16 (2005) · Zbl 1081.65058
[37] Shi, Z. J., Convergence of quasi-Newton method with new inexact line search, J. Math. Anal. Appl., 315, 120-131 (2006) · Zbl 1093.65063
[38] Shi, Z. J.; Shen, J., New inexact line search method for unconstrained optimization, J. Optim. Theory Appl., 127, 425-446 (2005) · Zbl 1116.90097
[39] Vlček, J.; Lukšan, L., Shifted limited-memory variable metric methods for large-scale unconstrained optimization, J. Comput. Appl. Math., 186, 365-390 (2006) · Zbl 1080.65050
[40] Wolfe, M. A.; Viazminsky, C., Supermemory descent methods for unconstrained minimization, J. Optim. Theory Appl., 18, 4, 455-468 (1976) · Zbl 0304.49023