
Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization. (English) Zbl 1258.65059

Conjugate gradient methods are very efficient for solving large-scale unconstrained optimization problems. Their main advantage is that no matrices need to be stored, which greatly reduces memory requirements. When computing a search direction, many choices for the parameter \(\beta\) (which characterizes the conjugate gradient method), along with modifications thereof, are available. To incorporate second-order information about the objective function into conjugate gradient methods, many variants based on secant conditions have been proposed. Although such methods possess the global convergence property, they do not necessarily satisfy the (sufficient) descent condition.
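To illustrate the matrix-free iteration the review describes, here is a minimal sketch of a nonlinear conjugate gradient method in Python. It is not the authors' method: it uses the classical Fletcher-Reeves choice of \(\beta\) [2] and a simple Armijo backtracking line search, both assumptions made for brevity; only vectors are stored at each step.

```python
import numpy as np

def cg_fletcher_reeves(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear CG: x_{k+1} = x_k + alpha_k d_k,
    d_{k+1} = -g_{k+1} + beta_{k+1} d_k.
    Only a few vectors are kept in memory; no matrix is ever formed."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying the Armijo condition
        # (an illustrative choice; Wolfe-type searches are standard in practice).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_fletcher_reeves(f, grad, np.zeros(2))
```

With an inexact line search, the search direction generated by such a generic \(\beta\) is not guaranteed to be a descent direction for general objectives, which is precisely the gap the paper under review addresses.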
In this paper, the authors propose new, efficient conjugate gradient methods for solving large-scale unconstrained optimization problems that generate descent search directions and are globally convergent for general objective functions. The methods combine the secant-condition-based approach of Y.-H. Dai and L.-Z. Liao [Appl. Math. Optim. 43, No. 1, 87–101 (2001; Zbl 0973.65050)] with the formula for \(\beta\) proposed by W. W. Hager and H. Zhang [SIAM J. Optim. 16, No. 1, 170–192 (2005; Zbl 1093.90085)].
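For context, the two cited ingredients take the following standard forms (with \(g_k=\nabla f(x_k)\), \(s_{k-1}=x_k-x_{k-1}\), \(y_{k-1}=g_k-g_{k-1}\), and \(d_k\) the search direction); the precise combination used by the authors is given in the paper itself. The Dai-Liao conjugacy condition, derived from a secant condition, reads
\[
d_k^{\mathsf T} y_{k-1} = -t\, g_k^{\mathsf T} s_{k-1}, \qquad t \ge 0,
\]
while the Hager-Zhang parameter is
\[
\beta_k^{\mathrm{HZ}} = \frac{1}{d_{k-1}^{\mathsf T} y_{k-1}}
\left( y_{k-1} - 2 d_{k-1}\, \frac{\|y_{k-1}\|^2}{d_{k-1}^{\mathsf T} y_{k-1}} \right)^{\!\mathsf T} g_k ,
\]
whose construction guarantees the sufficient descent condition \(g_k^{\mathsf T} d_k \le -c\,\|g_k\|^2\) for some constant \(c>0\), independently of the line search.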

MSC:

65K05 Numerical mathematical programming methods
90C30 Nonlinear programming
90C06 Large-scale problems in mathematical programming

Software:

CUTE; CG_DESCENT; CUTEr

References:

[1] Hestenes, M. R.; Stiefel, E., Methods of conjugate gradients for solving linear systems, Journal of Research of the National Bureau of Standards, 49, 409-436 (1952) · Zbl 0048.09901
[2] Fletcher, R.; Reeves, C. M., Function minimization by conjugate gradients, The Computer Journal, 7, 149-154 (1964) · Zbl 0132.11701
[3] Nocedal, J.; Wright, S. J., (Numerical Optimization. Numerical Optimization, Springer Series in Operations Research (2006), Springer-Verlag: Springer-Verlag New York) · Zbl 1104.65059
[4] Gilbert, J. C.; Nocedal, J., Global convergence properties of conjugate gradient methods for optimization, SIAM Journal on Optimization, 2, 21-42 (1992) · Zbl 0767.90082
[5] Dai, Y. H.; Yuan, Y., A nonlinear conjugate gradient method with a strong global convergence property, SIAM Journal on Optimization, 10, 177-182 (1999) · Zbl 0957.65061
[6] Hager, W. W.; Zhang, H., A survey of nonlinear conjugate gradient methods, Pacific Journal of Optimization, 2, 35-58 (2006) · Zbl 1117.90048
[7] Dai, Y. H.; Liao, L. Z., New conjugacy conditions and related nonlinear conjugate gradient methods, Applied Mathematics and Optimization, 43, 87-101 (2001) · Zbl 0973.65050
[8] Yabe, H.; Takano, M., Global convergence properties of nonlinear conjugate gradient methods with modified secant condition, Computational Optimization and Applications, 28, 203-225 (2004) · Zbl 1056.90130
[9] Ford, J. A.; Narushima, Y.; Yabe, H., Multi-step nonlinear conjugate gradient methods for unconstrained minimization, Computational Optimization and Applications, 40, 191-216 (2008) · Zbl 1181.90221
[10] Zhou, W.; Zhang, L., A nonlinear conjugate gradient method based on the MBFGS secant condition, Optimization Methods and Software, 21, 707-714 (2006) · Zbl 1112.90096
[11] Kobayashi, M.; Narushima, Y.; Yabe, H., Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems, Journal of Computational and Applied Mathematics, 234, 375-397 (2010) · Zbl 1223.65043
[12] Sugiki, K.; Narushima, Y.; Yabe, H., Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization, Journal of Optimization Theory and Applications, in press (http://dx.doi.org/10.1007/s10957-011-9960-x) · Zbl 1262.90170
[13] Narushima, Y.; Yabe, H.; Ford, J. A., A three-term conjugate gradient method with sufficient descent property for unconstrained optimization, SIAM Journal on Optimization, 21, 212-230 (2011) · Zbl 1250.90087
[14] Hager, W. W.; Zhang, H., A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM Journal on Optimization, 16, 170-192 (2005) · Zbl 1093.90085
[15] Yu, G.; Guan, L.; Li, G., Global convergence of modified Polak-Ribière-Polyak conjugate gradient methods with sufficient descent property, Journal of Industrial and Management Optimization, 4, 565-579 (2008) · Zbl 1168.65030
[16] Yuan, G., Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems, Optimization Letters, 3, 11-21 (2009) · Zbl 1154.90623
[17] Perry, A., A modified conjugate gradient algorithm, Operations Research, 26, 1073-1078 (1978) · Zbl 0419.90074
[18] Zhang, J. Z.; Deng, N. Y.; Chen, L. H., New quasi-Newton equation and related methods for unconstrained optimization, Journal of Optimization Theory and Applications, 102, 147-167 (1999) · Zbl 0991.90135
[19] Zhang, J. Z.; Xu, C. X., Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, Journal of Computational and Applied Mathematics, 137, 269-278 (2001) · Zbl 1001.65065
[20] Li, D. H.; Fukushima, M., A modified BFGS method and its global convergence in nonconvex minimization, Journal of Computational and Applied Mathematics, 129, 15-35 (2001) · Zbl 0984.65055
[21] Ford, J. A.; Moghrabi, I. A., Alternative parameter choices for multi-step quasi-Newton methods, Optimization Methods and Software, 2, 357-370 (1993)
[22] Ford, J. A.; Moghrabi, I. A., Multi-step quasi-Newton methods for optimization, Journal of Computational and Applied Mathematics, 50, 305-323 (1994) · Zbl 0807.65062
[23] Zhang, L., New versions of the Hestenes-Stiefel nonlinear conjugate gradient method based on the secant condition for optimization, Computational & Applied Mathematics, 28, 111-133 (2009) · Zbl 1168.65032
[24] Zoutendijk, G., Nonlinear programming, computational methods, (Abadie, J., Integer and Nonlinear Programming (1970), North-Holland: North-Holland Amsterdam), 37-86 · Zbl 0336.90057
[25] Bongartz, I.; Conn, A. R.; Gould, N. I.M.; Toint, P. L., CUTE: constrained and unconstrained testing environment, ACM Transactions on Mathematical Software, 21, 123-160 (1995) · Zbl 0886.65058
[26] Gould, N. I. M.; Orban, D.; Toint, P. L., CUTEr web site. http://www.cuter.rl.ac.uk/
[27] Hager, W. W.; Zhang, H., Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent, ACM Transactions on Mathematical Software, 32, 113-137 (2006) · Zbl 1346.90816
[28] Hager, W. W.; Zhang, H., CG_DESCENT version 1.4 User's Guide, University of Florida, November 2005. http://www.math.ufl.edu/~hager/
[29] Dolan, E. D.; Moré, J. J., Benchmarking optimization software with performance profiles, Mathematical Programming, 91, 201-213 (2002) · Zbl 1049.90004