Abstract
Based on a singular value analysis of the Dai–Liao conjugate gradient method, it is shown that when the gradient approximately lies along the direction of maximum magnification by the search direction matrix, the method may suffer from computational errors and convergence may be slow. We therefore derive a formula for the Dai–Liao parameter that makes the direction of maximum magnification by the search direction matrix orthogonal to the gradient. Global convergence of the corresponding Dai–Liao method is briefly discussed, both with and without a convexity assumption on the objective function. Numerical experiments on a set of test problems from the CUTEr collection demonstrate the practical effectiveness of the suggested adaptive choice of the Dai–Liao parameter in the sense of the Dolan–Moré performance profile.
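The quantity the abstract refers to can be illustrated numerically. In the Dai–Liao family the search direction can be written as d_{k+1} = -H_k g_{k+1} with H_k = I - s_k (y_k - t s_k)^T / (s_k^T y_k), where t is the Dai–Liao parameter; this form of H_k is an assumption based on the standard Dai–Liao literature, not code from the paper. The direction of maximum magnification of H_k is its top right singular vector v_1, and the abstract's concern is the case where g_{k+1} is nearly parallel to v_1. A minimal sketch of checking this alignment:

```python
import numpy as np

def max_magnification_cosine(s, y, g, t):
    """|cos(angle)| between the gradient g and the direction of maximum
    magnification of the Dai-Liao search direction matrix H_k.

    H_k = I - s (y - t*s)^T / (s^T y)  (assumed Dai-Liao matrix form).
    """
    n = len(s)
    H = np.eye(n) - np.outer(s, y - t * s) / (s @ y)
    # Right singular vector for the largest singular value of H:
    # numpy returns singular values in descending order, so row 0 of Vt.
    _, _, Vt = np.linalg.svd(H)
    v1 = Vt[0]
    # v1 is a unit vector, so this is the cosine magnitude.
    return abs(g @ v1) / np.linalg.norm(g)

# Hypothetical data, only to exercise the function.
rng = np.random.default_rng(0)
s, y, g = rng.standard_normal((3, 4))
c = max_magnification_cosine(s, y, g, t=1.0)
```

A value of `c` near 1 corresponds to the problematic case described in the abstract; the paper's adaptive choice of t is designed to drive this quantity to zero.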
References
Andrei N (2007) Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud Inform Control 16(4):333–352
Andrei N (2011) Open problems in conjugate gradient algorithms for unconstrained optimization. Bull Malays Math Sci Soc 34(2):319–330
Andrei N (2016) An adaptive conjugate gradient algorithm for large-scale unconstrained optimization. J Comput Appl Math 292(1):83–91
Andrei N (2017) A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer Algorithms 77:1273–1282. https://doi.org/10.1007/s11075-017-0362-5
Babaie-Kafaki S (2014) On the sufficient descent condition of the Hager–Zhang conjugate gradient methods. 4OR 12(3):285–292
Babaie-Kafaki S, Ghanbari R (2014) The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur J Oper Res 234(3):625–630
Babaie-Kafaki S, Ghanbari R (2014) A descent family of Dai–Liao conjugate gradient methods. Optim Methods Softw 29(3):583–591
Babaie-Kafaki S, Ghanbari R (2014) Two modified three-term conjugate gradient methods with sufficient descent property. Optim Lett 8(8):2285–2297
Babaie-Kafaki S, Ghanbari R (2015) Two optimal Dai–Liao conjugate gradient methods. Optimization 64(11):2277–2287
Babaie-Kafaki S, Ghanbari R (2017a) A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR 15(1):85–92
Babaie-Kafaki S, Ghanbari R (2017b) A class of descent four-term extension of the Dai–Liao conjugate gradient method based on the scaled memoryless BFGS update. J Ind Manag Optim 3(2):649–658
Babaie-Kafaki S, Ghanbari R (2017c) Extensions of the Hestenes–Stiefel and Polak–Ribière–Polyak conjugate gradient methods with sufficient descent property. Bull Iran Math Soc 43(7):2437–2448
Babaie-Kafaki S, Ghanbari R (2017d) Two adaptive Dai–Liao nonlinear conjugate gradient methods. Iran J Sci Technol Trans Sci 42:1505–1509. https://doi.org/10.1007/s40995-017-0271-4
Babaie-Kafaki S, Ghanbari R, Mahdavi-Amiri N (2010) Two new conjugate gradient methods based on modified secant equations. J Comput Appl Math 234(5):1374–1386
Dai YH, Han JY, Liu GH, Sun DF, Yin HX, Yuan YX (1999) Convergence properties of nonlinear conjugate gradient methods. SIAM J Optim 10(2):348–358
Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320
Dai YH, Liao LZ (2001) New conjugacy conditions and related nonlinear conjugate gradient methods. Appl Math Optim 43(1):87–101
Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2):201–213
Fatemi M (2016) A new efficient conjugate gradient method for unconstrained optimization. J Comput Appl Math 300(1):207–216
Fatemi M (2016) An optimal parameter for Dai–Liao family of conjugate gradient methods. J Optim Theory Appl 169(2):587–605
Fatemi M, Babaie-Kafaki S (2016) Two extensions of the Dai–Liao method with sufficient descent property based on a penalization scheme. Bull Comput Appl Math 4(1):7–19
Ford JA, Narushima Y, Yabe H (2008) Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput Optim Appl 40(2):191–216
Gilbert JC, Nocedal J (1992) Global convergence properties of conjugate gradient methods for optimization. SIAM J Optim 2(1):21–42
Gould NIM, Orban D, Toint PhL (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394
Hager WW, Zhang H (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16(1):170–192
Hager WW, Zhang H (2006) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137
Hager WW, Zhang H (2006) A survey of nonlinear conjugate gradient methods. Pac J Optim 2(1):35–58
Hestenes MR, Stiefel E (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49(6):409–436
Li G, Tang C, Wei Z (2007) New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J Comput Appl Math 202(2):523–539
Livieris IE, Pintelas P (2012) A descent Dai–Liao conjugate gradient method based on a modified secant equation and its global convergence. ISRN Comput Math 2012: Article ID 435495, 8 pages
Narushima Y, Yabe H (2012) Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization. J Comput Appl Math 236(17):4303–4317
Nocedal J, Wright SJ (2006) Numerical optimization. Springer, New York
Perry A (1978) A modified conjugate gradient algorithm. Oper Res 26(6):1073–1078
Peyghami MR, Ahmadzadeh H, Fazli A (2015) A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family. Optim Methods Softw 30(4):843–863
Powell MJD (1986) Convergence properties of algorithms for nonlinear optimization. SIAM Rev 28(4):487–500
Sun W, Yuan YX (2006) Optimization theory and methods: nonlinear programming. Springer, New York
Watkins DS (2002) Fundamentals of matrix computations. Wiley, New York
Zhou W, Zhang L (2006) A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim Methods Softw 21(5):707–714
Acknowledgements
This research was supported by the Research Council of Semnan University (Grant no. 139704261033). The authors thank the anonymous reviewers and the Associate Editor for their valuable comments and suggestions, which helped to improve the quality of this work. They are also grateful to Professor Michael Navon for providing the line search code.
Cite this article
Aminifard, Z., Babaie-Kafaki, S. An optimal parameter choice for the Dai–Liao family of conjugate gradient methods by avoiding a direction of the maximum magnification by the search direction matrix. 4OR-Q J Oper Res 17, 317–330 (2019). https://doi.org/10.1007/s10288-018-0387-1
Keywords
- Nonlinear programming
- Unconstrained optimization
- Conjugate gradient method
- Maximum magnification
- Global convergence