Abstract
Satisfying the sufficient descent condition is a strength of a conjugate gradient method. Here, it is shown that, under the Wolfe line search conditions, the search directions generated by the memoryless BFGS conjugate gradient algorithm proposed by Shanno satisfy the sufficient descent condition for uniformly convex objective functions.
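For reference, the quantities discussed in the abstract can be sketched in common notation (the symbols below follow the usual conventions of the cited works and are not defined in the abstract itself; see Shanno (1978) and Wolfe (1969, 1971) for the precise statements). With g_k = ∇f(x_k), s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k, the sufficient descent condition, the Wolfe conditions, and the memoryless BFGS direction read, respectively,

\begin{align*}
  % sufficient descent condition, with a constant c > 0
  g_k^T d_k &\le -c\,\|g_k\|^2, \\
  % Wolfe line search conditions, with 0 < \delta < \sigma < 1
  f(x_k + \alpha_k d_k) &\le f(x_k) + \delta\,\alpha_k\, g_k^T d_k,
  \qquad
  g(x_k + \alpha_k d_k)^T d_k \ge \sigma\, g_k^T d_k, \\
  % memoryless BFGS search direction (BFGS update applied to the identity)
  d_{k+1} &= -g_{k+1}
    + \frac{g_{k+1}^T s_k}{s_k^T y_k}\, y_k
    - \left[\left(1 + \frac{y_k^T y_k}{s_k^T y_k}\right)
      \frac{g_{k+1}^T s_k}{s_k^T y_k}
      - \frac{g_{k+1}^T y_k}{s_k^T y_k}\right] s_k .
\end{align*}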
References
Andrei N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16(4), 333–352 (2007)
Andrei N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)
Andrei N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)
Andrei N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)
Andrei N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
Birgin E., Martínez J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)
Broyden C.G.: The convergence of a class of double-rank minimization algorithms. II. The new algorithm. J. Inst. Math. Appl. 6(1), 222–231 (1970)
Dai Y.H., Han J.Y., Liu G.H., Sun D.F., Yin H.X., Yuan Y.X.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10(2), 348–358 (1999)
Dai Y.H., Liao L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Dai Y.H., Ni Q.: Testing different conjugate gradient methods for large-scale unconstrained optimization. J. Comput. Math. 22(3), 311–320 (2003)
Dai Y.H., Yuan Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Du D., Pardalos P.M., Wu W.: Mathematical Theory of Optimization. Kluwer Academic Publishers, Dordrecht (2001)
Fletcher R.: A new approach to variable metric algorithms. Comput. J. 13(3), 317–322 (1970)
Gilbert J.C., Nocedal J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Goldfarb D.: A family of variable-metric methods derived by variational means. Math. Comput. 24(109), 23–26 (1970)
Hager W.W., Zhang H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager W.W., Zhang H.: A survey of nonlinear conjugate gradient methods. Pacific J. Optim. 2(1), 35–58 (2006)
Hestenes M.R., Stiefel E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bureau Standards 49(6), 409–436 (1952)
Perry A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)
Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983). Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)
Shanno D.F.: Conditioning of quasi-Newton methods for function minimization. Math. Comput. 24(111), 647–656 (1970)
Shanno D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3(3), 244–256 (1978)
Sun W., Yuan Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Wolfe P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)
Wolfe P.: Convergence conditions for ascent methods, II. Some corrections. SIAM Rev. 13(2), 185–188 (1971)
Cite this article
Babaie-Kafaki, S. On the sufficient descent property of the Shanno’s conjugate gradient method. Optim Lett 7, 831–837 (2013). https://doi.org/10.1007/s11590-012-0462-z