Global convergence of a modified limited memory BFGS method for non-convex minimization

Published in: Acta Mathematicae Applicatae Sinica, English Series

Abstract

In this paper, a modified limited memory BFGS method for solving large-scale unconstrained optimization problems is proposed. A remarkable feature of the proposed method is that it possesses the global convergence property without any convexity assumption on the objective function. Under some suitable conditions, the global convergence of the proposed method is proved. Numerical results are reported which illustrate that the proposed method is efficient.
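
The abstract does not spell out the specific modification that yields global convergence without convexity. As a rough point of reference only, the sketch below (Python/NumPy, not the authors' algorithm) shows the standard limited memory BFGS iteration that such methods build on: the two-loop recursion applied to the m most recent correction pairs, combined with an Armijo backtracking line search. The curvature safeguard used when storing the pairs (s_k, y_k) is an illustrative placeholder, not the update rule analyzed in the paper.

```python
# Minimal sketch of a standard L-BFGS iteration (two-loop recursion plus
# Armijo backtracking). This is NOT the paper's modified method: the rule
# that secures global convergence on non-convex problems is not given in
# the abstract, so a plain positive-curvature check is used as a stand-in.
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Return the search direction -H_k * grad from stored pairs (s_i, y_i)."""
    q = grad.copy()
    alphas = []
    # First loop: newest pair to oldest.
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / np.dot(y, s)
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append((alpha, rho, s, y))
    # Initial Hessian approximation H_0 = gamma * I (scaled identity).
    if s_list:
        s, y = s_list[-1], y_list[-1]
        gamma = np.dot(s, y) / np.dot(y, y)
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for alpha, rho, s, y in reversed(alphas):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r

def lbfgs(f, grad_f, x0, m=5, max_iter=200, tol=1e-6):
    """Plain L-BFGS with a backtracking Armijo line search (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    s_list, y_list = [], []
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = lbfgs_direction(g, s_list, y_list)
        # Backtracking until the Armijo sufficient-decrease condition holds.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * np.dot(g, d):
            t *= 0.5
            if t < 1e-12:
                break
        x_new = x + t * d
        s, y = x_new - x, grad_f(x_new) - g
        # Placeholder safeguard: keep only pairs with sufficiently positive
        # curvature so the implicit Hessian approximation stays definite.
        if np.dot(s, y) > 1e-10 * np.dot(s, s):
            s_list.append(s)
            y_list.append(y)
            if len(s_list) > m:
                s_list.pop(0)
                y_list.pop(0)
        x = x_new
    return x

# Example: minimize the non-convex Rosenbrock function.
if __name__ == "__main__":
    rosen = lambda x: 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2
    rosen_grad = lambda x: np.array([
        -400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
        200 * (x[1] - x[0] ** 2),
    ])
    print(lbfgs(rosen, rosen_grad, np.array([-1.2, 1.0])))
```

Modified limited memory BFGS methods of the kind studied here typically alter the stored vectors y_k, or the condition under which a correction pair is accepted, so that a descent direction and bounded Hessian approximations are guaranteed even when the objective is not convex; the placeholder check above only hints at that idea.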



Author information

Corresponding author

Correspondence to Yun-hai Xiao.

Additional information

Supported by the National Natural Science Foundation of China (Grants 11001075 and 11161003), the Post-doctoral Foundation of China (Grant 20090461094), and the Natural Science Foundation of the Henan Province Education Department (Grant 2010B110004).

About this article

Cite this article

Xiao, Y.-h., Li, T.-f. & Wei, Z.-x. Global convergence of a modified limited memory BFGS method for non-convex minimization. Acta Math. Appl. Sin. Engl. Ser. 29, 555–566 (2013). https://doi.org/10.1007/s10255-013-0233-3

