
A family of accelerated hybrid conjugate gradient method for unconstrained optimization and image restoration

  • Original Research
  • Published:
Journal of Applied Mathematics and Computing

Abstract

In this paper, a family of hybrid conjugate gradient methods with a restart procedure is proposed. We design a hybrid conjugate parameter by combining two hybrid techniques and incorporate a restart procedure into the search direction according to this parameter. Under the usual assumptions and the weak Wolfe line search, we prove the sufficient descent property and global convergence of the family. Finally, we select a specific algorithm from this family and apply an acceleration scheme to solve large-scale unconstrained optimization and image restoration problems. The numerical results show that our algorithm is effective.
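The concrete family of conjugate parameters and the acceleration scheme are given in the full text. As a rough, generic illustration of the ingredients named in the abstract (a hybridized conjugate parameter, a restart in the search direction, and a weak Wolfe line search), the sketch below implements a plain hybrid conjugate gradient iteration in Python. The PRP/FR clipping, the restart test, the bisection-style line search, and all constants are illustrative placeholders, not the authors' formulas.

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha0=1.0, max_iter=50):
    """Bisection-style search for a step satisfying the weak Wolfe conditions.
    A simple placeholder, not the line search used in the paper."""
    lo, hi = 0.0, np.inf
    alpha = alpha0
    fx, gxd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gxd:
            hi = alpha                       # sufficient decrease fails: shrink step
        elif grad(x + alpha * d) @ d < c2 * gxd:
            lo = alpha                       # curvature condition fails: enlarge step
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000, restart_thresh=0.2):
    """Generic hybrid CG: beta is clipped between two classical choices (PRP and FR,
    chosen only for illustration), and the direction restarts to steepest descent
    whenever it loses a sufficient descent safeguard."""
    x = x0.copy()
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        alpha = weak_wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_prp = (g_new @ y) / (g @ g)           # Polak-Ribiere-Polyak parameter
        beta_fr = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves parameter
        beta = max(0.0, min(beta_prp, beta_fr))    # a simple classical hybridization
        d_new = -g_new + beta * d
        # restart: fall back to steepest descent if d_new is not a good descent direction
        if g_new @ d_new > -restart_thresh * np.linalg.norm(g_new) * np.linalg.norm(d_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

if __name__ == "__main__":
    # Illustrative run on the Rosenbrock function
    f = lambda x: 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
                               200.0 * (x[1] - x[0] ** 2)])
    print(hybrid_cg(f, grad, np.array([-1.2, 1.0])))
```

In the paper, the hybrid parameter and the restart condition are constructed so that the resulting direction satisfies a sufficient descent property under the weak Wolfe conditions; the placeholder restart test above only mimics that safeguard.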


Data availability

Data will be made available on request.

Notes

  1. https://github.com/jhyin-optim/FHTTCGMs with applications


Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 12361063), the Guangxi Science and Technology Program (Grant No. AD23023001), and the Natural Science Foundation of Guangxi Province (Grant No. 2016GXNSFAA380028).

Author information

Authors and Affiliations

Authors

Contributions

Xiaodi Wu: Validation, Algorithm design, Theoretical analysis, Writing - review. Xiaomin Ye: Theoretical analysis, Numerical experiments, Writing - review & editing. Daolan Han: Numerical experiments.

Corresponding author

Correspondence to Xiaomin Ye.

Ethics declarations

Conflict of interest

The authors declared no potential conflicts of interest with respect to the research, authorship, and publication of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wu, X., Ye, X. & Han, D. A family of accelerated hybrid conjugate gradient method for unconstrained optimization and image restoration. J. Appl. Math. Comput. 70, 2677–2699 (2024). https://doi.org/10.1007/s12190-024-02069-5

