
Accelerated augmented Lagrangian method for total variation minimization

Published in Computational and Applied Mathematics

Abstract

The augmented Lagrangian method TVAL3 (Li et al., Comput Optim Appl 56(3):507–530, 2013), which combines an alternating direction technique with a nonmonotone line search to minimize the augmented Lagrangian function at each iteration, is a very efficient method for total variation image restoration. In this paper we present an accelerated augmented Lagrangian method based on TVAL3 for total variation image restoration. It is widely accepted that the stepsize is crucial to the performance of numerical algorithms, especially gradient-based methods. We design a new quadratic approximation model to generate an efficient, approximately optimal stepsize, truncate it using the two well-known Barzilai–Borwein (BB) stepsizes, and use the resulting approximately optimal stepsize to accelerate the augmented Lagrangian method. We establish the convergence of the proposed method under a weaker condition. Numerical experiments show that the proposed method is very promising.
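The truncation idea mentioned in the abstract can be sketched as follows. This is a hedged illustration, not the authors' quadratic approximation model: it only shows how a trial "approximately optimal" stepsize might be clipped into the interval spanned by the two classical BB stepsizes. The trial value, the toy quadratic, and the function names are invented for the example.

```python
import numpy as np

def bb_stepsizes(s, y):
    """Classical BB stepsizes for s = x_k - x_{k-1}, y = g_k - g_{k-1}:
    BB1 = (s's)/(s'y), BB2 = (s'y)/(y'y). For s'y > 0, Cauchy-Schwarz
    gives BB2 <= BB1."""
    sy = s @ y
    return (s @ s) / sy, sy / (y @ y)

def truncated_stepsize(alpha_trial, s, y):
    """Clip a trial stepsize into the interval [BB2, BB1]."""
    bb1, bb2 = bb_stepsizes(s, y)
    return min(max(alpha_trial, bb2), bb1)

# Toy strictly convex quadratic f(x) = 0.5 x'Ax, gradient g(x) = Ax,
# so the gradient difference is y = A s.
A = np.diag([1.0, 4.0, 9.0])
x_prev = np.array([1.0, 1.0, 1.0])
x_curr = np.array([0.8, 0.5, 0.2])
s = x_curr - x_prev
y = A @ (x_curr - x_prev)

print(truncated_stepsize(0.5, s, y))  # trial 0.5 lies above BB1, so it is clipped down
```

Here the clipping keeps a too-aggressive trial stepsize inside the BB interval; the paper's actual safeguard and approximation model may differ in detail.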


Figures 1–4 (images omitted in this preview)


References

  • Barzilai J, Borwein JM (1988) Two-point step size gradient methods. IMA J Numer Anal 8:141–148

  • Beck A, Teboulle M (2009) A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM J Imaging Sci 2(1):183–202

  • Becker S, Bobin J, Candès EJ (2011) NESTA: a fast and accurate first-order method for sparse recovery. SIAM J Imaging Sci 4(1):1–39

  • Bioucas-Dias JM, Figueiredo MAT (2007) A new TwIST: two-step iterative shrinkage/thresholding algorithms for image restoration. IEEE Trans Image Process 16(12):2992–3004

  • Bioucas-Dias JM, Figueiredo MAT (2007) Two-step algorithms for linear inverse problems with non-quadratic regularization. In: IEEE international conference on image processing, San Antonio, pp I-105–I-108

  • Dai YH, Liao LZ (2002) R-linear convergence of the Barzilai and Borwein gradient method. IMA J Numer Anal 22(1):1–10

  • Figueiredo MAT, Nowak RD, Wright SJ (2007) Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems. IEEE J Sel Top Signal Process 1(4):586–597

  • Ford J, Moghrabi I (1993) Alternative parameter choices for multi-step quasi-Newton methods. Optim Methods Softw 2(3):357–370

  • Ford J, Moghrabi I (1994) Multi-step quasi-Newton methods for optimization. J Comput Appl Math 50(93):305–323

  • Hale ET, Yin WT, Zhang Y (2008) Fixed-point continuation for \( l_1 \)-minimization: methodology and convergence. SIAM J Optim 19(3):1107–1130

  • Han BS, Yang YH (2019) An integro-PDE model with variable motility. Nonlinear Anal Real World Appl 45:186–199

  • Huang YK, Liu HW (2015) A Barzilai–Borwein type method for minimizing composite functions. Numer Algorithms 69(4):819–838

  • Huang YK, Liu HW (2016) Smoothing projected Barzilai–Borwein method for constrained non-Lipschitz optimization. Comput Optim Appl 65(3):671–698

  • Huang YK, Liu HW, Zhou S (2014) A Barzilai–Borwein type method for stochastic linear complementarity problem. Numer Algorithms 67(3):477–489

  • Huang YK, Liu HW, Zhou S (2015) An efficient monotone projected Barzilai–Borwein method for nonnegative matrix factorization. Appl Math Lett 45:12–17

  • Krejic N, Jerinkic NK, Rapajic S (2015) Barzilai–Borwein method with variable sample size for stochastic linear complementarity problems. Optimization 65(2):479–499

  • Li CB, Yin W, Jiang H, Zhang Y (2013) An efficient augmented Lagrangian method with applications to total variation minimization. Comput Optim Appl 56(3):507–530

  • Li C, Zhang Y, Yin W. http://www.caam.rice.edu/~optimization/L1/TVAL3/

  • Liu ZX, Liu HW (2018a) An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer Algorithms 78(1):21–39

  • Liu ZX, Liu HW (2018b) Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J Comput Appl Math 328:400–413

  • Liu ZX, Liu HW, Dong XL (2018) An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization 67(3):427–440

  • Molina B, Raydan M (1996) Preconditioned Barzilai–Borwein method for the numerical solution of partial differential equations. Numer Algorithms 13(1):45–60

  • Nesterov Y (2005) Smooth minimization of non-smooth functions. Math Program 103(1):127–152

  • Osher S, Burger M, Goldfarb D, Xu JJ, Yin WT (2005) An iterative regularization method for total variation based image restoration. SIAM J Multiscale Model Simul 4(2):460–489

  • Raydan M (1993) On the Barzilai and Borwein choice of steplength for the gradient method. IMA J Numer Anal 13(3):321–326

  • Raydan M (1997) The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J Optim 7(1):26–33

  • Rudin L, Osher S, Fatemi E (1992) Nonlinear total variation based noise removal algorithms. Phys D Nonlinear Phenom 60(1–4):259–268

  • Sopyla K, Drozda P (2015) Stochastic gradient descent with Barzilai–Borwein update step for SVM. Inf Sci 316:218–233

  • Wang YL, Yang JF, Yin WT, Zhang Y (2008) A new alternating minimization algorithm for total variation image reconstruction. SIAM J Imaging Sci 1(3):248–272

  • Wright SJ, Nowak RD, Figueiredo MAT (2009) Sparse reconstruction by separable approximation. IEEE Trans Signal Process 57(7):3373–3376

  • Yuan YX, Sun WY (1999) Theory and methods of optimization. Science Press of China, Beijing

  • Zhang HC, Hager WW (2004) A nonmonotone line search technique and its application to unconstrained optimization. SIAM J Optim 14:1043–1056

  • Zheng YT, Zheng B (2017) A new modified Barzilai–Borwein gradient method for the quadratic minimization problem. J Optim Theory Appl 172(1):179–186

  • Zhou B, Gao L, Dai YH (2006) Gradient methods with adaptive stepsizes. Comput Optim Appl 35(1):69–86


Acknowledgements

This research was supported by the National Natural Science Foundation of China (No. 11461021), the Guangxi Science Foundation (Nos. 2017GXNSFBA198031, 2018GXNSFBA281180), the Shaanxi Science Foundation (No. 2017JM1014), the Scientific Research Foundation of the Guangxi Education Department (No. 2013YB236), the Scientific Research Project of Hezhou University (Nos. 2014YBZK06, 2016HZXYSX03), and the Guangxi Colleges and Universities Key Laboratory of Symbolic Computation and Engineering Data Processing.

Author information

Corresponding author

Correspondence to Zexian Liu.

Additional information

Communicated by Paulo J. S. Silva.



About this article


Cite this article

Liu, Z., Liu, H. & Wang, X. Accelerated augmented Lagrangian method for total variation minimization. Comp. Appl. Math. 38, 50 (2019). https://doi.org/10.1007/s40314-019-0787-7


