A general approach to convergence properties of some methods for nonsmooth convex optimization. (English) Zbl 0910.90228

Summary: Based on the notion of the \(\varepsilon\)-subgradient, we present a unified technique to establish convergence properties of several methods for nonsmooth convex minimization problems. Starting from these technical results, we obtain the global convergence of: (i) the variable metric proximal methods presented by Bonnans, Gilbert, Lemaréchal, and Sagastizábal, (ii) some algorithms proposed by Correa and Lemaréchal, and (iii) the proximal point algorithm given by Rockafellar. In particular, we prove that the Rockafellar-Todd phenomenon does not occur for any of the above-mentioned methods. Moreover, we explore the convergence rate of \(\{\| x_k\|\}\) and \(\{f(x_k)\}\) when \(\{x_k\}\) is unbounded and \(\{f(x_k)\}\) is bounded for the nonsmooth minimization methods (i), (ii), and (iii).
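
For orientation, the two central objects named in the summary can be written out explicitly; these are the standard textbook definitions, and the notation below (in particular the step sizes \(\lambda_k\)) is illustrative rather than the paper's own. The \(\varepsilon\)-subdifferential of a convex function \(f\) at \(x\) is
\[
\partial_\varepsilon f(x) = \bigl\{\, s \in \mathbb{R}^n : f(y) \ge f(x) + \langle s,\, y - x \rangle - \varepsilon \ \ \text{for all } y \in \mathbb{R}^n \,\bigr\},
\]
and the (exact) proximal point iteration of Rockafellar referred to in (iii) reads
\[
x_{k+1} = \operatorname*{arg\,min}_{y \in \mathbb{R}^n} \Bigl\{\, f(y) + \tfrac{1}{2\lambda_k}\, \| y - x_k \|^2 \,\Bigr\}, \qquad \lambda_k > 0 .
\]
The variable metric variants in (i) replace the squared Euclidean norm by a norm induced by an iteration-dependent positive definite matrix.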

MSC:

90C25 Convex programming
90C30 Nonlinear programming
90C33 Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming)
49J52 Nonsmooth analysis