On lower complexity bounds for large-scale smooth convex optimization. (English) Zbl 1304.65155

Summary: We derive lower bounds on the black-box oracle complexity of large-scale smooth convex minimization problems, with emphasis on minimizing smooth convex functions (those whose gradient is Hölder continuous with a given exponent and constant) over high-dimensional \(\| \cdot \|_p\)-balls, \(1 \leq p \leq \infty\). Our bounds turn out to be tight (up to factors logarithmic in the design dimension) and can be viewed as a substantial extension of the existing lower complexity bounds for large-scale convex minimization, which cover the nonsmooth case and the “Euclidean” smooth case (minimization of convex functions with Lipschitz continuous gradients over Euclidean balls). As a byproduct of our results, we demonstrate that the classical conditional gradient algorithm is near-optimal, in the sense of information-based complexity theory, when minimizing smooth convex functions over high-dimensional \(\| \cdot \|_\infty\)-balls and their matrix analogues, spectral norm balls in the spaces of square matrices.
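To illustrate the algorithm whose near-optimality the summary asserts, here is a minimal sketch (not from the paper) of the conditional gradient method over an \(\| \cdot \|_\infty\)-ball. It relies only on the fact that the linear minimization oracle over this ball has the closed form \(s = -r\,\operatorname{sign}(g)\); the function name, step-size schedule, and test objective are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_linf(grad, x0, radius=1.0, iters=100):
    """Conditional gradient (Frank-Wolfe) sketch over the l_inf ball of a
    given radius. `grad` returns the gradient of the smooth convex objective.

    The linear minimization oracle over {x : ||x||_inf <= radius} is computed
    coordinate-wise: s = -radius * sign(grad(x)).
    """
    x = np.asarray(x0, dtype=float)
    for t in range(iters):
        g = grad(x)
        s = -radius * np.sign(g)        # LMO: argmin over the ball of <g, s>
        gamma = 2.0 / (t + 2.0)         # standard diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Example use: minimize f(x) = 0.5 * ||x - c||_2^2 over ||x||_inf <= 1,
# where c lies inside the ball, so the minimizer is c itself.
c = np.array([0.5, -0.3, 0.8])
x_star = frank_wolfe_linf(lambda x: x - c, np.zeros(3), radius=1.0, iters=500)
```

Each iterate is a convex combination of points in the ball, so feasibility is maintained without any projection step, which is the practical appeal of the method in the high-dimensional regimes the paper studies.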

MSC:

65K05 Numerical mathematical programming methods
90C25 Convex programming
65Y20 Complexity and performance of numerical algorithms
