
Sum-of-squares relaxations in robust DC optimization and feature selection. (English) Zbl 1536.90142

Summary: This paper presents sum-of-squares (SOS) relaxation results for difference-of-convex-max (DC-max) optimization problems involving SOS-convex polynomials in the face of constraint data uncertainty, together with their applications to robust feature selection. The main novelty of the present work, in relation to recent research in robust convex and DC optimization, is the derivation of a new form of minimally exact SOS relaxations for robust DC-max problems. This leads to the identification of broad classes of robust DC-max problems with finitely exact SOS relaxations that are numerically tractable. These relaxations allow one to find the optimal values of such DC-max problems by solving a known finite number of semi-definite programs (SDPs) for certain concrete cases of commonly used uncertainty sets in robust optimization. In particular, we derive relaxation results for a class of robust fractional programs. We also provide a finitely exact SDP relaxation for a DC approximation problem of an NP-hard robust feature selection model, which gives computable upper bounds for the global optimal value.
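As a rough schematic (an illustration based on the summary, not the paper's exact model), a robust DC-max problem of the type described can be written as
\[
\min_{x\in\mathbb{R}^n}\Big\{ f(x)-\max_{1\le j\le q} g_j(x)\Big\}\quad\text{subject to}\quad h_i(x,u_i)\le 0\ \text{ for all } u_i\in\mathcal{U}_i,\ i=1,\dots,m,
\]
where $f$, $g_1,\dots,g_q$ and $h_i(\cdot,u_i)$ are SOS-convex polynomials and $\mathcal{U}_i$ are the uncertainty sets; the SOS relaxations replace the resulting polynomial nonnegativity conditions by sum-of-squares certificates, which turns each relaxation into a semi-definite program.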

MSC:

90C17 Robustness in mathematical programming
90C26 Nonconvex programming, global optimization
90C32 Fractional programming
90C22 Semidefinite programming

Software:

CVX
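For context, CVX is a MATLAB package for disciplined convex programming used for the numerical experiments. The snippet below is a minimal, hypothetical sketch (not from the paper) of the kind of SDP feasibility problem that an SOS certificate reduces to, written with cvxpy as a Python stand-in for CVX; the polynomial and monomial basis are illustrative choices.

```python
# Minimal sketch (assumption, not the paper's code): certify that
# p(x) = x^4 + 2x^2 + 1 is a sum of squares by finding a PSD Gram
# matrix Q with p(x) = [1, x, x^2] Q [1, x, x^2]^T, via an SDP.
import cvxpy as cp

Q = cp.Variable((3, 3), symmetric=True)
constraints = [
    Q >> 0,                       # Gram matrix must be positive semidefinite
    Q[0, 0] == 1,                 # constant coefficient
    2 * Q[0, 1] == 0,             # coefficient of x
    2 * Q[0, 2] + Q[1, 1] == 2,   # coefficient of x^2
    2 * Q[1, 2] == 0,             # coefficient of x^3
    Q[2, 2] == 1,                 # coefficient of x^4
]
prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility SDP
prob.solve(solver=cp.SCS)
print("SOS certificate found:", prob.status == cp.OPTIMAL)
```

The same pattern, with a larger monomial basis and additional affine coefficient-matching constraints, underlies the tractable SDP reformulations that CVX-type solvers handle in practice.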

References:

[1] Ahmadi, AA; Parrilo, PA, A complete characterization of the gap between convexity and SOS-convexity, SIAM J. Optim., 23, 2, 811-833 (2013) · Zbl 1295.52009
[2] Ben-Tal, A.; El Ghaoui, L.; Nemirovski, A., Robust Optimization (2009), Princeton: Princeton University Press, Princeton · Zbl 1221.90001
[3] Ben-Tal, A.; Nemirovski, A., Lectures on Modern Convex Optimization: Analysis, Algorithms, and Engineering Applications (2001), Philadelphia: SIAM, Philadelphia · Zbl 0986.90032
[4] Bradley, P. S., Mangasarian, O. L.: Feature selection via concave minimization and support vector machines. In: Shavlik, J. W. (ed.) International Conference on Machine Learning (ICML), vol. 98, pp. 82-90 (1998)
[5] Bradley, PS; Mangasarian, OL; Street, WN, Feature selection via mathematical programming, INFORMS J. Comput., 10, 2, 209-217 (1998) · Zbl 1034.90529
[6] Bruckstein, AM; Donoho, DL; Elad, M., From sparse solutions of systems of equations to sparse modeling of signals and images, SIAM Rev., 51, 1, 34-81 (2009) · Zbl 1178.68619
[7] Cervantes, J.; García-Lamont, F.; Rodríguez-Mazahua, L.; López, A., A comprehensive survey on support vector machine classification: applications, challenges and trends, Neurocomputing, 408, 189-215 (2020)
[8] Chieu, NH; Chuong, TD; Jeyakumar, V.; Li, G., A copositive Farkas lemma and minimally exact conic relaxations for robust quadratic optimization with binary and quadratic constraints, Oper. Res. Lett., 47, 6, 530-536 (2019) · Zbl 1455.90125
[9] Chieu, NH; Feng, JW; Gao, W.; Li, G.; Wu, D., SOS-convex semialgebraic programs and its applications to robust optimization: a tractable class of nonsmooth convex optimization, Set-Valued Var. Anal., 26, 305-326 (2018) · Zbl 1393.90083
[10] Dinkelbach, W., On nonlinear fractional programming, Manage. Sci., 13, 7, 492-498 (1967) · Zbl 0152.18402
[11] Dunbar, M.; Murray, JM; Cysique, LA; Brew, BJ; Jeyakumar, V., Simultaneous classification and feature selection via convex quadratic programming with application to HIV-associated neurocognitive disorder assessment, Eur. J. Oper. Res., 206, 2, 470-478 (2010) · Zbl 1188.90235
[12] Gaudioso, M.; Gorgone, E.; Hiriart-Urruty, JB, Feature selection in SVM via polyhedral k-norm, Optim. Lett., 14, 1, 19-36 (2020) · Zbl 1433.90133
[13] Gotoh, JY; Takeda, A.; Tono, K., DC formulations and algorithms for sparse optimization problems, Math. Program., 169, 141-176 (2018) · Zbl 06869181
[14] Grant, M., Boyd, S.: CVX: Matlab software for disciplined convex programming (2011). http://cvxr.com/cvx
[15] Harada, R.; Kuroiwa, D., Lagrange-type duality in DC programming, J. Math. Anal. Appl., 418, 1, 415-424 (2014) · Zbl 1314.90064
[16] Helton, JW; Nie, J., Semi-definite representation of convex sets, Math. Program., 120, 2, 21-64 (2010) · Zbl 1192.90143
[17] Hiriart-Urruty, J. B., Lemaréchal, C.: Fundamentals of Convex Analysis. Springer Science & Business Media (2004) · Zbl 0998.49001
[18] Hoerl, AE, Application of ridge analysis to regression problems, Chem. Eng. Prog., 58, 54-59 (1962)
[19] Jeyakumar, V.; Li, G., A new class of alternative theorems for SOS-convex inequalities and robust optimization, Appl. Anal., 94, 1, 56-74 (2015) · Zbl 1338.90302
[20] Jeyakumar, V.; Lee, GM; Linh, NTH, Generalized Farkas’ lemma and gap-free duality for minimax DC optimization with polynomials and robust quadratic optimization, J. Glob. Optim., 64, 679-702 (2016) · Zbl 1346.90808
[21] Jeyakumar, V.; Li, G., Exact SDP relaxations for classes of nonlinear semi-definite programming problems, Oper. Res. Lett., 40, 6, 529-536 (2012) · Zbl 1287.90047
[22] Jeyakumar, V.; Li, G.; Vicente-Perez, J., Robust SOS-convex polynomial optimization problems: exact SDP relaxations, Optim. Lett., 9, 1-18 (2015) · Zbl 1338.90452
[23] Jeyakumar, V.; Vicente-Perez, J., Dual semi-definite programs without duality gaps for a class of convex minimax programs, J. Optim. Theory Appl., 162, 735-753 (2014) · Zbl 1312.90087
[24] Lasserre, JB, An Introduction to Polynomial and Semi-Algebraic Optimization (2015), Cambridge: Cambridge University Press, Cambridge · Zbl 1320.90003
[25] Lasserre, JB, Convexity in semi-algebraic geometry and polynomial optimization, SIAM J. Optim., 19, 4, 1995-2014 (2009) · Zbl 1181.90216
[26] Le Thi, HA; Le, HM; Pham Dinh, T., Feature selection in machine learning: an exact penalty approach using a difference of convex function algorithm, Mach. Learn., 101, 163-186 (2015) · Zbl 1343.68201
[27] Le Thi, H. A., Pham Dinh, T.: Open issues and recent advances in DC programming and DCA. J. Glob. Optim. 1-58 (2023)
[28] Le Thi, HA; Vo, XT; Pham Dinh, T., Feature selection for linear SVMs under uncertain data: robust optimization based on difference of convex functions algorithms, Neural Netw., 59, 36-50 (2014) · Zbl 1327.90236
[29] Lee, JH; Lee, GM, On minimizing difference of an SOS-convex polynomial and a support function over an SOS-concave matrix polynomial constraint, Math. Program., 169, 177-198 (2018) · Zbl 1390.90518
[30] Martínez-Legaz, JE; Volle, M., Duality in DC programming: the case of several DC constraints, J. Math. Anal. Appl., 237, 2, 657-671 (1998) · Zbl 0946.90064
[31] Rockafellar, RT, Convex Analysis (1970), Princeton: Princeton University Press, Princeton · Zbl 0193.18401
[32] Su, CT; Yang, CH, Feature selection for the SVM: an application to hypertension diagnosis, Expert Syst. Appl., 34, 1, 754-763 (2008)
[33] Tibshirani, R., Regression shrinkage and selection via the Lasso, J. R. Stat. Soc. B, 58, 1, 267-288 (1996) · Zbl 0850.62538
[34] Woolnough, D., Jeyakumar, N., Li, G., Loy, C. T., Jeyakumar, V.: Robust optimization and data classification for characterization of Huntington disease onset via duality methods. J. Optim. Theory Appl. 1-27 (2022) · Zbl 1492.90113
[35] Zhang, W.; Hong, B.; Liu, W.; Ye, J.; Cai, D.; He, X.; Wang, J., Scaling up sparse support vector machines by simultaneous feature and sample reduction, J. Mach. Learn. Res., 20, 121, 1-39 (2019) · Zbl 1434.68415