A novel support vector machine with generalized pinball loss for uncertain data classification. (English) Zbl 1536.68011

Summary: In real-world problems, data suffer from measurement errors, staleness, and repeated measurements, which make them uncertain. To account for this uncertainty, each data example is modeled as a multidimensional Gaussian distribution. In 2021, support vector machines with an \(\epsilon\)-insensitive zone pinball loss (UPinSVMs) were proposed for uncertain data classification, where \(\epsilon\) is a positive number. The UPinSVMs bring noise insensitivity, stability under resampling, and increased model sparsity; however, the value of \(\epsilon\) must be specified in advance. To improve performance, we incorporate the generalized pinball loss (\((\epsilon_1, \epsilon_2)\)-Mod-Pin-SVM) into uncertain data classification, termed UGPinSVMs, where \(\epsilon_1\) and \(\epsilon_2\) are two positive numbers. The generalized pinball loss is a pinball loss with an optimal insensitive zone and extends existing loss functions, while also addressing noise sensitivity and resampling instability. We solve the primal quadratic programming problems by transforming each of them into an unconstrained optimization problem, which is handled by an efficient stochastic gradient descent algorithm. More specifically, we introduce and verify theorems related to our approach and investigate scatter minimization. Results on several benchmark datasets show that our model outperforms existing classifiers in terms of accuracy and statistical analysis. Furthermore, an application of our framework to a crop recommendation dataset is also examined.
© 2023 John Wiley & Sons Ltd.
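Reviewer's illustration (not taken from the paper): the sketch below shows, under stated assumptions, a generalized pinball loss in one common parametrization together with a stochastic subgradient step for the unconstrained linear primal evaluated on per-example mean vectors. The names generalized_pinball, sgd_train and the parameters tau1, tau2, eps1, eps2, C, lr are illustrative choices only, and the covariance (scatter) terms of the UGPinSVM formulation are deliberately omitted.

import numpy as np

def generalized_pinball(u, tau1=0.5, tau2=0.5, eps1=0.1, eps2=0.1):
    # One common parametrization of the generalized pinball loss:
    #   L(u) = max( tau1*(u - eps1/tau1), 0, -tau2*(u + eps2/tau2) );
    # it reduces to an eps-insensitive pinball loss when tau1 = tau2 and
    # eps1 = eps2.  The exact form used in the reviewed paper may differ.
    u = np.asarray(u, dtype=float)
    return np.maximum.reduce([tau1 * (u - eps1 / tau1),
                              np.zeros_like(u),
                              -tau2 * (u + eps2 / tau2)])

def generalized_pinball_subgrad(u, tau1=0.5, tau2=0.5, eps1=0.1, eps2=0.1):
    # A valid subgradient of the piecewise-linear loss, evaluated elementwise.
    u = np.atleast_1d(np.asarray(u, dtype=float))
    g = np.zeros_like(u)
    g[u > eps1 / tau1] = tau1
    g[u < -eps2 / tau2] = -tau2
    return g

def sgd_train(X, y, C=1.0, lr=0.01, epochs=50, seed=0, **loss_kw):
    # Stochastic subgradient descent on the unconstrained linear primal
    #   min_{w,b}  0.5*||w||^2 + C * sum_i L(1 - y_i*(w @ x_i + b)),
    # where x_i is taken as the mean vector of the i-th uncertain example
    # and y_i is its label in {-1, +1}.  Only the deterministic part of the
    # uncertain-data model is mirrored here.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            u = 1.0 - y[i] * (X[i] @ w + b)
            g = generalized_pinball_subgrad(u, **loss_kw)[0]
            w -= lr * (w - C * g * y[i] * X[i])
            b -= lr * (-C * g * y[i])
    return w, b

For instance, w, b = sgd_train(X_means, y) on the matrix of per-example mean vectors yields a baseline linear separator to which the uncertainty-aware scatter terms of the paper's formulation would then be added.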

MSC:

68T05 Learning and adaptive systems in artificial intelligence
62H30 Classification and discrimination; cluster analysis (statistical aspects)
68Q25 Analysis of algorithms and problem complexity
90C20 Quadratic programming
90C90 Applications of mathematical programming
Full Text: DOI

References:

[1] C.Cortes and V. N.Vapnik, Support‐vector networks, Mach. Learn.20 (1995), no. 3, 273-297. · Zbl 0831.68098
[2] V. N.Vapnik, An overview of statistical learning theory, IEEE Trans. Neural Netw.10 (1999), 988-999.
[3] D.Tao, X.Tang, and X.Wu, Asymmetric bagging and random subspace for support vector machines‐based relevance feedback in image retrieval, IEEE Trans. Pattern Anal. Mach. Intell.28 (2008), 1088-1099.
[4] S.Anupam and A. K.Kar, Phishing website detection using support vector machines and nature‐inspired optimization algorithms, Telecommun. Syst.76 (2021), 17-32.
[5] V.Parameswari and S.Pushpalatha, Human activity recognition using SVM and deep learning, European J. Mole. Clin. Med.7 (2020), no. 4, 1984-1990.
[6] Z.Tian, Backtracking search optimization algorithm‐based least square support vector machine and its applications, Eng. Appl. Artif. Intell.94 (2020), 103801, DOI 10.1016/j.engappai.2020.103801.
[7] C.Wang, Y.Dong, Y.Xia, G.Li, O. S.Martínez, and R. G.Crespo, Management and entrepreneurship management mechanism of college students based on support vector machine algorithm, Comput. Intell.38 (2022), 842-854, DOI 10.1111/coin.12430.
[8] H.Wang, Y.Shao, S.Zhou, C.Zhang, and N.Xiu, Support vector machine classifier via \(l_{0/1}\) soft‐margin loss, IEEE Trans. Pattern Anal. Mach. Intell.44 (2021), 7253-7265, DOI 10.1109/TPAMI.2021.3092177.
[9] H.‐J.Xing and Z.‐C.He, Adaptive loss function based least squares one‐class support vector machine, Pattern Recogn. Lett.156 (2022), 174-182, DOI 10.1016/j.patrec.2022.03.009.
[10] P.Anand, R.Rastogi, and S.Chandra, A new asymmetric \(\epsilon\)‐insensitive pinball loss function based support vector quantile regression model, Appl. Soft Comput.94 (2020), 106473, DOI 10.1016/j.asoc.2020.106473.
[11] X.Huang, L.Shi, and J. A.Suykens, Support vector machine classifier with pinball loss, IEEE Trans. Pattern Anal. Mach. Intell.35 (2014), no. 6, 984-997.
[12] R.Rastogi (née Khemchandani), A.Pal, and S.Chandra, Generalized pinball loss SVMs, Neurocomputing 322 (2018), 151-165.
[13] K. K.Sharma and A.Seal, Outlier‐robust multi‐view clustering for uncertain data, Knowl.‐Based Syst.211 (2021), 106567, DOI 10.1016/j.knosys.2020.106567.
[14] R.Pelissari, M. C.Oliveira, A. J.Abackerli, S.Ben‐Amor, and M. R. P.Assumpção, Techniques to model uncertain input data of multi‐criteria decision‐making problems: a literature review, Int. Trans. Oper. Res.28 (2021), no. 2, 523-559, DOI 10.1111/itor.12598. · Zbl 07768056
[15] Z.Xie, Y.Xu, and Q.Hu, Uncertain data classification with additive kernel support vector machine, Data Knowl. Eng.117 (2018), 87-97.
[16] C.Tzelepis, V.Mezaris, and I.Patras, Linear maximum margin classifier for learning from uncertain data, IEEE Trans. Pattern Anal. Mach. Intell.40 (2018), 2948-2962.
[17] Z.Liang and L.Zhang, Support vector machines with the \(\epsilon\)‐insensitive pinball loss function for uncertain data classification, Neurocomputing457 (2021), 117-127.
[18] M.Abramowitz and I. A.Stegun, Handbook of mathematical functions, 1972. · Zbl 0543.33001
[19] H.Robbins and S.Monro, A stochastic approximation method, Springer (1985), 102-109.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.