
Incremental Huber-support vector regression based online robust parameter design. (English) Zbl 07850718

Summary: In response-surface-based RPD, the optimal setting of the controllable factors depends heavily on the accuracy of the response surface. Classically, accuracy is improved by adding more samples: the larger the sample size, the more accurate the surface. Traditional RPD usually builds the response surface with a one-shot modeling method, so whenever new samples arrive, the model must relearn all samples from scratch to rebuild the response surface. This one-shot approach significantly increases both the training time and the training complexity. We present an incremental strategy for building response models, based on the Huber support vector regression (Huber-SVR) machine. In this article, an incremental Huber-SVR model is proposed to construct the response surface in robust parameter design. The proposed algorithm continuously integrates new sample information into the already built model. In incremental HSVR-RPD, the optimal settings of the previous controllable factors, the currently observed noise factors, and the corresponding responses can be used to improve the accuracy of the response surface, so as to obtain more reliable recommended settings in the next stage.
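The stage-wise updating idea can be sketched as follows. This is not the authors' incremental Huber-SVR algorithm (whose update rules are not reproduced here); it is a minimal analogue using scikit-learn's SGDRegressor, which supports a Huber loss and incremental fitting via partial_fit. The quadratic response function, the factor distributions, and all hyperparameter values are illustrative assumptions.

```python
# Sketch: incrementally refitting a robust response surface as new
# (controllable factor x, noise factor z, response y) batches arrive.
# SGDRegressor with Huber loss stands in for the paper's Huber-SVR;
# partial_fit updates the model without retraining from scratch.
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

def response(x, z):
    # Hypothetical true quadratic response surface plus observation noise
    return 2.0 + 1.5 * x - 0.8 * x**2 + 0.5 * x * z + rng.normal(0, 0.1, size=x.shape)

# Degree-2 features (x, z, x^2, x*z, z^2) give a quadratic response surface
poly = PolynomialFeatures(degree=2, include_bias=False)
model = SGDRegressor(loss="huber", epsilon=1.35, penalty="l2",
                     alpha=1e-4, random_state=0)

for stage in range(20):
    # Each stage observes a fresh batch of runs
    x = rng.uniform(-1, 1, 50)   # controllable factor settings
    z = rng.normal(0, 1, 50)     # observed noise factor
    y = response(x, z)
    X = poly.fit_transform(np.column_stack([x, z]))
    model.partial_fit(X, y)      # incremental update of the existing fit
```

After each stage the refined surface can be re-optimized over the controllable factor to recommend the next setting, mirroring the sequential RPD loop described in the summary.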

MSC:

62-XX Statistics
Full Text: DOI

References:

[1] Balasundaram, S., and Meena, Y. 2019. Robust support vector regression in primal with asymmetric Huber loss. Neural Processing Letters 49 (3):1399-431.
[2] Balasundaram, S., and Prasad, S. C. 2020. Robust twin support vector regression based on Huber loss function. Neural Computing and Applications 32 (15):11285-309.
[3] Borah, P., and Gupta, D. 2020. Functional iterative approaches for solving support vector classification problems based on generalized Huber loss. Neural Computing and Applications 32 (13):9245-65.
[4] Dixon, L. C. W. 1978. The global optimization problem. An introduction. Toward Global Optimization 2:1-15.
[5] Gupta, D., Hazarika, B. B., and Berlin, M. 2020. Robust regularized extreme learning machine with asymmetric Huber loss function. Neural Computing and Applications 32 (16):12971-98.
[6] Gupta, U., and Gupta, D. 2019. An improved regularization based Lagrangian asymmetric ν-twin support vector regression using pinball loss function. Applied Intelligence 49 (10):3606-27.
[7] Gupta, U., and Gupta, D. 2021a. Least squares large margin distribution machine for regression. Applied Intelligence 51 (10):7058-93.
[8] Gupta, U., and Gupta, D. 2021b. On regularization based twin support vector regression with Huber loss. Neural Processing Letters 53 (1):459-515.
[9] Hazarika, B. B., Gupta, D., and Borah, P. 2021. An intuitionistic fuzzy kernel ridge regression classifier for binary classification. Applied Soft Computing 112:107816.
[10] He, Y., He, Z., Lee, D. H., Kim, K. J., Zhang, L., and Yang, X. 2017. Robust fuzzy programming method for MRO problems considering location effect, dispersion effect and model uncertainty. Computers & Industrial Engineering 105:76-83.
[11] Jiang, T., and Zhou, X. 2018. Gradient/Hessian-enhanced least square support vector regression. Information Processing Letters 134:1-8. · Zbl 1476.62149
[12] Lam, C. Q. 2008. Sequential adaptive designs in computer experiments for response surface model fit [Ph.D. thesis]. The Ohio State University.
[13] Lehman, J. S. 2002. Sequential design of computer experiments for robust parameter design [Ph.D. thesis]. The Ohio State University.
[14] Lin, D. K., and Tu, W. 1995a. Dual response surface optimization. Journal of Quality Technology 27 (1):34-9.
[15] Lin, D. K. J., and Tu, W. 1995b. Dual response surface optimization. Journal of Quality Technology 27 (1):34-9.
[16] Ma, J., Theiler, J., and Perkins, S. 2003. Accurate on-line support vector regression. Neural Computation 15 (11):2683-703. · Zbl 1085.68640
[17] Myers, R. H., Montgomery, D. C., and Anderson-Cook, C. M. 2016. Response surface methodology: Process and product optimization using designed experiments. Hoboken, New Jersey: John Wiley & Sons. · Zbl 1332.62004
[18] Ouyang, L., Chen, J., Ma, Y., Park, C., and Jin, J. 2020. Bayesian closed-loop robust process design considering model uncertainty and data quality. IISE Transactions 52 (3):288-300.
[19] Ouyang, L., Ma, Y., Byun, J. H., Wang, J., and Tu, Y. 2016. An interval approach to robust design with parameter uncertainty. International Journal of Production Research 54 (11):3201-15.
[20] Ouyang, L., Zhu, S., Ye, K., Park, C., and Wang, M. 2022. Robust Bayesian hierarchical modeling and inference using scale mixtures of normal distributions. IISE Transactions 54 (7):659-71.
[21] Su, C. T. 2013. Quality engineering: Off-line methods and applications. Boca Raton: CRC Press.
[22] Taguchi, G. 1986. Introduction to quality engineering: Designing quality into products and processes.
[23] Vanli, O. A., Zhang, C., and Wang, B. 2013. An adaptive Bayesian approach for robust parameter design with observable time series noise factors. IIE Transactions 45 (4):374-90.
[24] Vining, G., and Myers, R. 1990a. Combining Taguchi and response surface philosophies: A dual response approach. Journal of Quality Technology 22 (1):38-45.
[25] Vining, G. G., and Myers, R. H. 1990b. Combining Taguchi and response surface philosophies: A dual response approach. Journal of Quality Technology 22 (1):38-45.
[26] Welch, W. J. 1985. ACED: Algorithms for the construction of experimental designs. The American Statistician 39 (2):146-
[27] Wu, C. J., and Hamada, M. S. 2011. Experiments: Planning, analysis, and optimization (vol. 552). Hoboken, New Jersey: John Wiley & Sons.
[28] Yang, S., Wang, J., and Ma, Y. 2021. Online robust parameter design considering observable noise factors. Engineering Optimization 53 (6):1024-43.
[29] Zhou, X., and Jiang, T. 2016a. Metamodel selection based on stepwise regression. Structural and Multidisciplinary Optimization 54 (3):641-57.
[30] Zhou, X. J., and Jiang, T. 2016b. Enhancing least square support vector regression with gradient information. Neural Processing Letters 43 (1):65-83.
[31] Zhou, X. J., and Ma, Y. Z. 2013. A study on SMO algorithm for solving ε-SVR with non-PSD kernels. Communications in Statistics - Simulation and Computation 42 (10):2175-96. · Zbl 1264.93276
[32] Zhou, X. J., Ma, Y. Z., and Li, X. F. 2011. Ensemble of surrogates with recursive arithmetic average. Structural and Multidisciplinary Optimization 44 (5):651-71.
[33] Zhou, X. J., Ma, Y. Z., Tu, Y. L., and Feng, Y. 2013. Ensemble of surrogates for dual response surface modeling in robust parameter design. Quality and Reliability Engineering International 29 (2):173-97.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.