
Functional linear regression with Huber loss. (English) Zbl 07622343

Summary: In this paper, we consider the robust regression problem associated with the Huber loss in the framework of functional linear models and reproducing kernel Hilbert spaces. We propose an Ivanov-regularized empirical risk minimization procedure to approximate the slope function of the linear model in the presence of outliers or heavy-tailed noise. By appropriately tuning the scale parameter of the Huber loss, we establish explicit rates of convergence for our estimates in terms of excess prediction risk under mild assumptions. Our study justifies the efficiency of Huber regression for functional data from a theoretical viewpoint.
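The approach described in the summary can be illustrated with a minimal, hypothetical sketch (not the authors' implementation): projected gradient descent on the empirical Huber risk for a discretized functional linear model, with an Ivanov-style norm constraint enforced by projection onto an L2 ball. All function names, parameter values, and the toy data are illustrative assumptions.

```python
import numpy as np

def huber_grad(r, tau):
    """Score (derivative) of the Huber loss: r clipped to [-tau, tau]."""
    return np.clip(r, -tau, tau)

def fit_huber_flr(X, y, tau=1.345, radius=5.0, lr=20.0, n_iter=500):
    """Projected gradient descent for the empirical Huber risk
    (1/n) * sum_i huber(y_i - <X_i, b>), subject to the Ivanov-style
    constraint ||b||_{L2} <= radius.  Rows of X are curves sampled on a
    uniform grid over [0, 1]; inner products use the Riemann sum.
    (Hypothetical sketch, not the estimator from the paper.)"""
    n, T = X.shape
    dt = 1.0 / T
    b = np.zeros(T)
    for _ in range(n_iter):
        resid = y - (X @ b) * dt                       # y_i - <X_i, b>
        grad = -(X.T @ huber_grad(resid, tau)) * dt / n
        b -= lr * grad
        norm = np.sqrt(np.sum(b ** 2) * dt)            # ||b||_{L2([0,1])}
        if norm > radius:                              # project onto the ball
            b *= radius / norm
    return b

# Toy data: Brownian-path covariates, slope beta(t) = sin(pi*t),
# and heavy-tailed (Student-t, 2 df) noise to motivate the Huber loss.
rng = np.random.default_rng(0)
n, T = 400, 100
t = np.linspace(0.0, 1.0, T)
beta = np.sin(np.pi * t)
X = rng.normal(size=(n, T)).cumsum(axis=1) / np.sqrt(T)
y = (X @ beta) / T + 0.1 * rng.standard_t(df=2, size=n)
b_hat = fit_huber_flr(X, y)
```

The clipped score `huber_grad` is what makes the fit robust: residuals larger than `tau` contribute a bounded gradient, so heavy-tailed noise cannot dominate the update, while the projection step implements the Ivanov (norm-constrained) form of regularization rather than the more common penalized (Tikhonov) form.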

MSC:

68T05 Learning and adaptive systems in artificial intelligence
62J05 Linear regression; mixed models

Software:

fda (R); robustbase
Full Text: DOI

References:

[1] Aronszajn, N., Theory of reproducing kernels, Trans. Am. Math. Soc., 68, 337-404 (1950) · Zbl 0037.20701
[2] Cai, T.; Hall, P., Prediction in functional linear regression, Ann. Stat., 34, 2159-2179 (2006) · Zbl 1106.62036
[3] Cai, T.; Yuan, M., Minimax and adaptive prediction for functional linear regression, J. Am. Stat. Assoc., 107, 1201-1216 (2012) · Zbl 1443.62196
[4] Caponnetto, A.; De Vito, E., Optimal rates for the regularized least-squares algorithm, Found. Comput. Math., 7, 331-368 (2007) · Zbl 1129.68058
[5] Catoni, O., Challenging the empirical mean and empirical variance: a deviation study, Ann. Inst. Henri Poincaré Probab. Stat., 48, 4, 1148-1185 (2012) · Zbl 1282.62070
[6] Cucker, F.; Smale, S., Best choices for regularization parameters in learning theory: on the bias-variance problem, Found. Comput. Math., 2, 413-428 (2002) · Zbl 1057.68085
[7] Cucker, F.; Zhou, D. X., Learning Theory: an Approximation Theory Viewpoint (2007), Cambridge University Press: Cambridge University Press Cambridge · Zbl 1274.41001
[8] De Vito, E.; Caponnetto, A.; Rosasco, L., Model selection for regularized least-squares algorithm in learning theory, Found. Comput. Math., 5, 59-85 (2005) · Zbl 1083.68106
[9] Feng, Y.; Wu, Q., Learning under \((1 + \epsilon)\)-moment conditions, Appl. Comput. Harmon. Anal., 49, 2, 495-520 (2020) · Zbl 1442.62150
[10] Feng, Y.; Wu, Q., A statistical learning assessment of Huber regression, J. Approx. Theory, 273, Article 105660 pp. (2022) · Zbl 1504.62092
[11] Ferraty, F.; Vieu, P., Nonparametric Functional Data Analysis: Methods, Theory, Applications and Implementations (2006), Springer: Springer New York · Zbl 1119.62046
[12] Guo, Z. C.; Xiang, D. H.; Guo, X.; Zhou, D. X., Thresholded spectral algorithms for sparse approximations, Anal. Appl., 15, 433-455 (2017) · Zbl 1409.68232
[13] Hall, P.; Horowitz, J. L., Methodology and convergence rates for functional linear regression, Ann. Stat., 35, 70-91 (2007) · Zbl 1114.62048
[14] Hampel, F. R.; Ronchetti, E. M.; Rousseeuw, P. J.; Stahel, W. A., Robust Statistics: The Approach Based on Influence Functions (2011), Wiley: Wiley New York
[15] Hsing, T.; Eubank, R., Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators (2015), Wiley: Wiley New York · Zbl 1338.62009
[16] Huber, P. J., Robust estimation of a location parameter, Ann. Math. Stat., 35, 1, 73-101 (1964) · Zbl 0136.39805
[17] Huber, P. J., Robust Statistics (1981), Wiley: Wiley New York · Zbl 0536.62025
[18] Kokoszka, P.; Reimherr, M., Introduction to Functional Data Analysis (2017), CRC Press · Zbl 1411.62004
[19] Koltchinskii, V., Oracle Inequalities in Empirical Risk Minimization and Sparse Recovery Problems (2011), Springer: Springer New York · Zbl 1223.91002
[20] Li, Y.; Hsing, T., On rates of convergence in functional linear regression, J. Multivar. Anal., 98, 1782-1804 (2007) · Zbl 1130.62035
[21] Lian, H., Minimax prediction for functional linear regression with functional responses in reproducing kernel Hilbert spaces, J. Multivar. Anal., 140, 395-402 (2015) · Zbl 1327.62216
[22] Maronna, R.; Martin, D.; Yohai, V., Robust Statistics: Theory and Methods (2006), John Wiley & Sons: John Wiley & Sons Chichester · Zbl 1094.62040
[23] Massart, P., About the constants in Talagrand’s concentration inequalities for empirical processes, Ann. Probab., 28, 2, 863-884 (2000) · Zbl 1140.60310
[24] Paulsen, V.; Raghupathi, M., An Introduction to the Theory of Reproducing Kernel Hilbert Spaces (2016), Cambridge University Press · Zbl 1364.46004
[25] Ramsay, J. O.; Silverman, B. W., Applied Functional Data Analysis: Methods and Case Studies (2002), Springer: Springer New York · Zbl 1011.62002
[26] Ramsay, J. O.; Silverman, B. W., Functional Data Analysis (2005), Springer: Springer New York · Zbl 1079.62006
[27] Rosasco, L.; De Vito, E.; Caponnetto, A.; Piana, M.; Verri, A., Are loss functions all the same?, Neural Comput., 16, 1063-1076 (2004) · Zbl 1089.68109
[28] Rousseeuw, P. J.; Leroy, A. M., Robust Regression and Outlier Detection (2005), John Wiley & Sons
[29] Smale, S.; Zhou, D. X., Learning theory estimates via integral operators and their approximations, Constr. Approx., 26, 153-172 (2007) · Zbl 1127.68088
[30] Sun, Q.; Zhou, W.; Fan, J., Adaptive Huber regression, J. Am. Stat. Assoc., 115, 254-265 (2020) · Zbl 1437.62250
[31] Talagrand, M., Concentration of measure and isoperimetric inequalities in product spaces, Publ. Math. IHÉS, 81, 1, 73-205 (1995) · Zbl 0864.60013
[32] Talagrand, M., New concentration inequalities in product spaces, Invent. Math., 126, 3, 505-563 (1996) · Zbl 0893.60001
[33] Tong, H.; Ng, M., Analysis of regularized least squares for functional linear regression model, J. Complex., 49, 85-94 (2018) · Zbl 1402.62158
[34] van der Vaart, A.; Wellner, J. A., Weak Convergence and Empirical Processes (1996), Springer: Springer New York · Zbl 0862.60002
[35] Yao, F.; Müller, H. G.; Wang, J. L., Functional linear regression analysis for longitudinal data, Ann. Stat., 33, 2873-2903 (2005) · Zbl 1084.62096
[36] Yuan, M.; Cai, T., A reproducing kernel Hilbert space approach to functional linear regression, Ann. Stat., 38, 3412-3444 (2010) · Zbl 1204.62074
[37] Zhou, D. X., Deep distributed convolutional neural networks: universality, Anal. Appl., 16, 6, 895-919 (2018) · Zbl 1442.68214
[38] Zhou, W.-X.; Bose, K.; Fan, J.; Liu, H., A new perspective on robust M-estimation: finite sample theory and applications to dependence-adjusted multiple testing, Ann. Stat., 46, 5, 1904-1931 (2018) · Zbl 1409.62154