
Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points. (English) Zbl 1327.41004

Summary: We study the accuracy of the discrete least-squares approximation, on a finite-dimensional space, of a real-valued target function from noisy pointwise evaluations at independent random points distributed according to a given sampling probability measure. The convergence estimates are given in the mean-square sense with respect to the sampling measure. The noise may be correlated with the location of the evaluation and may have nonzero mean (offset). We consider both the case of bounded and the case of square-integrable noise/offset. We derive conditions relating the number of sampling points to the dimension of the underlying approximation space that ensure a stable and accurate approximation. Particular focus is on deriving estimates in probability within a given confidence level. We analyze how the best approximation error and the noise terms affect the convergence rate and the overall confidence level achieved by the convergence estimate. The proofs of our convergence estimates in probability use arguments from the theory of large deviations to bound the noise term. Finally, we address the particular case of multivariate polynomial approximation spaces with any density in the beta family, including the uniform and Chebyshev densities.
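
As an illustrative sketch of the setting (the notation below is introduced here for clarity and is not part of the original entry): let \(V_n\) be an \(n\)-dimensional approximation space, \(\rho\) the sampling probability measure on a domain \(D\), \(x_1,\dots,x_M\) independent draws from \(\rho\), and \(y_i = \phi(x_i) + \eta(x_i)\) noisy evaluations of the target function \(\phi\), where \(\eta\) models the (possibly location-dependent, possibly biased) noise/offset. The discrete least-squares approximation is then
\[
\Pi_n^M \phi \;=\; \operatorname*{arg\,min}_{v \in V_n} \; \frac{1}{M} \sum_{i=1}^{M} \bigl( y_i - v(x_i) \bigr)^2 ,
\]
and its accuracy is measured by the mean-square error \(\lVert \phi - \Pi_n^M \phi \rVert_{L^2(D,\rho)}\). The estimates described in the summary bound this error, in expectation or in probability, in terms of the best approximation error \(\inf_{v \in V_n} \lVert \phi - v \rVert\) and a noise/offset contribution, under conditions coupling the number of samples \(M\) to the dimension \(n\).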

MSC:

41A10 Approximation by polynomials
41A25 Rate of convergence, degree of approximation
41A50 Best approximation, Chebyshev systems
41A63 Multidimensional problems
62G08 Nonparametric regression and quantile regression
65M70 Spectral, collocation and related methods for initial value and initial-boundary value problems involving PDEs
