Smoothing splines and shape restrictions. (English) Zbl 0932.62051

A regression model \(Y_i=m_0(x_i)+\varepsilon_i\) is considered, where \(m_0: [0,1]\to R\) is an unknown regression function, the \(\varepsilon_i\) are independent errors with \(E\varepsilon_i=0\), and \(x_1\leq x_2\leq\dots\leq x_n\) are deterministic design points. It is assumed that \(m_0\) belongs to the class \[ M_{k,r}=\{m: m^{(r-1)} \text{ exists a.e. and is monotone}, |m^{(r-1)}|\leq D, \]
\[ m^{(k-1)} \text{ exists and is absolutely continuous with } \int(m^{(k)}(x))^2 dx<\infty\}. \] A constrained spline estimator \(\hat m_{n,D}^{CS}\) is proposed for the estimation of \(m_0\), where \[ \hat m_{n,D}^{CS}=\arg\min_{m\in M_{k,r}} \{n^{-1}\sum_{i=1}^n (Y_i-m(x_i))^2 +\lambda_n\int_0^1(m^{(k)}(x))^2dx\}, \] and \(\lambda_n\) is a (possibly random) penalty sequence. It is shown that if \(\lambda_n\sim n^{-2p/(2p+1)}\), then in the empirical \(L_2\) norm \(\|g\|_n^2=n^{-1}\sum_{j=1}^n g^2(x_j)\) the estimator attains the optimal convergence rate \[ \|\hat m_{n,D}^{CS}-m_0\|_n=O_p(n^{-p/(2p+1)}),\;\text{where } p=\max(k,r). \] If \(r<k\), then, under some additional conditions, \[ P(\hat m_{n,D}^{CS}(x)=\hat m_n^S(x),\;x\in[0,1])\to 1, \] where \(\hat m_n^S\) is the unconstrained smoothing spline estimator. If \(r=k=2\), however, the differences \(\hat m_{n,D}^{CS}-\hat m_n^S\) do not vanish asymptotically. It is also shown that \(\hat m_{n,D}^{CS}\) can be viewed as a projection of \(\hat m_n^S\) onto \(M_{k,r}\) in a Sobolev-type norm.
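The penalized least-squares problem above can be illustrated numerically. The sketch below is not the paper's method: it is a crude discrete analogue for the special case \(k=2\), \(r=1\) (a monotone-increasing \(m_0\)), where the roughness penalty \(\int(m'')^2\) is approximated by squared second differences and the monotonicity constraint by non-negative first differences; the data, sample size, and noise level are invented for the example.

```python
# Discrete sketch of a constrained spline-type estimator (assumptions:
# k = 2, r = 1, monotone-increasing m_0; penalty via second differences).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 40
x = np.linspace(0.0, 1.0, n)            # deterministic design points x_1 <= ... <= x_n
m0 = x**2                               # true monotone regression function (invented)
y = m0 + 0.05 * rng.standard_normal(n)  # Y_i = m_0(x_i) + eps_i

h = x[1] - x[0]
lam = n ** (-2 * 2 / (2 * 2 + 1))       # lambda_n ~ n^{-2p/(2p+1)} with p = k = 2

def objective(m):
    fit = np.mean((y - m) ** 2)          # n^{-1} sum_i (Y_i - m(x_i))^2
    d2 = np.diff(m, 2) / h**2            # discrete approximation of m''(x_i)
    penalty = lam * np.sum(d2 ** 2) * h  # ~ lambda_n * int_0^1 (m''(x))^2 dx
    return fit + penalty

# Shape restriction: first differences must be non-negative (monotone m).
cons = {"type": "ineq", "fun": lambda m: np.diff(m)}

res = minimize(objective, x0=y, constraints=cons, method="SLSQP",
               options={"maxiter": 500})
m_hat = res.x
print(float(np.sqrt(np.mean((m_hat - m0) ** 2))))  # empirical error ||m_hat - m_0||_n
```

The quadratic objective with linear inequality constraints makes this a small quadratic program; SLSQP suffices here, though a dedicated QP solver would be the idiomatic choice at larger \(n\).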

MSC:

62G08 Nonparametric regression and quantile regression
62G20 Asymptotic properties of nonparametric inference
62J02 General nonlinear regression