Title: Efficient and fast spline-backfitted kernel smoothing of additive models
Authors: Jing Wang, Lijian Yang
Institutions: (1) Department of Mathematics, Statistics, and Computer Science, University of Illinois at Chicago, Chicago, IL 60607, USA; (2) Department of Statistics and Probability, Michigan State University, East Lansing, MI 48824, USA
Abstract: A great deal of effort has been devoted to inference for additive models in the last decade. Among existing procedures, kernel-type estimators are too costly to implement for high dimensions or large sample sizes, while spline-type estimators provide no asymptotic distribution or uniform convergence results. We propose a one-step backfitting estimator of the component functions in an additive regression model, using spline estimators in the first stage followed by kernel/local linear estimators. Under weak conditions, the proposed estimator's pointwise distribution is asymptotically equivalent to that of a univariate kernel/local linear estimator; hence the dimension is effectively reduced to one at any point. This dimension reduction holds uniformly over an interval under the assumption of normal errors. Monte Carlo evidence supports the asymptotic results for dimensions ranging from low to very high and sample sizes ranging from moderate to large. The proposed confidence band is applied to the Boston housing data for linearity diagnosis.
Supported in part by NSF awards DMS 0405330, 0706518, BCS 0308420 and SES 0127722.
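The two-stage idea described in the abstract — an undersmoothed spline pilot fit in the first stage, followed by a univariate kernel/local linear smoother applied to pseudo-responses — can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the data-generating model, the truncated-power spline basis, the knot placement, and the bandwidth are all hypothetical choices made for the example.

```python
import numpy as np

# Illustrative two-stage spline-backfitted kernel (SBK) sketch.
# Stage 1: spline pilot fit removes the nuisance component.
# Stage 2: univariate local linear smoothing of the pseudo-responses.
# All design choices (knots, bandwidth, model) are assumptions, not the paper's.

rng = np.random.default_rng(0)
n = 500
x1 = rng.uniform(-1.0, 1.0, n)
x2 = rng.uniform(-1.0, 1.0, n)
y = np.sin(np.pi * x1) + x2**2 + 0.2 * rng.standard_normal(n)  # additive model

def spline_basis(x, knots):
    """Degree-1 truncated-power spline basis (no intercept column)."""
    return np.column_stack([x] + [np.clip(x - k, 0.0, None) for k in knots])

# Stage 1: joint spline pilot fit of all components by least squares.
knots = np.linspace(-0.8, 0.8, 5)
B1, B2 = spline_basis(x1, knots), spline_basis(x2, knots)
design = np.column_stack([np.ones(n), B1, B2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
m2_pilot = B2 @ coef[1 + B1.shape[1]:]
m2_pilot -= m2_pilot.mean()            # center for identifiability

# Pseudo-responses: subtract the pilot estimate of the other component.
y_tilde = y - m2_pilot

def local_linear(x0, x, resp, h):
    """Local linear estimate at x0 with a Gaussian kernel, bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    XtW = X.T * w                       # weighted design, column-wise
    beta = np.linalg.solve(XtW @ X, XtW @ resp)
    return beta[0]                      # intercept = fitted value at x0

# Stage 2: univariate local linear smoothing of the pseudo-responses.
grid = np.linspace(-0.9, 0.9, 25)
m1_hat = np.array([local_linear(g, x1, y_tilde, h=0.15) for g in grid])
m1_hat -= m1_hat.mean()                 # compare centered curves
max_err = np.max(np.abs(m1_hat - np.sin(np.pi * grid)))
```

The point of the sketch is the dimension reduction the abstract claims: after the spline stage, the second-stage smoothing problem is purely one-dimensional, so the final estimator behaves like an ordinary univariate local linear fit.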
Keywords: Bandwidths; B-spline; Knots; Local linear estimator; Nadaraya-Watson estimator; Nonparametric regression
This article is indexed in SpringerLink and other databases.