Nonparametric Knn estimation with monotone constraints

abstract

  • © 2017 Taylor & Francis Group, LLC. The K-nearest-neighbor (Knn) method is known to be more suitable than the kernel method (with a globally fixed smoothing parameter) for fitting nonparametrically specified curves when data are highly unevenly distributed. In this paper, we propose estimating a nonparametric regression function subject to a monotonicity restriction using the Knn method. We also propose a new convergence criterion to measure the closeness between the unconstrained and the (monotone) constrained Knn-estimated curves. This method is an alternative to the monotone kernel methods proposed by Hall and Huang (2001) and Du et al. (2013). We use a bootstrap procedure to test the validity of the monotonicity restriction. We apply our method to the Job Market Matching data taken from Gan and Li (2016) and find that the unconstrained/constrained Knn estimators work better than kernel estimators for this type of highly unevenly distributed data.
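
  The abstract compares an unconstrained Knn curve with a monotone-constrained one. As a rough illustration of that general idea only (not the authors' reweighting estimator, convergence criterion, or bootstrap test), the hedged sketch below fits an unconstrained Knn regression on a grid and then projects the fitted curve onto the set of increasing functions via isotonic regression; the data-generating function, grid, and choice of k are hypothetical.

```python
# Hedged sketch: unconstrained Knn fit followed by a monotone projection.
# This is NOT the estimator of Li, Liu, and Li (2017); it only illustrates
# comparing an unconstrained Knn curve with a monotone version of it.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)

# Hypothetical, highly unevenly distributed design: most x's cluster near 0.
x = np.concatenate([rng.uniform(0.0, 0.3, 150), rng.uniform(0.3, 1.0, 50)])
y = np.log(1.0 + 5.0 * x) + rng.normal(0.0, 0.1, size=x.size)  # monotone truth

grid = np.linspace(0.0, 1.0, 200)

# Unconstrained Knn estimate (k chosen arbitrarily for illustration).
knn = KNeighborsRegressor(n_neighbors=15)
knn.fit(x.reshape(-1, 1), y)
m_hat = knn.predict(grid.reshape(-1, 1))

# Monotone (increasing) curve: project the Knn fit with isotonic regression.
m_mono = IsotonicRegression(increasing=True).fit_transform(grid, m_hat)

# A simple closeness measure between the two curves on the grid,
# a stand-in for the paper's convergence criterion.
distance = np.mean((m_hat - m_mono) ** 2)
print(f"Mean squared distance between curves: {distance:.6f}")
```

  A large distance between the two curves would cast doubt on monotonicity; the paper formalizes this comparison with a bootstrap test rather than the ad hoc gap computed here.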

published proceedings

  • ECONOMETRIC REVIEWS

author list (cited authors)

  • Li, Z., Liu, G., & Li, Q.

citation count

  • 12

complete list of authors

  • Li, Zheng; Liu, Guannan; Li, Qi

publication date

  • October 2017