Parameterization and inference for nonparametric regression problems (Academic Article)

abstract

  • We consider local likelihood or local estimating equations, in which a multivariate function Θ(·) is estimated but a derived function λ(·) of Θ(·) is of interest. In many applications, when most naturally formulated, the derived function is a non-linear function of Θ(·). In trying to understand whether the derived non-linear function is constant or linear, a problem arises with this approach: when the function is actually constant or linear, the expectation of the function estimate need not be constant or linear, at least to second order. In such circumstances, the simplest standard methods in nonparametric regression for testing whether a function is constant or linear cannot be applied. We develop a simple general solution which is applicable to nonparametric regression, varying-coefficient models, nonparametric generalized linear models, etc. We show that, in local linear kernel regression, inference about the derived function λ(·) is facilitated without a loss of power by reparameterization so that λ(·) is itself a component of Θ(·). Our approach is in contrast with the standard practice of choosing Θ(·) for convenience and allowing λ(·) to be a non-linear function of Θ(·). The methods are applied to an important data set in nutritional epidemiology.

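  • The abstract centers on local linear kernel regression of a parameter Θ(·) and a derived non-linear function λ(·) of it. The following Python sketch is purely illustrative and is not the authors' implementation: it fits a local linear kernel estimate on simulated data and then forms a derived non-linear transform of the fit. The kernel choice, bandwidth, simulated data, and function names (ll_fit, epanechnikov) are assumptions made for this example only.

```python
# Illustrative sketch only (not the paper's code): local linear kernel
# regression of y on x, evaluated on a grid, followed by a derived
# non-linear transform of the fitted curve.
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, zero outside [-1, 1]."""
    return 0.75 * np.clip(1.0 - u**2, 0.0, None)

def ll_fit(x, y, x0, h):
    """Local linear estimate of E[y | x = x0] with bandwidth h.

    Solves a kernel-weighted least-squares problem in (intercept, slope);
    the intercept is the fitted value theta_hat(x0).
    """
    w = epanechnikov((x - x0) / h)
    X = np.column_stack([np.ones_like(x), x - x0])
    WX = X * w[:, None]
    beta = np.linalg.solve(X.T @ WX, WX.T @ y)
    return beta[0]

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=200)
grid = np.linspace(0.05, 0.95, 50)
theta_hat = np.array([ll_fit(x, y, x0, h=0.15) for x0 in grid])

# A derived non-linear function of the fit, e.g. lambda(x) = exp(theta(x)).
# Even when lambda(.) is truly constant or linear, the plug-in estimate
# exp(theta_hat(.)) need not have a constant or linear expectation, which
# is the bias issue the paper addresses by reparameterizing the local model
# so that lambda(.) is itself a component of Theta(.).
lambda_hat = np.exp(theta_hat)
```

  • The sketch only shows the plug-in (derived-function) route; the reparameterized fit advocated in the abstract would instead estimate λ(·) directly as a local parameter, so standard tests for constancy or linearity apply to it without the second-order bias noted above.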
author list (cited authors)

  • Jiang, W., Kipnis, V., Midthune, D., & Carroll, R. J.

citation count

  • 5

publication date

  • January 2001

publisher