Poster

Kernel Debiased Plug-in Estimation: Simultaneous, Automated Debiasing without Influence Functions for Many Target Parameters

Brian Cho · Ivana Malenica · Kyra Gan · Yaroslav Mukhin


Abstract:

In the problem of estimating target parameters in nonparametric models with nuisance parameters (e.g., the data-generating distribution), substituting the unknown nuisances with nonparametric estimators can introduce “plug-in bias.” Traditional methods that address this sub-optimal bias-variance trade-off rely on the influence function (IF) of the target parameter. When estimating multiple target parameters, these methods require debiasing the nuisance parameter multiple times using the corresponding IFs, posing analytical and computational challenges. In this work, we leverage the targeted maximum likelihood estimation framework to propose a novel method named kernel debiased plug-in estimation (KDPE). KDPE refines an initial estimate through regularized likelihood maximization steps, employing a nonparametric model based on a reproducing kernel Hilbert space (RKHS). We show that KDPE (i) simultaneously debiases all pathwise differentiable target parameters that satisfy our regularity conditions, (ii) does not require the IF for implementation, and (iii) remains computationally tractable. We numerically illustrate the use of KDPE and validate our theoretical results.
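To make the "regularized likelihood maximization over an RKHS model" idea concrete, the following is a minimal, hypothetical sketch of that general structure, not the authors' KDPE implementation: it tilts a deliberately misspecified initial density estimate by an RKHS fluctuation h(x) = Σ_j α_j k(x, X_j), fits α by ridge-penalized maximum likelihood via gradient ascent, and then reads off several plug-in target parameters from the single updated density. The one-dimensional toy setup, all names (`rbf`, `lam`), and the grid-based normalization are assumptions made for illustration.

```python
# Illustrative sketch only: RKHS-tilted, penalized-likelihood refinement of an
# initial density estimate, followed by plug-in estimation of several targets.
import numpy as np

rng = np.random.default_rng(0)
X = rng.gamma(shape=2.0, scale=1.0, size=300)   # observed data (toy example)
grid = np.linspace(1e-3, 12.0, 400)             # integration grid
dg = grid[1] - grid[0]

# Initial (deliberately misspecified) estimate: exponential fit to the data.
rate = 1.0 / X.mean()
p0_grid = rate * np.exp(-rate * grid)

def rbf(a, b, bw=0.5):
    # Gaussian kernel matrix between two 1-D point sets (bandwidth assumed).
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / bw) ** 2)

K_grid = rbf(grid, X)   # kernel sections k(., X_j) evaluated on the grid
K_data = rbf(X, X)      # Gram matrix, used for the RKHS ridge penalty
alpha = np.zeros(len(X))
lam = 1e-3              # penalty weight (assumed)

def density(alpha):
    """Tilted density p(x) proportional to p0(x) * exp(h(x)), grid-normalized."""
    p = p0_grid * np.exp(K_grid @ alpha)
    return p / (p.sum() * dg)

# Gradient ascent on (1/n) sum_i log p(X_i) - lam * alpha' K alpha.
for _ in range(200):
    p = density(alpha)
    # d/d alpha_j: mean_i k(X_i, X_j) - E_p[k(x, X_j)] - 2 * lam * (K alpha)_j
    grad = K_data.mean(axis=0) - (p * dg) @ K_grid - 2 * lam * (K_data @ alpha)
    alpha += 0.5 * grad

p = density(alpha)
# Several plug-in target parameters read off the single updated density.
print("plug-in mean:      ", (grid * p * dg).sum(), "vs sample", X.mean())
print("plug-in 2nd moment:", (grid**2 * p * dg).sum(), "vs sample", (X**2).mean())
```

After the fluctuation step, both moments are recovered from one refined density, which is the sense in which a single likelihood-targeted update can serve many target parameters at once; the paper's actual procedure, conditions, and guarantees are as stated in the abstract above.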
