Poster
Debiased Machine Learning without Sample-Splitting for Stable Estimators
Qizhao Chen · Vasilis Syrgkanis · Morgane Austern
Estimation and inference on causal parameters are typically reduced to a generalized method of moments problem, which involves auxiliary functions that correspond to solutions of a regression or classification problem. A recent line of work on debiased machine learning shows how one can use generic machine learning estimators for these auxiliary problems while maintaining asymptotic normality and root-$n$ consistency of the target parameter of interest, requiring only mean-squared-error guarantees from the auxiliary estimation algorithms. The literature typically requires that these auxiliary problems be fitted on a separate sample or in a cross-fitting manner. We show that when these auxiliary estimation algorithms satisfy natural leave-one-out stability properties, sample splitting is not required. This allows for sample re-use, which can be beneficial in moderate sample-size regimes. For instance, we show that the stability properties we propose are satisfied by ensemble bagged estimators built via sub-sampling without replacement, a popular technique in machine learning practice.
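To make the setup concrete, here is a minimal sketch (not the authors' code) of debiased estimation without sample-splitting in a partially linear model $Y = \theta T + g(X) + \varepsilon$, using Robinson's residual-on-residual moment. The model, the moment, and scikit-learn's BaggingRegressor are illustrative choices not taken from the paper; the one ingredient the abstract does name is the nuisance learner, a bagged ensemble built via sub-sampling without replacement, fitted on the full sample with no cross-fitting folds.

```python
# Sketch of debiased ML WITHOUT sample-splitting (illustrative assumptions:
# partially linear model, residual-on-residual moment). The nuisances
# E[Y|X] and E[T|X] are fit on the FULL sample with bagged estimators
# built by sub-sampling without replacement (bootstrap=False), the
# leave-one-out-stable estimator class highlighted in the abstract.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n, theta = 2000, 1.5
X = rng.normal(size=(n, 5))
T = np.sin(X[:, 0]) + rng.normal(size=n)        # confounded treatment
Y = theta * T + np.cos(X[:, 1]) + rng.normal(size=n)

def bagged():
    # sub-sampling without replacement: each tree sees half the data
    return BaggingRegressor(DecisionTreeRegressor(max_depth=4),
                            n_estimators=200, max_samples=0.5,
                            bootstrap=False, random_state=0)

# Both nuisances are fit on the whole sample -- no folds, no held-out split.
f_hat = bagged().fit(X, T)   # estimates E[T | X]
q_hat = bagged().fit(X, Y)   # estimates E[Y | X]

# Orthogonal (residual-on-residual) moment, solved on the same sample.
T_res = T - f_hat.predict(X)
Y_res = Y - q_hat.predict(X)
theta_hat = (T_res @ Y_res) / (T_res @ T_res)

# Plug-in standard error from the influence function of this moment.
se = (np.sqrt(np.mean((Y_res - theta_hat * T_res) ** 2 * T_res ** 2))
      / (np.mean(T_res ** 2) * np.sqrt(n)))
print(f"theta_hat = {theta_hat:.3f} +/- {1.96 * se:.3f}")
```

The key lever is bootstrap=False with max_samples below 1, i.e., bagging via sub-sampling without replacement; under the paper's stability conditions, re-using the same observations for nuisance fitting and for the moment equation does not break root-$n$ asymptotic normality.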
Author Information
Qizhao Chen (Harvard University)
Vasilis Syrgkanis (Stanford University)
Morgane Austern (Harvard University)
More from the Same Authors
- 2022 Spotlight: Partial Identification of Treatment Effects with Implicit Generative Models
  Vahid Balazadeh Meresht · Vasilis Syrgkanis · Rahul Krishnan
- 2022 Poster: Robust Generalized Method of Moments: A Finite Sample Viewpoint
  Dhruv Rohatgi · Vasilis Syrgkanis
- 2022 Poster: Partial Identification of Treatment Effects with Implicit Generative Models
  Vahid Balazadeh Meresht · Vasilis Syrgkanis · Rahul Krishnan
- 2021 Poster: Asymptotics of the Bootstrap via Stability with Applications to Inference with Model Selection
  Morgane Austern · Vasilis Syrgkanis