Orthogonal Machine Learning: Power and Limitations

From MaRDI portal
Publication:6293349

arXiv: 1711.00342 · MaRDI QID: Q6293349

Vasilis Syrgkanis, Ilias Zadik, Lester Mackey

Publication date: 1 November 2017

Abstract: Double machine learning provides √n-consistent estimates of parameters of interest even when high-dimensional or nonparametric nuisance parameters are estimated at an n^{1/4} rate. The key is to employ Neyman-orthogonal moment equations which are first-order insensitive to perturbations in the nuisance parameters. We show that the n^{1/4} requirement can be improved to n^{1/(2k+2)} by employing a k-th order notion of orthogonality that grants robustness to more complex or higher-dimensional nuisance parameters. In the partially linear regression setting popular in causal inference, we show that we can construct second-order orthogonal moments if and only if the treatment residual is not normally distributed. Our proof relies on Stein's lemma and may be of independent interest. We conclude by demonstrating the robustness benefits of an explicit doubly-orthogonal estimation procedure for treatment effect.




Has companion code repository: https://github.com/IliasZadik/double_orthogonal_ml
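The abstract's partially linear regression setting (Y = θ·T + g(X) + ε, T = m(X) + η) admits a standard first-order Neyman-orthogonal estimator: residualize both Y and T on X with cross-fitting, then regress residual on residual. The following is a minimal numpy-only sketch of that procedure, not code from the companion repository; the k-NN nuisance learner and the simulated data-generating process are illustrative assumptions.

```python
import numpy as np

def knn_predict(x_tr, v_tr, x_te, k=50):
    # k-NN regression as a stand-in nuisance learner (illustrative choice,
    # not the learner used in the paper's experiments)
    d = np.abs(x_te[:, None] - x_tr[None, :])
    nn = np.argpartition(d, k, axis=1)[:, :k]
    return v_tr[nn].mean(axis=1)

def double_ml_plr(y, t, x, n_folds=2, seed=0):
    """Cross-fitted, first-order Neyman-orthogonal estimate of theta in
    Y = theta*T + g(X) + eps,  T = m(X) + eta."""
    n = len(y)
    folds = np.array_split(np.random.default_rng(seed).permutation(n), n_folds)
    y_res, t_res = np.empty(n), np.empty(n)
    for j in range(n_folds):
        te = folds[j]
        tr = np.concatenate([f for i, f in enumerate(folds) if i != j])
        # residualize outcome and treatment on held-out folds (cross-fitting)
        y_res[te] = y[te] - knn_predict(x[tr], y[tr], x[te])
        t_res[te] = t[te] - knn_predict(x[tr], t[tr], x[te])
    # orthogonal moment: E[(T - m(X)) * (Y - E[Y|X] - theta*(T - E[T|X]))] = 0
    return float(t_res @ y_res / (t_res @ t_res))

# Simulated example with true theta = 1.0
rng = np.random.default_rng(1)
n, theta = 4000, 1.0
x = rng.uniform(-1, 1, n)
t = x**2 + rng.normal(0, 1, n)                      # treatment with nonlinear confounding
y = theta * t + np.sin(2 * x) + rng.normal(0, 1, n)
theta_hat = double_ml_plr(y, t, x)
```

Because the moment is first-order insensitive to errors in the two nuisance fits, θ̂ remains √n-consistent even though the k-NN regressions converge slowly; the paper's higher-order construction pushes this robustness further when the treatment residual η is non-Gaussian.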








This page was built for publication: Orthogonal Machine Learning: Power and Limitations
