Abstract
We develop a model-free theory of general types of parametric regression for i.i.d. observations. The theory replaces the parameters of parametric models with statistical functionals, to be called “regression functionals,” defined on large nonparametric classes of joint ${x\textrm{-}y}$ distributions, without assuming a correct model. Parametric models are reduced to heuristics to suggest plausible objective functions. An example of a regression functional is the vector of slopes of linear equations fitted by OLS to largely arbitrary ${x\textrm{-}y}$ distributions, without assuming a linear model (see Part I). More generally, regression functionals can be defined by minimizing objective functions, solving estimating equations, or with ad hoc constructions. In this framework, it is possible to achieve the following: (1) define a notion of “well-specification” for regression functionals that replaces the notion of correct specification of models, (2) propose a well-specification diagnostic for regression functionals based on reweighting distributions and data, (3) decompose sampling variability of regression functionals into two sources, one due to the conditional response distribution and another due to the regressor distribution interacting with misspecification, both of order $N^{-1/2}$, (4) exhibit plug-in/sandwich estimators of standard error as limit cases of ${x\textrm{-}y}$ bootstrap estimators, and (5) provide theoretical heuristics to indicate that ${x\textrm{-}y}$ bootstrap standard errors may generally be preferred over sandwich estimators.
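To make the notion of a regression functional concrete, the OLS example referenced above (and developed in Part I) can be sketched as follows; the symbols $P$, $\vec{X}$, and $\boldsymbol{\beta}(P)$ are our illustrative notation, not the abstract's. For any joint ${x\textrm{-}y}$ distribution $P$ with finite and nonsingular second moments,
\[
\boldsymbol{\beta}(P)
\;=\; \operatorname*{argmin}_{\boldsymbol{\beta}} \, \mathbf{E}_P\!\big[(Y-\boldsymbol{\beta}^{\mathsf T}\vec{X})^{2}\big]
\;=\; \mathbf{E}_P[\vec{X}\vec{X}^{\mathsf T}]^{-1}\,\mathbf{E}_P[\vec{X}\,Y],
\]
which is well defined whether or not the conditional mean of $Y$ given $\vec{X}$ is linear. Under standard moment conditions the plug-in estimate $\hat{\boldsymbol{\beta}}=\boldsymbol{\beta}(\hat{P}_N)$ satisfies $\sqrt{N}\,(\hat{\boldsymbol{\beta}}-\boldsymbol{\beta}(P)) \rightarrow \mathcal{N}(\mathbf{0},\mathbf{V}(P))$ with the model-robust sandwich form
\[
\mathbf{V}(P)
\;=\; \mathbf{E}_P[\vec{X}\vec{X}^{\mathsf T}]^{-1}\,
      \mathbf{E}_P\!\big[(Y-\vec{X}^{\mathsf T}\boldsymbol{\beta}(P))^{2}\,\vec{X}\vec{X}^{\mathsf T}\big]\,
      \mathbf{E}_P[\vec{X}\vec{X}^{\mathsf T}]^{-1},
\]
whose plug-in estimate is the sandwich standard error that items (4) and (5) compare with ${x\textrm{-}y}$ bootstrap standard errors.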
Citation
Andreas Buja, Lawrence Brown, Arun Kumar Kuchibhotla, Richard Berk, Edward George, and Linda Zhao. "Models as Approximations II: A Model-Free Theory of Parametric Regression." Statist. Sci. 34(4): 545–565, November 2019. https://doi.org/10.1214/18-STS694