Tests how robust the MLE method for independent variables with differential error is when the model for $X$ is less precise. In the main paper, we include $Z$ on the right-hand side of the `truth_formula`.
In this robustness check, the `truth_formula` is an intercept-only model.
The stats are in the list named `robustness_1` in the `.RDS` file.
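As a concrete illustration of the contrast, and of how one might pull the stats out of the file, here is a minimal sketch; it assumes the saved object is a list with a component named `robustness_1` and that the simulated variables are called `x` and `z`, which may not match the names in the simulation code.

```r
## Minimal sketch: load the saved stats and spell out the two truth_formula
## specifications being compared. Names here are illustrative assumptions.
robustness_1 <- readRDS("robustness_1.RDS")[["robustness_1"]]

truth_formula_main  <- x ~ z  # main paper: Z on the right-hand side
truth_formula_check <- x ~ 1  # this robustness check: intercept-only model for X
```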
# robustness\_1\_dv.RDS
Like `robustness_1.RDS` but with a less precise model for $w_{pred}$. In the main paper, we included $Z$ in the `outcome_formula`. In this robustness check, we do not.
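A minimal sketch of the contrast, assuming the outcome is `y` with covariates `x` and `z`; the variable names are illustrative, not necessarily the ones used in the simulation code.

```r
## The two outcome_formula specifications being compared.
outcome_formula_main  <- y ~ x + z  # main paper: Z included
outcome_formula_check <- y ~ x      # this robustness check: Z dropped
```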
This is just Example 1 with varying levels of classifier accuracy indicated by the `prediction_accuracy` variable.
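A hedged sketch of the error mechanism this is read as: the classifier label $W$ agrees with the true $X$ with probability `prediction_accuracy`, independently of everything else. The mechanism and the variable names are assumptions for illustration, not a description of the actual simulation code.

```r
## Simulate non-differential classifier error at a given accuracy level.
set.seed(1)
n <- 10000
prediction_accuracy <- 0.85
x <- rbinom(n, 1, 0.5)                        # true labels
correct <- rbinom(n, 1, prediction_accuracy)  # whether the classifier is right
w <- ifelse(correct == 1, x, 1 - x)           # classifier labels
mean(w == x)                                  # roughly prediction_accuracy
```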
Example 3 with varying levels of classifier accuracy indicated by the `prediction_accuracy` variable.
Example 1 with varying levels of skewness in the classified variable. The variable `Px` is the base rate of $X$ and controls the skewness of $X$.
It probably makes more sense to report the mean of $X$ instead of `Px` in the supplement.
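A small sketch of why the two summaries carry the same information, assuming $X$ is drawn as Bernoulli(`Px`); the draw is an assumption for illustration.

```r
## mean(x) estimates Px directly, so reporting the mean of X is equivalent
## but easier to read.
set.seed(1)
Px <- 0.1
x <- rbinom(10000, 1, Px)
mean(x)  # close to Px; this is the quantity proposed for the supplement
```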
Example 3 with varying levels of skewness in the classified variable. The variable `B0` is the intercept of the main model and controls the skewness of $Y$.
It probably makes more sense to report the mean of $Y$ instead of `B0` in the supplement.
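A hedged sketch of how `B0` maps onto the mean of $Y$, assuming the main model is logistic with intercept `B0` and a single binary covariate; the exact model and names are assumptions for illustration.

```r
## More negative B0 -> rarer Y = 1 -> more skew; mean(y) is the readable summary.
set.seed(1)
n <- 10000
B0 <- -2
Bx <- 0.5
x <- rbinom(n, 1, 0.5)
y <- rbinom(n, 1, plogis(B0 + Bx * x))
mean(y)  # the quantity proposed for the supplement
```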
Example 2 with varying amounts of differential error. The variable `y_bias` controls the amount of differential error.
It probably makes more sense to report the correlation between $Y$ and $X - W$, or the difference in classifier accuracy between $Y=1$ and $Y=0$, in the supplement instead of `y_bias`.
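A sketch of the two proposed summaries, computed from a simulated data set assumed to have columns `y` (outcome), `x` (true class), and `w` (classifier label); the column names are illustrative.

```r
## Correlation of Y with the classification error, and the accuracy gap
## between Y = 1 and Y = 0 observations.
diff_error_summaries <- function(df) {
  c(
    cor_y_error = cor(df$y, df$x - df$w),
    acc_gap     = mean(df$w[df$y == 1] == df$x[df$y == 1]) -
                  mean(df$w[df$y == 0] == df$x[df$y == 0])
  )
}
```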
Example 4 with varying amounts of differential error. The variable `z_bias` controls the amount of differential error.
It probably makes more sense to report the correlation between $Z$ and $Y - W$, or the difference in classifier accuracy between $Z=1$ and $Z=0$, in the supplement instead of `z_bias`.
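The analogous summaries for the dependent-variable case, assuming columns `z` (covariate), `y` (true outcome), and `w` (classifier label for $Y$); again the column names are illustrative.

```r
## Correlation of Z with the classification error, and the accuracy gap
## between Z = 1 and Z = 0 observations.
dv_error_summaries <- function(df) {
  c(
    cor_z_error = cor(df$z, df$y - df$w),
    acc_gap     = mean(df$w[df$z == 1] == df$y[df$z == 1]) -
                  mean(df$w[df$z == 0] == df$y[df$z == 0])
  )
}
```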