I am a statistician, data scientist, and researcher with 20 years of experience and a unique combination of skills. I am well-versed in predictive analytics and machine learning, as well as in behavioral measurement using experiments and surveys. In my academic work, I developed a novel framework for designing and analyzing experiments that is useful in applied settings where analytical results are not readily available due to complex dependencies or spillover. I also have a long track record as a research practitioner, leveraging this expertise creatively to identify optimal research designs in challenging settings, including those with industry-specific constraints, conflict zones, or political campaigns.

Unifying Design-based Inference

In my academic work, I developed an elegant and general framework for experimental design and inference by way of novel notation and spectral analysis. The framework allows for the comparison of arbitrary designs, including cases in which no existing analytical results are available, either because of setting-specific research constraints or because of complex assignment dependencies or spillover effects.
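The idea of comparing arbitrary designs can be illustrated with a small sketch. The following example (my own illustration, not code from the papers) represents a design as a probability distribution over assignment vectors, then computes the exact variance of a Horvitz-Thompson-style linear estimator two ways: by enumerating the design's support, and as a quadratic form in the covariance matrix of the assignment indicators. The second form is what makes arbitrary designs comparable even when no closed-form variance expression exists.

```python
import itertools
import numpy as np

# Illustrative setup: 4 units with fixed potential outcomes.
rng = np.random.default_rng(0)
n = 4
y1 = rng.normal(1.0, 1.0, n)  # outcomes if treated
y0 = rng.normal(0.0, 1.0, n)  # outcomes if control

# A design is just a distribution over assignment vectors.
# Here: complete randomization, exactly 2 of 4 units treated.
assignments = [z for z in itertools.product([0, 1], repeat=n) if sum(z) == 2]
p = np.ones(len(assignments)) / len(assignments)  # uniform over support

Z = np.array(assignments, dtype=float)
pi1 = Z.T @ p          # marginal probability each unit is treated
pi0 = (1 - Z).T @ p    # marginal probability each unit is control

def ht_estimate(z):
    """Horvitz-Thompson estimate of the average treatment effect."""
    z = np.asarray(z, dtype=float)
    return (z * y1 / pi1 - (1 - z) * y0 / pi0).sum() / n

# Exact variance by enumerating the design's support...
ests = np.array([ht_estimate(z) for z in assignments])
var_enum = p @ (ests - p @ ests) ** 2

# ...equals w' Cov(v) w, where v stacks treated/control indicators.
V = np.hstack([Z, 1 - Z])                       # shape (|support|, 2n)
mu = V.T @ p
C = (V * p[:, None]).T @ V - np.outer(mu, mu)   # covariance of indicators
w = np.concatenate([y1 / pi1, -y0 / pi0]) / n
var_quad = w @ C @ w

print(np.isclose(var_enum, var_quad))  # → True
```

Swapping in a different distribution `p` (or a different support) yields the variance of the same estimator under a different design, which is the sense in which arbitrary designs become directly comparable.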

Paper 1 of 4: On Bounding and Estimating the Variance of any Linear Estimator in any Experimental Design

Paper 2 of 4: A New Variance Estimation Principle

Paper 3 of 4: On Regression Adjustment for any Experimental Design

Paper 4 of 4: New Regression Estimators

Optimized Variance Estimation under Interference and Complex Experimental Designs

Selected Manuscripts

How to Account for Alternatives When Comparing Effects

Exact Bias Correction for Linear Adjustment of Randomized Controlled Trials

A Unified Theory of Regression Adjustment for Design-based Inference

A Class of Unbiased Estimators of the Average Treatment Effect in Randomized Experiments

Unbiased Estimation of the Average Treatment Effect in Cluster-Randomized Experiments

Bias Amplification and Bias Unmasking

Contact

joel.middleton [at] gmail.com