My latest project proposes an elegant and general framework for experimental design and inference built on novel notation and spectral analysis. The framework allows comparison of arbitrary designs, including cases for which no existing analytical results are available, whether because of setting-specific research constraints, complex assignment dependencies, or spillover effects.
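To give a concrete flavor of design-based inference (this is a generic illustration, not the framework from the papers below): for a small completely randomized design, the assignment distribution can be enumerated exactly, which makes the Horvitz-Thompson estimator's unbiasedness and its exact design variance directly computable. The potential outcomes and design here are hypothetical toy values.

```python
import itertools
import numpy as np

# Toy illustration of design-based inference (hypothetical data, not from the
# papers): enumerate every assignment of a completely randomized design and
# compute the exact distribution of the Horvitz-Thompson ATE estimator.

y1 = np.array([3.0, 5.0, 2.0, 4.0])  # potential outcomes under treatment
y0 = np.array([1.0, 2.0, 1.0, 3.0])  # potential outcomes under control
n = len(y1)
m = 2  # number of treated units in the completely randomized design

# All equally likely assignment vectors with exactly m treated units.
assignments = [z for z in itertools.product([0, 1], repeat=n) if sum(z) == m]
p_treat = m / n  # marginal treatment probability for every unit

def ht_estimate(z):
    """Horvitz-Thompson estimate of the ATE for one assignment vector."""
    z = np.array(z)
    return np.mean(z * y1 / p_treat - (1 - z) * y0 / (1 - p_treat))

estimates = np.array([ht_estimate(z) for z in assignments])
ate = np.mean(y1 - y0)  # true average treatment effect

print("true ATE:", ate)
print("mean of HT estimates (unbiasedness):", estimates.mean())
print("exact design variance:", estimates.var())
```

For designs with dependence or interference the assignment distribution is no longer this simple, and closed-form variances generally are not available; that gap is what the bounding and estimation results below address.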

Project: Unifying Design-based Inference

Paper 1 of 4: On Bounding and Estimating the Variance of any Linear Estimator in any Experimental Design

Paper 2 of 4: A New Variance Estimation Principle

Paper 3 of 4: On Regression Adjustment for any Experimental Design

Paper 4 of 4: New Regression Estimators

Optimized Variance Estimation under Interference and Complex Experimental Designs

Selected Manuscripts

How to Account for Alternatives When Comparing Effects

Exact Bias Correction for Linear Adjustment of Randomized Controlled Trials

A Unified Theory of Regression Adjustment for Design-based Inference

A Class of Unbiased Estimators of the Average Treatment Effect in Randomized Experiments

Unbiased Estimation of the Average Treatment Effect in Cluster-Randomized Experiments

Bias Amplification and Bias Unmasking


joel.middleton [at]