Seminar 217, Risk Management: Better Lee Bounds (Online)

Submitted by Brandon Eltiste on January 04, 2021
Location: Online
Time: Tuesday, April 13, 2021 - 11:00
About this Event

Vira Semenova, UC Berkeley

ABSTRACT: This paper develops methods for tightening Lee's (2009) bounds on average causal effects when the number of pre-randomization covariates is large, potentially exceeding the sample size. These Better Lee Bounds are guaranteed to be sharp when few of the covariates affect both selection and the outcome. If this sparsity assumption fails, the bounds remain valid. I propose inference methods that enable hypothesis testing in either case. My results rely on a weakened monotonicity assumption that only needs to hold conditional on covariates. I show that the unconditional monotonicity assumption that motivates traditional Lee bounds fails for the JobCorps training program. After imposing only conditional monotonicity, Better Lee Bounds are found to be much more informative than standard Lee bounds in a variety of settings.
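For readers unfamiliar with the baseline method the talk builds on, the classic Lee (2009) bounds can be sketched as follows. This is a minimal illustration of the standard (unconditional) trimming procedure, not the paper's covariate-based tightening: under unconditional monotonicity, the treated-and-selected outcome distribution is trimmed by the excess selection share before comparing group means. The function name and simulated data are illustrative.

```python
import numpy as np

def lee_bounds(y, d, s):
    """Classic Lee (2009) trimming bounds on the average treatment effect
    for the always-selected, assuming treatment weakly increases selection.

    y : outcome (only meaningful where s == 1)
    d : binary treatment indicator
    s : binary selection indicator (e.g., employed / observed)
    """
    s1 = s[d == 1].mean()            # selection rate among treated
    s0 = s[d == 0].mean()            # selection rate among controls
    p = (s1 - s0) / s1               # share of treated-selected to trim

    y1 = y[(d == 1) & (s == 1)]      # observed outcomes: treated, selected
    y0 = y[(d == 0) & (s == 1)]      # observed outcomes: control, selected

    q_lo, q_hi = np.quantile(y1, [p, 1 - p])
    lower = y1[y1 <= q_hi].mean() - y0.mean()   # trim away the top p share
    upper = y1[y1 >= q_lo].mean() - y0.mean()   # trim away the bottom p share
    return lower, upper
```

The paper's contribution, roughly, is to perform this trimming conditional on a high-dimensional covariate vector and aggregate, which both weakens the monotonicity requirement and tightens the resulting interval.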