LJAF works to develop and support initiatives that encourage governments and nonprofit organizations to help build the evidence base for social interventions and to consider reliable evidence as one of the primary factors in their decisions. The goal is to ensure that limited resources are spent wisely on programs that produce meaningful, lasting improvements in people’s lives.
In addition, LJAF is bringing policymakers, researchers, and data experts from the public and private sectors together to strengthen the infrastructure and processes needed to support evidence-based decision making. We promote new government and philanthropic funding models that emphasize the use of evidence. We also facilitate strategic partnerships to accelerate progress and support efforts to expand access to privacy-protected data.
Many of LJAF’s grants in evidence-based policy focus on a key goal: expanding the body of social interventions backed by strong, replicated evidence of sizable effects on important outcomes such as educational achievement, workforce earnings, criminal arrests, substance abuse, and hospitalizations. Specifically, we seek to fund rigorous evaluations – mainly randomized controlled trials (RCTs) – of interventions backed by promising prior evidence, with the goal of moving them into the strong-evidence category. Please see the website’s Request for Proposals page for funding opportunities in this area.
As an illustrative example, we are funding a large, multi-site RCT of Bottom Line – a program that provides one-on-one guidance to help low-income, first-generation students get into and graduate from college. The study is measuring college enrollment, persistence, and completion outcomes for a sample of about 2,400 students over a seven-year period. Early impact findings are promising (e.g., an 8 percentage-point increase in college enrollment in the second year after random assignment, and a 14 percentage-point increase in enrollment in a four-year college, versus the control group).
We also systematically monitor the program evaluation literature, and report on rigorous evaluation findings on our Social Programs That Work and Straight Talk on Evidence websites. We invite readers to subscribe to the Evidence-Based Policy team’s newsletter.