
“Reshaping institutions: Evidence on aid impacts using a pre-analysis plan” by Katherine Casey, Rachel Glennerster, and Edward Miguel

This article is a good example of when a pre-analysis plan was useful. Katherine Casey, Rachel Glennerster, and I exploited the random assignment and participation of communities in a governance program to evaluate its impact. Our use of a pre-analysis plan turned out to be very valuable in improving the validity of our findings.

We set out to assess whether a “community-driven development” project, implemented in post-conflict Sierra Leone, improved local economic outcomes and institutional capacity in the form of democratized action, decision-making, and inclusivity. We found short-run positive effects on local economic outcomes, but no evidence of strengthened institutions.

In the paper, we discuss the general benefits of using a pre-analysis plan, or PAP:

“While the experimental framework naturally imposes some narrowing of econometric specifications, there is still considerable flexibility for researchers to define the outcome measures of interest, group outcome variables into different hypothesis ‘families’ or domains, identify population subgroups to test for heterogeneous effects, and include or exclude covariates. PAPs are arguably particularly valuable, therefore, when there are a large number of plausible outcome measures of interest and when researchers plan to undertake subgroup analysis.”

Moreover, “[t]he process of writing a PAP may have the side benefit of forcing the researchers to more carefully think through their hypotheses beforehand, which in some cases could improve the quality of the research design and data collection approach.”

We were careful to include some of the risks and trade-offs involved in using a PAP, including the concern “that important hypotheses will be omitted from the initial plan,” and “that the exact econometric specification laid out in advance does not describe the data as well as one that would have been chosen ex post if the authors had first ‘let the data speak,’ potentially leading to less precise estimates.”

We offer suggestions for ways to mitigate these risks and “advocate a compromise position that allows some researcher flexibility accompanied by the ‘price tag’ of full transparency—including a paper trail of exactly what in the analysis was prespecified and when, and public release of data so that other scholars can replicate the analysis—with the hope that this approach will foster the greatest research progress.”

Overall, the value of using a PAP is clear, especially when the results of this registered study are compared with those of an unregistered evaluation of the same program. As explained in the previous video, an unregistered study carried out by the World Bank found more positive impacts than we did. It is difficult to tell, however, whether selective reporting or specification searching was involved in that analysis.

You can read the whole paper, the Pre-Analysis Plan, and a summary of our evaluation for J-PAL by clicking on the links in the SEE ALSO section at the bottom of this page.


Reference

Casey, Katherine, Rachel Glennerster, and Edward Miguel. 2012. “Reshaping Institutions: Evidence on Aid Impacts Using a Preanalysis Plan.” The Quarterly Journal of Economics 127 (4): 1755–1812.

This article is from the free online course Transparent and Open Social Science Research, from the University of California, Berkeley.