CGDev Video: It’s All about MeE – Project Design by Experiential Learning
Lant Pritchett discussed a new working paper that reframes the impact evaluation debate. Monitoring and Evaluation (M&E) has always been an element of implementing organizations’ accountability to their funders, and recently there has been a push for much greater rigor in evaluations to isolate causal impacts and enable more ‘evidence-based’ approaches to accountability and budgeting. Pritchett and his co-author extend the idea of impact evaluation, showing that its techniques can be directly useful to implementers rather than serving only as a potentially threatening accountability mechanism. They introduce the concept of experiential learning (“e”), which allows implementing agencies to leverage monitoring data to search across alternative project designs. Within-project variations in design can serve as their own counterfactual, dramatically reducing the incremental cost of evaluation and increasing its usefulness to implementers. The right combination of M, e, and E creates space for innovation and organizational capability building, while at the same time providing accountability and an evidence base for funding agencies.