The international development sector is increasingly obsessed with evidence, results, and value for money. Most aid workers have horror stories about this trend: being told that ‘empowerment’ is not a SMART indicator, asked to quantify the value of training, or struggling to fit a complex project into a rigid proposal format.
This ‘evidence agenda’ has driven a fierce debate. Proponents claim that a focus on results improves the quality of aid programmes and ensures that scarce resources are well used. Critics argue that donors over-emphasise quantitative, experimental evidence – typified by the growing role of randomised controlled trials. They believe that this encourages programmes to focus on easily countable outputs, such as distributing buckets or injecting babies, and prevents staff from working on longer-term, more transformative development, which might not be easily counted or photographed.
These fascinating questions have been discussed in depth in FP2P’s famous Wonkwar and at the Big Push Forwards conference. However, these debates have a tendency to get stuck in a loop, preaching to the converted and damning the unbelievers.
With this in mind, I present five recommendations for the vast majority of aid workers stuck in the middle. They accept the right of donors to demand accountability – but are worried that this might undermine accountability to governments and communities in the countries where they work. They understand the importance of effective monitoring and evaluation – but struggle under the weight of unrealistic or downright absurd requirements.
First, assume good faith. Both sides of the debate have a tendency to demonise their opponents. But proponents of evidence-based approaches aren’t trying to return Africa to a colonial state, as is sometimes sinisterly implied. Similarly, critics of quantitative research aren’t trying to hide a terrible aid programme. Assume that the person you’re talking with is genuinely interested in doing the best job they can, and everything will go much better.
Second, find a common language. The development sector has enough jargon without inventing more for the purposes of this argument. Many critics of the evidence agenda root their argument in concepts of different epistemologies and forms of knowledge, which don’t help anyone outside a narrow academic background. The same goes for economists who wield quasi-experimental jargon like a battering ram. Keep language accessible, and tell off others when they don’t do the same.
Third, pick your battles carefully. You may think that ‘counting beneficiaries’ is an insulting, top-down simplification of the complexity of your programme. It doesn’t matter – in all likelihood, you’ll still have to do it. Save your effort for more important arguments, such as when you’re asked to restructure your programme to fit donor priorities, or your advocacy project is rejected on the basis of a cost-benefit analysis.
Fourth, find common ground. Stereotypes about narrow-eyed, viciously focused, calculator-wielding economists are seldom fair. The maestros of randomised controlled trials – Abhijit Banerjee and Esther Duflo – make plentiful use of qualitative data, continually trying to understand why interventions do or don’t work. Almost all development workers are well aware of the limitations under which they work. Be honest about the problems with the tools you use, and you’ll often find that you’re not alone.
Finally, distinguish between inappropriate tools and appropriate tools used badly. In general, even the worst methodologies have some value. Seek it out, and use it to your advantage. For example, logical frameworks have a terrible reputation – but for simple programmes, they’re an excellent way to focus attention on a single goal and achievable outcomes. Even for complex programmes, rigorous thinking can help clarify exactly what you want to achieve. The problems come when they cease to be tools and become an unchangeable statement of what the project aims to achieve. Try to recognise what is and isn’t valuable, and use it accordingly.
NOTE: We normally put rough notes on all our posts – the discussion between the Aidleap team prior to posting. Unfortunately this time they were too busy to develop rough notes – so please add your own comments below and we’ll reply.