
How to measure the impact of south-south research and knowledge exchange initiatives?


South-south knowledge exchange and sharing is widely agreed to be a good idea. Its benefits are many, but so are its challenges. There is plenty of interesting literature on these challenges, and although there are no definitive answers on the best recipe for south-south sharing, we at ELLA can offer some pointers on how best to approach it, and advocate for it as a good investment. After all, that is what we do: we mix research, exchange and learning to inspire development policies and practices that are grounded in evidence about what works in varied country contexts, sharing experiences across the ‘global south’.

The challenge we want to discuss here is the difficulty of demonstrating the impact of these initiatives, and what we have done to overcome it. In its first phase, ELLA’s objective was to give decision makers in Africa and Asia better access to knowledge about what has worked in Latin America, what has not, and why, across a range of development issues. The intention was for these policymakers and practitioners to understand and use this knowledge, incorporating the evidence and lessons into their own decisions (an impact on policy and practice). The knowledge that ELLA generated was to be disseminated far and wide to many potential users, in many organisations and across many countries. We therefore had two main issues to tackle: identifying the link from research and knowledge to policy and practice, and showing that the link was made because of ELLA.

To frame the problem we were dealing with, Carden’s (2009) book Knowledge to Policy argues, in relation to the research-to-policy link, that “At best, research is only one element in the fiercely complicated mix of factors and forces behind any significant governmental policy decision. Policies in most governments, most of the time, are the outcomes of all the bargains and compromises, beliefs and aspirations, and cross-purposes and double meanings of ordinary governmental decision making.” (Section 1, Chapter 2). In relation to impact, Morton (2015) argues that the difficulty of demonstrating the impact of research and knowledge on policy and practice stems partly from the fact that research tends to have an indirect impact: it is used partially or in modified form, and it influences debates over a long period of time, usually in iterative ways. She highlights timing, attribution and context as the key issues in assessing impact on policy and practice.

In this complex environment, with different actors, caveats and the impossibility of tracing a straight line from research to policy influence, doing Monitoring and Evaluation is difficult. But we started with the same questions as in any other development programme: WHY are we doing monitoring and evaluation? WHAT specific objectives are we trying to achieve? WHO are we working with? And HOW are we going to do it? To answer these questions we needed to go back to our logframe.

ELLA’s higher-level impact aim is to see poverty reduced and the quality of life of poorer people improved, predominantly in Africa[1], as a result of decision makers using evidence from Latin America to inform policies and practices. To achieve this, decision makers, policy influencers and practitioners need to have seen and learnt from Latin American (and comparative African) evidence on selected issues. And this is where we start. How do we provide the evidence that this is happening? How can we keep track of the different sorts of impact we are having? How can we make sure that the research we are producing and sharing is reaching our key stakeholders?

This is definitely not straightforward, but as practitioners convinced of the value of south-south sharing initiatives, it was paramount for us to find ways of capturing our impact and demonstrating its value.

In the first phase of ELLA (2011-14), we had a long list of tools that included participant surveys, website traffic indicators and the other usual M&E instruments. The innovation, and what proved most useful, was recording “impact cases”. This was done through our partners in Africa, who were responsible for capturing any and every testimony that might be useful. They actively looked out for, and kept a register of, comments, quotes and any evidence of where ELLA knowledge was having an impact. In their reports, our partners followed the feedback from learning alliance members (online, in national learning group meetings and on study tours) to track what people were saying about how they were using ELLA knowledge and learning. Surveys and interviews also played their part: participants in the learning alliances, study tours and national learning groups completed surveys. We then put these cases together and, by reviewing each one, analysed and categorised the different sorts of evidence of where ELLA knowledge had had an impact. This gave us tangible definitions with which to construct a framework for gathering all the registered cases, and it also helped us understand the different monitoring demands of each one. After this analysis we came up with the following categories of impact of ELLA knowledge:

  • Individual development: inspiration, a new vision, new ideas
  • Organisational development: realignment of thinking, policy, strategy and practices
  • ELLA knowledge as reference material: for designing projects, for answering enquiries, as material for training and for students
  • Support for developing research ideas and proposals
  • Sharing with partners and with networks, for debate and discussion
  • Input for evidence-informed advocacy campaigns
  • Input to public debate and discussion of public policies
  • Support for the development and building of new partnerships and networks within countries, across countries and across continents

These categories, although not clear cut and with some overlaps, helped give order and sense to all the information gathered. With the framework in place we could register and count the impact cases more easily, which showed that ELLA significantly exceeded its logframe indicator targets. The framework also helped us identify interesting cases: for example, ELLA knowledge and learning was used in setting up an extractive industries think tank in East Africa, by an adviser to the Comprehensive African Agriculture Development Programme, as an input to the Kenyan Climate Change Action Plan, and to re-think climate-related disaster risk management at municipal level in South Africa. These, among many others, were some of the impacts of the first phase of ELLA.
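
For readers who like to see the bookkeeping made concrete, here is a minimal sketch, in Python, of the register-and-tally exercise described above. It is purely illustrative: ELLA’s register was not a piece of software, and the field names and the example case below are hypothetical, though the category labels follow the typology listed above.

```python
# Illustrative sketch only: registering impact cases against a typology
# of categories and tallying them. Field names and the example case are
# hypothetical; the category labels follow the typology above.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ImpactCase:
    source: str        # where the testimony was captured, e.g. a partner report
    country: str
    description: str
    categories: list[str] = field(default_factory=list)  # a case may span several

def count_by_category(cases: list[ImpactCase]) -> Counter:
    """Tally registered impact cases per category (overlaps are allowed)."""
    tally = Counter()
    for case in cases:
        tally.update(case.categories)
    return tally

# Hypothetical example: one registered case tagged with two categories.
cases = [
    ImpactCase(
        source="partner report",
        country="Kenya",
        description="ELLA evidence cited as input to the Climate Change Action Plan",
        categories=["input to public debate", "organisational development"],
    ),
]
print(count_by_category(cases))
# Counter({'input to public debate': 1, 'organisational development': 1})
```

Counting cases this way also makes the overlaps between categories visible, rather than forcing each case into a single box.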

But Monitoring and Evaluation is incomplete without Learning, and for ELLA2 we have put into practice many of the lessons we learned in the first phase. The list of lessons is long, and many improvements have been made to the programme based on what we learned and what worked in phase one.

The M&E plan for the second phase of ELLA (2014-17) aims to ensure that we have a clear picture of those accessing the research, and of those discussing and using it. The nature of the ELLA programme and the planned research uptake activities (online learning alliances, study tours, etc.) give us the advantage of good data not only on who is accessing the research but also on the discussions they have around it. The biggest challenge, though, is measuring the use of this knowledge and following up on its impact. To overcome this, we have selected several tools that will let us assess and follow up on this impact. In this round we are using a standardised Research Uptake log[2] from the beginning (a minimal sketch of such a log appears below). It will be managed by our research centres, who will record in it any impact they encounter. They will keep track of the main activities that take place and record comments, anecdotes and any informal or anecdotal evidence about the use of research or advice (and, wherever possible, who was involved). This can prove a useful tool for deeper analysis once a number of stories have accumulated, or for following up particular cases in depth.

We will also build upon the categories we created in the first phase of ELLA and continue to develop the typology of ELLA impacts, and we will work more closely with our partners (the research centres) to develop the standard tools. The research centres will take the lead in applying the surveys and keeping the logs, while we lead on drafting the surveys, gathering the input, analysing results from the six learning alliances, and contracting two impact assessments of the learning alliances. We are confident this will yield rich material that will enable us to understand the impact the programme has had, and to extract interesting lessons for future ELLA work.
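
As an illustration of what such an uptake log might look like in practice, here is a minimal sketch assuming a simple CSV file kept by each research centre. The field names and the example entry are hypothetical; they simply mirror the kinds of information described above (activities, evidence, who was involved, follow-up).

```python
# Illustrative sketch only: one possible shape for a research uptake log,
# kept as a CSV file. Field names and the example entry are hypothetical.
import csv
from datetime import date

LOG_FIELDS = ["date", "activity", "evidence", "people_involved", "follow_up"]

def record_uptake(path, activity, evidence, people_involved="", follow_up=""):
    """Append one dated entry to the uptake log, writing a header on first use."""
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=LOG_FIELDS)
        if f.tell() == 0:  # empty file: start with the header row
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "activity": activity,
            "evidence": evidence,
            "people_involved": people_involved,
            "follow_up": follow_up,
        })

# Hypothetical example entry:
record_uptake(
    "uptake_log.csv",
    activity="online learning alliance discussion",
    evidence="participant reported using an ELLA brief in a funding proposal",
    people_involved="national learning group member",
)
```

Keeping entries dated and structured like this is what later allows stories to be grouped into the impact typology and followed up for in-depth analysis.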

By

Alicia Quezada, Practical Action Consulting Latin America Manager

Andrea Baertl H., ELLA Programme Research Uptake Adviser

[1] In ELLA 1 we focused on Africa and Asia.

[2] We were also inspired by Hovland’s paper (2007), which suggests (along with Jones, 2011) keeping logs (progress journals, uptake logs, impact logs) to track the direct responses that research outputs trigger.

