And so I found myself at the London International Development Centre this week in one of those periodic soul searchings about how to get NGOs and researchers to work together more and better. Some observations:
I’ve posted before on this topic, but this time there was a sense that things are changing – a number of academics seem to have moved beyond seeing NGOs as mere sources of data and channels for disseminating their findings, and NGOs are more interested in joint research. Lots of stories of deep collaboration, and probably a need to map such examples and see what can be learned from them.
Two new drivers stand out as reasons for this increased level of interest – from the NGO side the demands from funders that they sharpen up on evidence and results; from the academics, the pressure (in the UK anyway) to demonstrate the impact of their work in order to guarantee future research funding through the Research Excellence Framework exercise (which is driving everyone nuts at the moment). There are some pots of money from DFID, ESRC etc backing this up, eg Research for Health in Humanitarian Crises.
These new pressures are additional to pre-existing ones – academics’ personal commitment to change, plus a desire to get access to guinea pigs and data for their theories; NGOs’ desire to have more impact, solve problems and generally understand what the hell is going on.
What emerged for me were some do’s and don’ts:
– Individual bilateral relationships between NGO folk and academics seem more durable and useful. It means you know who you trust and respect in advance, and so can pick up the phone when the need arises. How can we deepen and broaden such networks, given how busy everyone already is? It takes time and investment, but who is going to fund it? ‘No-one’s investing in the air quality of the relationships’.
– The clearer the joint purpose, the more likely it is to work out (eg Sightsavers’ Trachoma Mapping project, but also, arguably, the various post-2015 collaborations).
– Beyond research, a lot of universities are now exploring training as a way of raising cash and using their expertise in more practical ways. Advocacy training emerged as a possibility (although I’ve never found academic papers on advocacy very convincing.)
What doesn’t work (at least in my experience):
– I’m a bit sceptical of database-type clearing houses – they ignore the issue of how to build trust. That said, it depends how they are done. Oxfam has had a good experience with the Research Matching Facility, a dating agency where we were able to air a problem (improving the methodology for field staff impact assessment) and then interview potential academic partners to help solve it (we went with UEA and are very happy with the partnership).
– Academics trying to park their students with you, usually in August when everyone is on holiday (or in blue-sky sessions like this one). A lot of work needed to align student and NGO needs.
– Grandiose attempts to forge strategic partnerships from scratch (although building on existing links might work).
There were useful suggestions for building deeper ‘evidence literacy’ among NGO staff who currently can fall into two equally unhelpful extremes: ignore evidence in favour of personal preference, or place unwarranted faith in hard numbers, however dodgy their provenance. In a classic example of NGOs’ tendency to be ‘more Catholic than the Pope’ on stuff they don’t really understand, evaluation academics reported that they get loads of NGO demand for numbers, but barely a flicker on process evaluation or realtime accompaniment – the things they are currently interested in exploring. ‘Everyone’s OK with the expensive household survey, but not research for learning.’
The role of research funders in all this has both positive and negative aspects. On the positive side, there are things like rapid response funding to enable researchers and NGOs to seize opportunities; on the negative, a resistance to learning approaches that emphasise qualitative methodologies and responding to events (it really messes up your logframe).
Some intriguing suggestions for how to tackle this. Given that DFID is so influenced by the big foundations like Gates and Wellcome, who may be more willing to innovate/ less risk averse (no ferocious backbenchers looking over their shoulders), people reckoned the way to shift thinking may be to target the Foundations first, rather than go straight for Government. Anyone got a decent influencing strategy for ushering Gates along the road to mixed methods, qualitative rigour and learning by doing?
So how do we build the density of relationships? Standard academic conferences are coma-inducing. I’ve been pushing speed dating (no you can’t read anything into that): 10 x 2 minute pitches instead of 1 x 20 minute panel presentation, then plenty of time for people to chat to the ones they were most interested in – a combination of Open Mike and Happy Hour.
Our recent data dive (blog in the works on that from some of the participants) might offer a model for non-data work as well – a brainstorm model, aiming at finding specific answers to real problems (eg what do we do about the climate change driven appearance of frost in Zambia?), generating funding proposals (money talks) or pilot projects. Could funders convene such things, with a promise to look kindly on the results?
I cynically wondered if, post-REF, the normal academic incentive system would kick in and working with NGOs would become frowned on again, but the accies in the room assured me that impact is here to stay, and with it the importance of academics learning to work with partners who can communicate in more accessible terms than they can (which often means NGOs).
Top candidates for greater collaboration?
– Find better ways to involve students (Masters and/or Doctoral)
– The MEL/evidence/results agenda
– Training (but I’m sceptical about advocacy)
– Longitudinal work – we need to do it, but need help designing, funding and following through
– Realtime accompaniment: embed a researcher with some of our programme work and let them observe the real process of chaos and improvisation, rather than the retrospective coherence of a post hoc evaluation
All in all, a useful exchange and some good materials are available. Check out ELRHA’s Guide to Constructing Effective Partnerships (v practical and user-friendly) and INTRAC’s briefing note, based on its ‘Cracking Collaboration’ initiative.