Lisa Denney, Research Associate at the Overseas Development Institute, goes in search of an elusive development beast.
Much has been written recently on the need for more flexible and adaptive development programming. This area has spawned considerable research across sectors, multiple workshops and communities of practice – but such ways of working remain in practice a little like the mythical sasquatch: reportedly existing but rarely seen. There are, of course, sightings, but they are rare and far from constituting the mainstream of development programming. And often claims of sightings, when checked by others, turn out to be just footprints in the forest. So amidst the crescendoing whispers of flexible programming, I was pleased, while conducting action research for the Overseas Development Institute with The Asia Foundation in Cambodia, to sit in on what I considered to be a genuine effort to work in highly flexible and adaptive ways. This was through a process The Asia Foundation calls ‘strategy testing’ (and conveniently, The Asia Foundation has just published a paper on how it’s done).
Under a partnership with the Australian Department of Foreign Affairs and Trade, roughly 15 Asia Foundation programmes are being encouraged to ‘work flexibly’. These range from programmes on violence against women and economic development in Bangladesh, to land reform in Vietnam, to solid waste management in Mongolia and Cambodia – the programme I have been working alongside. Undoubtedly these programmes – like all others that claim to work flexibly – sit on a spectrum with varying degrees of embrace of flexibility. There is a big difference, for instance, between changing some inputs in your logframe once or twice and regularly revising your theory of change (and thus also your programme activities) to reflect changing dynamics. The strategy testing session I sat in on suggested this programme lies towards the more flexible end of the spectrum, with programme teams meeting every quarter to question the underlying assumptions of their programming and commit to change where required.
Here’s how it worked.
The programme team came together for a full day set aside for ‘strategy testing’. The Country Representative joined, adding a level of oversight but also some fresh ideas from someone slightly outside the day-to-day of programming. The Program Manager sketched out on a board all the aspects of current programming (focused on making Phnom Penh cleaner) and asked the team about the assumptions implicit in the belief that these strategies would get them to their developmental goal. What had the team learned over the last few months that supported or questioned those assumptions? Where assumptions were no longer certain, these were highlighted as requiring further discussion or adjustment. Key blockages to achieving the desired outcome were noted. Results from monitoring of programme activities were presented. Some things were getting better, some appeared to be getting worse, and the team dug into the various reasons that might explain this, ultimately deciding to make a small investment in some deeper-dive research to help explain some counterintuitive results (why wasn’t a pilot project working as intended?) and to help refine the programme.
After lunch, alternative programming approaches were discussed. Should some activities be ditched and new ones tried? Should multiple strategies be tried simultaneously to see which yields better results? What is required to make an extra push on activities that seem stuck? What new relationships are needed that don’t exist already? By the end of the day, the team had decided to trial a new approach to the problem (ending the monopoly contract for waste collection), while continuing the existing approach (improving the performance of the monopoly contract holder), and to revisit both in a couple of months to see which appeared to be gaining more ground and whether one should become the dominant approach. This was no insubstantial change. It required bringing in new staffing expertise in legal frameworks and waste management governance, engaging with new political counterparts and, potentially, winding down existing activities and relationships with the monopoly contract provider. It fundamentally altered the theory of change and the inputs and outputs of the programme: a shift from working within the existing system to deliver improved waste collection by improving contractor performance, to changing the waste collection system itself so that competition between multiple providers drives improved performance.
As one of the team members remarked, it felt like the ground had been pulled from under their feet. And this happens every three months – with the changes, the reasons for them and their implications summarised in a quarterly report. It was incredibly refreshing to see such rigorous and apparently regular critical reflection on programming choices and effects that genuinely re-oriented the programme approach. And one gets the sense that working flexibly should feel unnerving for those involved, because it requires constant motion and regular change, reflecting the fact that not only is the context evolving, but the team’s knowledge, relationships and thus opportunities are also evolving. Flexible programming should take advantage of that. Theories of change and programme strategies are thus never complete pictures – they are a best guess at a given time, based on the team’s current knowledge.
Clearly not everyone can work this way (and maybe that’s not even desirable). You need the requisite budget flexibility to divert resources (though not necessarily large amounts) quickly. You need reporting flexibility, so that donor reporting requirements don’t limit your ability to change inputs, outputs and outcomes. And you need staff who can cope in that environment – who are limber enough to jump from working one way to another within days and quickly build the relationships and knowledge needed to do so. This is in keeping with much of the literature on working flexibly, but it also showcased ways of working flexibly in practice that other programmes could learn from. ODI and The Asia Foundation have been partnering over the last 18 months to document three of these flexible programmes to see how exactly they’ve done it – more on that soon. It’s this that the community of practice now needs to grapple with: moving beyond talk of the benefits of working flexibly, and even examples of programmes that have done it, to the tools that make it possible – tools that others can start using.