Let’s Talk About Another ROI - the Risk of Ignoring Collaborating, Learning, and Adapting

Oct 25, 2018 by Monalisa Salib

This blog post was written by Monalisa Salib, Deputy Chief of Party of the USAID LEARN contract.

[Image: Lake Turkana]

As collaborating, learning, and adapting (CLA) champions, you may have colleagues asking you, “well, what’s the return on investment - or ROI - of CLA?” In other words, “why should I invest in CLA?” It’s a fair question; anyone considering a new behavior or approach should consider the advantages and disadvantages of doing so.

But like many other things in life, it’s really hard to figure out the ROI on intangible things like collaborating, learning, and adapting1. While we’ve articulated what CLA practice looks like in the CLA maturity tool, it’s very difficult to go from that to quantitatively measuring the effect or contribution of improvements in CLA practice on organizational performance or development outcomes. This is neither surprising nor uncommon; work in peacebuilding, women’s empowerment, and resilience faces similar measurement challenges. Qualitative data are much easier to come by (see the CLA case competition submissions and the CLA case competition analysis) but do not translate neatly into a definitive ROI message.

So when I first heard2 the idea of flipping ROI on its head, I was intrigued. Instead of trying to come up with quantitative data about the Return on Investment for CLA, what about sharing stories about the Risk of Ignoring CLA?

What do we risk by not systematically and intentionally collaborating, learning, and adapting? What do we stand to lose if we don’t resource and integrate CLA? Based on this (almost comical) development #fail story, the answer is: A LOT.

Activity: Lake Turkana fish processing plant, Kenya. Funded by the Norwegian government for $22 million (1971)3

Risk of Ignoring CLA (or where it went wrong): The activity was supposed to increase job opportunities for the Turkana people through fishing and fish processing for export. However, the Turkana are nomadic and had no history of involvement in the fishing industry. The plant shut down after only a matter of days.

How CLA integration could have helped avoid this: It’s fairly obvious that even a little CLA in this case could have averted this failure; implementers would have known more about the Turkana people and avoided setting up a static opportunity for a nomadic people. The risk of ignoring (ROI) here was, at the very least, $22 million, and likely also included the implementing agency’s reputation and credibility and the trust between it, the Turkana people, and the Kenyan government.

While this story could be classified as extreme, it represents common errors we have all seen in development programming: not collaborating sufficiently with local stakeholders, not understanding the local context well enough to design relevant interventions, and not using a “fail fast” approach that avoids sinking millions into approaches unlikely to succeed.

This got me thinking about a thought experiment: what if our CLA Case Competition winners never intentionally integrated CLA? What could have happened? What would have been the ROI - Risk of Ignoring - CLA on development outcomes?

One of my favorite cases from 2018 was MSI and USAID/Senegal’s submission about evaluation use on a Government-to-Government (G2G) activity. Here’s the context: last year, USAID/Senegal undertook an evaluation for a G2G activity that was piloting an innovative national-to-regional funding scheme for health service delivery. This evaluation was well timed to inform the design of a follow-on activity. Instead of a traditional evaluation in which the evaluation team provides recommendations, the mission “CLAed” its evaluation by facilitating recommendations workshops among the primary stakeholders, including both the Mission and the government of Senegal. This resulted in an agreed upon list of recommendations owned by those responsible for designing and managing the follow-on.

The risk of ignoring CLA integration in this case would have likely been:

  • Wasted financial and staffing resources on an evaluation that would not have been used
  • A newly designed activity that wouldn’t have taken into account learning from the current activity
  • Potential tension with the government over how the activity should be adapted to improve development outcomes

However, because CLA was so intentionally integrated into the evaluation process, quite the opposite happened:

  • Recommendations were co-created with host government and USAID stakeholders and directly informed the design of the new activity
  • The relationships between the government of Senegal and USAID were strengthened - the government of Senegal called the approach used to develop recommendations in a collaborative way “revolutionary.”

There are countless examples like this of what could have happened had CLA not been integrated. The Risk of Ignoring needs to play into our calculus more when championing CLA approaches; it can often be a powerful caution sign to colleagues. So if your colleagues ask you about the return on investment of CLA, perhaps ask them about the risk of ignoring it.

Check out this resource for more information about Utilizing and Learning from Evaluations.

1As a side note, we investigated whether a cost-benefit analysis type of study would be feasible under our Evidence Base for CLA work and found that any conclusions would be heavily caveated, making this line of inquiry not worth the investment.

2Credit goes to Robert Otrembiak during a recent Organization Development Network webinar.



I find it interesting that there have been no comments on this blog post because it raises an important issue. CLA is a business process and, like others (e.g., strategic planning), may well be "really hard" to assess in terms of pure ROI or cost-benefit analysis. But must we settle for purely qualitative methods that often border on no more than anecdotal evidence to assess the approach's hypothesized benefits?

As Ms. Salib rightly points out, many things USAID does are associated with measurement challenges; that is the nature of much of what international development is about. However, difficult does not mean impossible. Tough-to-measure outcomes covering topics such as capacity building, resiliency and stabilization have all been evaluated using more robust quantitative techniques, generally involving indices. Is there some reason CLA should be exempt from similar scrutiny? 

Given that CLA has been in use for roughly five years, it would now be relatively straightforward to conduct ex-post evaluations of two similar projects in any sector, one designed and implemented using CLA principles and the other not (the control), to assess the differences in outcomes, if any. More ambitiously, a learning-and-adapting index could be developed to assist assessment. Perhaps this is something LER and/or LEARN is already considering. "Learning about learning" would be very valuable at this point.

posted 2 years ago
Joy Amulya wrote:

Thanks for the above comment. I agree it would be worth working together as a learning community to design an effective evaluation to look more systematically at the effects of using CLA. I imagine a mixed methods approach would help us move beyond case studies to provide robust evidence related to CLA use (or lack thereof). At my organization, we've developed a retrospective CLA qualitative research framework that provides the basis for an index capturing structures, management approaches, and practices supporting CLA. I'm sure others implementing CLA systematically also have good ideas for how to design a good impact evaluation of CLA effectiveness.

posted 2 years ago

Yes, a mixed method approach makes the most sense. Thinking out loud here, but pieces of this puzzle might include and combine Outcome Harvesting and process tracing. The former would identify the dependent variables, and the latter the independent variables. Changes in adaptive behaviour would be assessed as an intermediate outcome. The environment for learning (i.e., one independent variable) could be assessed with the help of an index built on frameworks such as your organization's and, I would think, aspects of the CLA Maturity Matrix. Without getting too complex, it would be useful to look beyond just the organizational factors to how adults learn, most likely based on the constructivism theories of Dewey and Kolb; i.e., experiential learning. Would somehow need to assess how the "learning environment" was actually reflected in both a project's design and implementation. One would need to look at both USAID and the IP in terms of adaptive practice outcomes.  

posted 2 years ago

Thomas and Joy: Thank you so much for engaging here and posting your comments. Our apologies for the delayed response to your concerns and questions. Measuring the hard to measure is definitely not something we want to shy away from.

One of USAID LEARN’s workstreams, Building the Evidence Base for Collaborating, Learning and Adapting (EB4CLA), has conducted several studies using a variety of methods to demonstrate CLA’s contribution to improved development results. For a more quantitative look at CLA’s benefits, please check out an analysis we did of the Federal Employee Viewpoint Survey (FEVS). The data demonstrated that the relationships between CLA and employee empowerment, engagement, satisfaction, and perceived organizational effectiveness proved to be strong, positive, and significant. A growing body of evidence from both quantitative and qualitative studies recognizes engagement, empowerment, and satisfaction as critical to successful organizational performance. Additionally, our qualitative study of CLA cases from 2015-2017 shows strong links between CLA, improved organizational effectiveness, and better development outcomes. Last but not least, we leveraged process tracing, outcome harvesting, and outcome mapping to conduct a ‘deep dive’ into one CLA case. This analysis involved reviewing external evaluations and interviewing key informants to verify the results in the original case. The findings highlighted that CLA contributed to an increase in safe and dignified burials of Ebola victims in Liberia.

Thanks again for sparking this conversation! Please check out our CLA Evidence Dashboard for links to more resources and studies about CLA’s benefits.

posted 2 years ago

Kristen - Thanks for your response to these comments. I appreciate the efforts being taken on this challenging subject. The 2018 analysis of the CLA case studies is particularly useful. I need to study the complete report more thoroughly, and hope to find evidence for the conclusion that "CLA practices integrated into programming improve development outcomes, organizational outcomes, or both." Based on a reading of the Executive Summary alone, however, the evidence seems circumstantial at best. Perhaps time and more rigorous study (beyond reliance on case studies) will tell. I hope the EB4CLA exercise continues its efforts in this regard.

I suggest a fruitful avenue of inquiry involves the feedback loops highlighted in Finding 4. In doing so, it is important to distinguish between single- and double-loop learning, since it is the latter that can serve as the basis for a change or adaptation of the values underlying organizational knowledge, which, when reflected in practice, are known as "theories-in-use." Observing these is the key to extracting evidence that organizational learning has occurred. While I take the point that "beginning with the experience of CLA rather than the theory" makes sense when introducing CLA practice into an organization such as a USAID Mission, research into the outcomes of the practice should be strongly grounded in well-regarded theory.

posted 2 years ago