Community Contribution

Learning from Learning Efforts

Aug 06, 2015
Shawn Peabody

“In the rush to meet the program design deadline, we were keenly aware that we weren't making enough use of the existing knowledge and expertise that exists within USAID. I often wondered, what have others tried in similar situations? What are we missing? What hard-earned lessons should we be incorporating here?” – USAID Environment Officer

I frequently hear Program Design Team members bemoan the lack of time to learn from others during program design. I think we’ve all been there. The comment above is nearly universal, and it succinctly summarizes the problem our team at Measuring Impact is trying to solve. Through a close partnership with the Bureau for Economic Growth, Education, and Environment’s Office of Forestry and Biodiversity (E3/FAB), we are building a Biodiversity Cross-Mission Learning Program to address this problem – with the ultimate goal of improving biodiversity conservation approaches across the USAID portfolio.

[Image: cover of the Measuring Impact report]

As readers of this blog will know, achieving success with social (peer-to-peer) learning efforts within the context of a single organization—and in a single country—can be difficult. Successfully implementing a learning effort across a global, decentralized organization where staff are charged with meeting many commitments is an even bigger challenge. So, before setting out to tackle this, we took some time to deepen our knowledge of the science of organizational learning, take a closer look at the institutional challenges to learning at USAID, and learn as much as we could from other social learning efforts within or supported by USAID. We conducted an extensive literature review and informal interviews with 17 gracious people who have led learning efforts with USAID and were willing to share their experiences. We detail all that we learned in this report, which I’ll note makes excellent bedtime reading.

In case your nightstand is already full, though, I’ll share with you here some of the highlights. I’m going to skip over those recommendations that would be old news to readers of this blog, such as the suggestion to use Learning Agendas and to solicit plenty of feedback from members to keep on top of issues and opportunities.

Focus on People, not Platform. Personal connections among participants are more important than the technological tools used to connect them. Platforms should be deployed to the minimum required specifications and then expanded if and when more functionality is needed. Experience shows that it is simply not true that if you build it (a website for learning) they will (automatically) come. The tools should not be the focus; rather, they should facilitate connections and learning. For example, social activities in a network, such as in-person meetings, peer-to-peer discussions on online forums, and webinars, were repeatedly highlighted as critical to promoting member engagement.

Diversify Knowledge Storage. Social learning efforts tend to focus on written documents to the exclusion of other forms of knowledge storage, but knowledge can also be stored in videos, infographics, podcasts, and other media, and embedded in the practices and organizational routines of groups such as Communities of Practice. Diversifying knowledge storage reduces the risk of loss and makes it easier for knowledge seekers to locate that knowledge later.

Record Learning Outcomes and Knowledge Products. Efforts should monitor how members use and apply the knowledge gained from learning activities, not just track knowledge products such as documents and workshops. The effort’s value becomes clearer when learning outcomes are monitored, and the lessons identified in those outcomes can be reinforced and improved through future activities. We second the Learning Lab's endorsement of outcome harvesting for this purpose.

Strengthen Activity-level Links between Monitoring, Evaluation, and Learning. The quantity and quality of lessons generated from activities are in large part determined by the monitoring and evaluation (M&E) systems that provide feedback on the impacts of actions. M&E systems that focus on tracking key results in a theory of change and on addressing questions that test the assumptions underlying that theory of change can generate evidence to inform adaptive management of interventions, while speeding the identification and verification of new lessons. Designing M&E approaches to support the Agency’s commitment to systematic learning and the use of evidence yields clearer lessons that can be compared across activities and projects, improving USAID’s investment in effective conservation and development approaches.

Recognize Different Challenges for Internal- and External-Facing Social Learning Efforts. Internal social learning efforts focused on targeted groups within the Agency tend to have difficulty sustaining activities because they often lack facilitation resources and dedicated staff time. These efforts also tend to involve only a small number of people and suffer membership loss from staff turnover. This is less often the case with external-facing efforts, which usually engage many times more people. 

Several Agency Bureaus support external-facing social learning efforts run by outside implementing partners. Because these efforts have dedicated staff and resources, they can maintain consistent leadership and a consistent online presence. Their main challenge is maintaining participant engagement, especially through online platforms. These efforts draw participants with wide-ranging interests from disparate geographic areas, which makes it difficult to identify content that is relevant to a large number of people while remaining specific enough to be novel and interesting.

Undoubtedly, many of you are currently participating in or even leading social learning efforts. We’d love to hear your thoughts and reactions to either this blog post or the full report. Have we missed any big lessons? Do you have any memorable experiences on this topic to share? Please leave us a comment below or contact us directly at [email protected].