Four Takeaways from USAID's First Evaluation Summit

Oct 26, 2018 by Frank Higdon

This post was written by Frank Higdon, Evaluation Team Lead, Office of Learning Evaluation and Research, Bureau for Policy, Planning and Learning

As USAID reorients how we do business to focus on supporting countries on the Journey to Self-Reliance, how can evaluation support that shift and how can we best use the lessons of the past to program for the future? In September 2018, more than 180 USAID staff and partners gathered in Washington, DC to begin to answer those questions. USAID’s inaugural Agency-wide Evaluation Summit, titled “The Role of Evaluation on the Journey to Self-Reliance,” provided a timely opportunity for Agency staff and partners to engage.

The Summit’s objectives were to advance knowledge about evaluation approaches and inform an evaluation action plan for self-reliance; contribute to strengthening USAID evaluation approaches that support countries on their Journey to Self-Reliance, which includes improved understanding of sustainability, local ownership, and local systems; and provide an opportunity for peer-to-peer sharing and collaboration.


The event featured more than 20 sessions and 50 presentations and allowed approximately 100 USAID staff from 50 missions and headquarters to share and learn from each other as they considered the important role that evaluation plays in supporting self-reliance. On the final day of the Summit, about 85 external participants from other federal agencies, firms providing evaluation services, and organizations advocating for strong evaluation practices joined to share their insights and perspectives.

The Bureau for Policy, Planning, and Learning’s Evaluation Team, which hosted the event, distilled discussions during the Summit into four major takeaways:

  1. USAID's body of evaluation evidence could be strengthened through increased collaboration with local stakeholders.

    • Participants were evenly divided on whether the Agency has a solid body of evaluation evidence to inform strategy and project design with self-reliance outcomes.

    • The strongest evaluation evidence was noted in these areas: 1) local systems strengthening and capacity development; 2) engaging and strengthening local partnerships; and 3) engaging and learning from local stakeholders.

    • Collaborating and learning with local stakeholders could strengthen evaluation evidence to inform programming for self-reliance.

  2. USAID evaluation questions still have considerable room for improvement across the board. With more support, they can be stronger.

    • Participants pointed out that USAID evaluation questions are still not always researchable, appropriately limited in scope, or clear.

    • Several areas would benefit from increased technical support: refining self-reliance definitions and concepts; making better use of evaluation examples and templates; providing more context-specific evaluation training; and placing greater emphasis on the development of high-quality evaluation SOWs.

  3. To meet the evaluation challenges of tomorrow, more emphasis needs to be placed on utilization, adaptability, and collaboration.

    • Most participants agreed that USAID had a full range of methods, tools, and processes to evaluate strategies, projects, and activities.

    • To strengthen our evaluation approaches, increased attention should be given to utilization-focused evaluation, developmental evaluation techniques, and evaluation synthesis.

    • There was also interest in using "co-creation" methods to build evaluation into project design and strengthen stakeholder involvement in developing evaluation recommendations.

  4. Evaluation is still not well integrated throughout the USAID Program Cycle - but giving broader support to existing best practices could help.

    • Most participants noted that evaluation is still not fully integrated into the Program Cycle.

    • Specific ways to strengthen the use of evaluation evidence include: 1) using evaluation synthesis to promote learning within sectors and in cross-cutting areas, 2) promoting co-creation of evaluation recommendations, and 3) encouraging greater access to and use of evidence from evaluations in project design and strategy development.

This input from participants will inform LER’s efforts to strengthen evaluation practice and contribute to the Agency’s Self-Reliance Learning Agenda.

So what’s next?

  • Factor input into relevant guidance materials. We will factor Summit proceedings into our existing evaluation toolkit and notify staff as updates are ready.
  • More outreach. We will reach back out to participants, presenters, and partners as the Self-Reliance Learning Agenda takes shape and we determine how evaluation will contribute to it. Please stay tuned!


Straight off the Shelf: Unpacking your Utilization-Focused Learning Agenda

Oct 2, 2018 by Shannon Griswold

Previously we shared some of our reasoning behind the U.S. Global Development Lab’s efforts to develop a Utilization-Focused Learning Agenda (ÜFLA), which emphasizes program design for the last step of the Knowledge Cycle. We’re learning as we go and we don’t have it all figured out yet, but if an ÜFLA sounds like something that you’d like to try, we’re pleased to share some principles for developing an ÜFLA that we’ve learned over the course of implementing ours.


ÜFLA Principles

ÜFLAs are behavior change interventions - design your ÜFLA Program and Theory of Change accordingly

A robust program design for a Learning Agenda might sound like overkill, but getting to USE requires a behavior change intervention for the people who are meant to use the findings. We need to identify the intended users of the learning, as well as their motivations, incentives, and opportunities for action. By doing so, we can avoid (or make plans to overcome) barriers to use, and leverage the momentum and opportunities that already exist.


But bias yourself toward action and implementation of the plan - don’t design it to death!

Remember that there is an opportunity cost to spending time and resources on design - if we take two years to answer a question, then another two years to support adaptation based on that answer, we may have missed the window in which that adaptation could have made a real difference. Don’t let the “perfect” become the enemy of the “good enough” - remember that humans learn best by doing, so be willing to try some things even if they aren’t fully baked.


Engage champions and intended users of the findings from the beginning

Design and implementation of the ÜFLA should be a transparent and participatory process. If the people whose behavior we are trying to change (the USERS of the ÜFLA) don’t feel buy-in to the process and the content from the beginning, we are going to face a real uphill battle when we get to step 4 of the Knowledge Cycle. Identify users and champions (especially executive sponsors) early via stakeholder mapping and engagement, and get their input or, better yet, their ownership of the results and adaptations.


Resource appropriately for ALL phases of the knowledge cycle

One of the biggest challenges for any Learning Agenda, and especially for ÜFLAs, is securing sufficient human and budgetary resources to implement it. It’s fairly straightforward to commission a few studies or even a synthesis, but keep in mind that if we are striving for USE of the evidence, it’s more likely to be used when it’s coming from a trusted colleague. We’ve experimented with “insourcing” some aspects of our ÜFLA by assigning our own staff to conduct original research. This also alleviates some of the pressure on the budgetary resources, but it comes with a tradeoff - the people conducting this research must have dedicated LOE to do it. Ideally, the insourced research and/or synthesis should be their primary job responsibility so that the research doesn’t succumb to the tyranny of the urgent over the important. And don’t neglect to resource the behavior change aspect of the ÜFLA program design! That behavior change effort will often be a bigger lift than the other steps combined, so if this area is under-resourced, we’ve set ourselves up for failure before we’ve even begun.


Only strive to answer questions for which we will actually USE the answers to change our decisions and/or practices

There are a lot of things out there that would be “nice to know.” We have a plethora of academic partners and other development actors who are well placed to help fill those gaps. Our value proposition in this space is “need to know” information of maximum utility. We should answer questions that have a direct implication for making different decisions or engaging in different practices than we would have without that information.


Stop at “good enough” data and evidence for decision-making

While there is always more we can know, it’s important to recognize when we know enough to act. Don’t fall into the “paralysis by analysis” trap. If we can spend 50% more money to become 5% more certain of the outcome, is that a good use of resources? Will 5% greater certainty change the decision we would have made without the extra evidence? If not, we should use our judgment and lean toward investing in implementation. (Of course, there is an important caveat: do no harm. If there is reason to believe that a decision made within that 5% uncertainty could cause active harm, this changes the equation.) Remember that there is both a financial and an opportunity cost to waiting for more evidence; it takes time to gather and analyze it, and that’s time that we’re not implementing.
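One way to make the “good enough” judgment concrete is a rough value-of-information check. The sketch below is purely illustrative (the function name and the dollar figures are hypothetical, not from this post): spend on more evidence only when the chance it changes the decision, multiplied by the value of the better decision, exceeds what the evidence costs to gather.

```python
# Illustrative value-of-information check (hypothetical numbers).
def worth_more_evidence(extra_cost, p_decision_changes, value_if_changed):
    """Return True if the expected benefit of additional evidence
    (probability it flips our decision x value of the better decision)
    exceeds what the evidence costs to gather."""
    return p_decision_changes * value_if_changed > extra_cost

# A $50k study with a 5% chance of changing a decision worth $200k
# in improved outcomes has an expected benefit of only $10k:
print(worth_more_evidence(50_000, 0.05, 200_000))  # False
```

Note that this simple check ignores the time cost of waiting for evidence, which in practice only tilts the calculation further toward acting.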

Keep your focus on who wants to use evidence, for what purpose, and what barriers they might encounter to use, and you’ll be on the right track!

The Winners of the 2018 Collaborating, Learning and Adapting Case Competition are…

Sep 12, 2018 by Amy Leo and Reena Nadler


Amy Leo is a Communications Specialist on the USAID LEARN contract and Reena Nadler is a Knowledge Management and Organizational Learning Specialist on USAID's CLA Team in the Bureau for Policy, Planning, and Learning. Amy and Reena co-manage the annual CLA Case Competition.

The fourth annual Collaborating, Learning and Adapting (CLA) Case Competition was held April 9 - May 31, 2018, and we received 127 case studies. Thank you, submitters!

The objectives of the CLA Case Competition are to:

  • Capture real-life case studies of USAID staff and implementing partners using a CLA approach for organizational learning and better development outcomes;
  • Identify enablers and barriers to CLA integration; and
  • Contribute to the evidence base for CLA.

In addition, the CLA Case Competition is an annual opportunity to check in on what’s happening with CLA integration throughout USAID’s programs. The increasing number of submissions over the past four years (57 in 2015, 63 in 2016, 100 in 2017, 127 in 2018) indicates that CLA practice (or, at least, awareness of the competition!) is on the rise.

Here are some other takeaways from an analysis of this year’s cases:

  • In 2018, we encouraged USAID missions/operating units and implementing partners to co-submit cases. And you did! Last year we received only two jointly submitted cases; this year we received 24.

  • We continue to see a concentration of CLA integration in Africa, but we received more than twice as many cases from Latin America and the Caribbean (LAC) this year--13, up from six in 2017.

So, without further ado, here are the winners of the 2018 CLA Case Competition:

USAID/Paraguay’s case, Enhancing Organizational Culture for Improved Collaboration and Effectiveness, was a favorite among judges. The case transparently and candidly describes how the mission, led by Foreign Service Nationals, focused on and revived its organizational culture in the midst of a particularly challenging set of circumstances (think: a proposed budget of $0, no Mission Director or Deputy, uncertainty about reorganization). A year later, the mission reports that it “has restored most of the positive, passionate, and constructive culture that contributes to the success of USAID/Paraguay’s objectives.” Congratulations to author Laura Alvarez of USAID/Paraguay!

Collaborating to Build Local Government Monitoring and Evaluation Capacity in Peru explains how USAID’s evaluation policy and an awareness of aid effectiveness principles led USAID/Peru to incorporate monitoring and evaluation capacity building as a specific focus of their assistance. Judges appreciated how the Mission used CLA to build local capacity, supporting USAID’s goal of fostering self-reliance. As a result of USAID/Peru’s CLA approach, regional governments are increasing the resources they dedicate to M&E and officials are expanding their use of M&E in the housing and sanitation sector and sharing information with the local population via government websites. Congratulations to authors Miriam Choy and Paola Buendia of USAID/Peru!

How CLA Improved Cooperation and Livelihoods in Central American Coffee describes how the Better Coffee Harvest program, a public-private partnership that is working to improve the livelihoods of smallholder coffee farmers in Nicaragua and El Salvador, leveraged its monitoring and evaluation activities to engage stakeholders and share information, ideas, and connections. Using a CLA approach helped Better Coffee Harvest meet its key performance targets and improve coordination across the larger coffee industry. Congratulations to authors Kate Scaife Diaz and Nick Rosen of TechnoServe!

Seeing in Systems, Working in Networks: CLA for Adaptive Peacebuilding in Myanmar describes a CLA approach in a highly conflict-affected region of northern Myanmar. Adapt Peacebuilding used CLA to help local communities sustainably drive their own peacebuilding and development outcomes with minimal international support in the midst of unpredictable and changing local dynamics. Using System Action Research (a methodology developed specifically for locally led change in complex environments), a consortium of local organizations designed and implemented activities that directly benefited more than 17,000 people and achieved several notable firsts. Congratulations to author Stephen Gray of Adapt Peacebuilding!

The Scoop on Poop: How Open Defecation Free Data Led to Activity Program Pivots in Ethiopia’s Lowland definitely wins the award for best title! This case describes how USAID/Ethiopia and AECOM adapted Community-Led Total Sanitation and Hygiene (CLTSH) interventions developed for Ethiopia’s densely populated highland areas to the harsh, remote environment of the Ethiopian lowlands. This work involved gathering information about the new operating environment, engaging with local leaders, and implementing CLTSH activities accordingly. The Scoop on Poop is a helpful model for adapting effective approaches to new contexts. Congratulations to author Nikita Salgaonkar of USAID/Ethiopia and AECOM!

Collective Action, Collective Impact through Strategic Partnerships in Northern Kenya captures how USAID/Kenya and East Africa’s Partnership for Resilience and Economic Growth platform used work plans as an adaptive management and flexible programming tool. This was particularly beneficial during the 2017 drought and election shocks and stressors in Kenya. Implementing partners that built flexibility into their work plans were able to have contingency plans and support county efforts to respond during the drought while mitigating delays in implementation due to the prolonged election process. Congratulations to authors Dorine Genga and Jennifer Maurer of USAID/Kenya and East Africa!

Improving Evaluation Use in Senegal through Recommendations Workshops explains how USAID/Senegal used a collaborative process to learn from a performance evaluation of one of their government-to-government agreements, and adapt accordingly. Judges were especially impressed by the Mission’s clear, step-by-step description of the process, which included an initial analysis of the evaluation findings, preparing stakeholders for a recommendations workshop, and developing an action plan of prioritized recommendations following the workshop. The action plan was then used as a foundation for the design of the next phase of the agreement. Congratulations to authors Lisa Slifer-Mbacke of Management Systems International and Elizabeth Callender of USAID/Senegal!

In mid-2017, shifts in the development and conservation context indicated a need for USAID/Mozambique to manage adaptively. Seeing the Forest for the Trees: CLA Strengthens Conservation in Mozambique describes the learning activities that the mission used to reinforce the use of theories of change in biodiversity programming, improve collaboration and information sharing, and provide a clear pathway to improve activities’ MEL practice and outcomes. One outcome of their recent learning workshop is a new Conservation Community of Practice, which will allow conservation practitioners from different regions to share information for the first time. Congratulations to Olivia Gilmore of USAID/Mozambique and Kathleen Flower of USAID/Forestry and Biodiversity/Measuring Impact, with co-authors from USAID/E3/Forestry and Biodiversity.

Road Map for Collaboration: Addressing Health & Nutrition Disparities Across Rwanda details how USAID/Rwanda fostered collaboration in a multi-sectoral nutrition project designed to address health and nutrition disparities in Rwanda. A project management team (PMT), including staff from both technical and support offices, conducted a mapping exercise to visualize and document where activities were taking place, developed terms of reference to document how offices should work together, organized meetings to discuss implementation and promote alignment, and provided input during design discussions for new activities. They also participated in the design and development of plans to integrate evaluation recommendations and organized an end-of-year event for the team and partners to evaluate progress and reorient for the next year. The mid-term, whole-of-project evaluation found positive results from the PMT’s coordination and collaboration model. Congratulations to authors Mary de Boer & Linda Nico of USAID/Rwanda.

Before USAID/Serbia’s Business Enabling Project (BEP) started in 2011, it was much harder to run a business in Serbia. However, USAID/Serbia found that economic policy reform is an ideal environment for CLA. Construction Permitting in Serbia: From 20 Stops to One Stop Shop describes how the mission used CLA practices to streamline the country’s construction permitting process with the introduction of an e-permitting system. This resulted in Serbia jumping to 10th place in the 2018 edition of the World Bank Doing Business Report in the Getting Construction Permits category. The reforms supported by USAID and BEP have resulted in record numbers of permits issued. The growth of the construction sector due to the streamlined permitting process has also increased the sector’s contribution to Serbia’s GDP to 6 percent. Congratulations to authors Jelena Popovic, Aleksandar Djureinovic & Laura Pavlovic of USAID/Serbia!

These winning cases will be featured here on USAID Learning Lab in the coming months and will ultimately become a part of the CLA canon--providing inspiration and direction to USAID staff and partners interested in achieving better organizational and development outcomes. Congratulations, winners and finalists!

Choosing just ten winners from 127 submissions was challenging, especially because the cases get stronger and stronger year after year. The judges’ consolation is that we also recognize an additional 20 cases as finalists. Click here to see the finalists from this year, as well as cases from previous years. The rest of the top 75% of cases submitted this year will also be posted to the CLA Case Study collection in the coming months.

How Learning Networks in Kenya Are Strengthening Health Systems

Mar 11, 2014 by Wycliffe Omanya, Mathew Thuku, and Salome Mwangi

Evelyn Wambui receives a transformational award from Capacity Kenya’s Mathew Thuku and CHAK’s Patrick Kyalo

The IntraHealth International-led Capacity Kenya Project, funded by the U.S. President’s Emergency Plan for AIDS Relief (PEPFAR) through USAID, was designed to strengthen and transform Human Resources for Health (HRH) systems in the public, private, and faith-based sectors to enhance the quality and equity of health service delivery and, ultimately, to improve health outcomes for the people of Kenya. To support knowledge sharing and learning, the project created Peer Learning Cycles (PLCs). Evelyn Wambui, Human Resources Manager at Consolata Mission Hospital Mathari in Kenya’s Nyeri County, participated in the PLCs and shared some of her experiences for this article.

1. What was the primary goal of the PLCs?

In Kenya, faith-based organizations (FBOs) have been critical to health systems strengthening and are important partners for the project. In coordination with FBO umbrella groups, the project conducted a baseline survey of 127 FBO facilities to determine the status of implementation of HRH management systems and practices. The baseline findings informed the initiation of a PLC platform for knowledge exchange, learning, and replication of HRH best practices. In 2010, the project supported the secretariats of the faith-based health facilities in developing a Generic Human Resources Management Policy and Procedures Manual. The HR Manual harmonizes sound human resource management practices across FBO member health units. Additionally, it presents a helpful starting point from which to standardize practices and to motivate and retain health workers across various categories of employers.

Graphic of Peer Learning Cycle process

2. What was the biggest value for those involved?

The PLC platform brought health facility and HRH managers together to share experiences, implement strategies and approaches, and initiate virtual task management forums, inter-facility visits, supportive supervision, and quarterly progress-sharing workshops. For the first time ever, 47 health facilities (39 hospitals, six health centers, and two medical training centers) participated.

One of the participants, Evelyn Wambui, Human Resources Manager at Consolata Mission Hospital Mathari, describes how the teams developed action plans to implement the generic HR Manual:

From the PLC forums training, I have produced information leaflets and distributed them to all departments and followed up with internal sensitization exercises. I am happy that all our staff are now aware of their rights and Human Resources Policy requirements. Another best practice we implemented was establishing a functional disciplinary committee, which we never had in our facility. The PLC sharing platform enabled me to learn the criteria for forming an effective committee. I am glad to report that we now have a committee in place in the hospital, and it is doing a great job. We have managed two cases based on the approved grievance policy.

3. What are some lessons learned?

The sequence of activities toward increasing learning and improvement in HRH practices may determine the consistency and commitment with which members engage with each other. The figure below shows the continuum of improvement in HRH practices.

Improvement of HRH Practices graph

4. Are you attempting to measure learning? If so, how?

The PLC platform provided a space where HRH Managers could learn how to track performance improvements in their own staff. Evelyn Wambui discusses the use of the Performance Improvement Plan (PIP), a technique she learned while participating in the PLC.

Prior to our action plan on performance management, it was difficult to efficiently measure our employees’ performance levels, but out of the sharing in this forum, I identified how best to boost employee performance. I’m glad to say that we recently came up with a system known as a “PIP,” which stands for Performance Improvement Plan. In this plan, we identify staff performing below average in the facility and assign them a plan that measures their performance each quarter against specific targets. We currently have two employees on PIPs scheduled for an evaluation to determine their performance and identify areas of improvement.

Read more about Learning Networks in the Learning Network Resource Center.


CLA in Action articles are intended to paint a more detailed picture of what collaborating, learning, and adapting (CLA) looks like in practice. Unlike other disciplines, CLA is not a technical "fix"; it looks different in different contexts. This series will showcase examples of intentional collaboration, systematic learning, and resourced adaptation, some of which you may find applicable to your own work. The case studies, blogs, and resources represented in this series document the real-world experiences of development practitioners experimenting with these approaches for the benefit of sharing what's possible.
