Community Contribution

Learning from Evaluations

Mar 06, 2015
Jacob Laden

Evaluations are critical to USAID and its partners because they drive learning and adapting across the agency. USAID’s 2011 Evaluation Policy seeks to improve both the quality and the quantity of evaluations. However, there is still room to strengthen how these evaluations are used and learned from, and how adaptive management is incorporated to improve development effectiveness.

Evaluation debates often focus on how best to capture impact while glossing over how the evaluation results and recommendations will actually be applied to adapt projects in ways that will improve development effectiveness. Key takeaways, lessons learned, and recommendations often get lost in a dense evaluation report with hundreds of pages of information replete with annexes and addenda. Without a plan in place for how to apply the knowledge generated by the evaluation, the process may end with submitting the report to the Development Experience Clearinghouse (DEC). Evaluators and decision-makers often struggle to understand what happened after the evaluation or explain how the knowledge produced from the evaluation was later used.

How do we ensure evaluations don’t collect dust? 

An evaluation is often misperceived as a report card for the project. It is more useful to see it as a collaborative roadmap for decisions, actions, and future planning. Several steps taken early in the evaluation process can support learning from it: determine the evaluation’s audience, and how that audience wants to receive the information, before designing the evaluation, and improve how its recommendations and lessons are developed.

Evaluation use is one of the key things to consider at the start of the evaluation process, when the statement of work (SOW) is still a work in progress. Include a dissemination plan for the evaluation in the SOW to help ensure that use doesn’t become an afterthought. Evaluations should be planned and conducted in ways that enhance the use of results to inform decisions and improve performance.

While defining the key questions an evaluation seeks to answer, consider: 

  • How will the evaluation be used? 
  • Who will use the evaluation and what is important to consider for each evaluation stakeholder? 
  • What decisions and actions will depend on the evaluation’s findings and conclusions? 

USAID and other programming or project staff who answer these questions in the SOW development stage will be better able to narrow the scope of evaluation questions, which will in turn help the evaluation team identify and articulate more useful and relevant findings.

Once the target audience is identified, develop a dissemination strategy to ensure that users receive the evaluation information in a form they can relate to. Messaging and channels of communication may differ among program officers, country and implementing partners, and other stakeholders. Think creatively about how to repackage and communicate evaluation information, beyond the report itself, in ways that are digestible and will resonate with these key users. 

Key opportunities for learning are often found in the recommendations section of the evaluation. To be useful, recommendations should be developed in a participatory way with users’ input rather than in a vacuum. When recommendations are developed in isolation from the intended users, buy-in suffers and the evaluation’s influence on future program design is weakened; this risk is why some evaluators fear making recommendations at all. 

Including the evaluation users in developing recommendations will help ground-truth their feasibility. The users may also provide insights regarding budget and political will. Without this user feedback, evaluation recommendations may be irrelevant or difficult to implement.

Evaluation recommendations should:

  • have buy-in from key users
  • be supported by specific findings and conclusions of the evaluation
  • be practical, focused, and action-oriented
  • indicate who will be responsible for the action
  • be feasible to operationalize
  • reach their intended audiences.

Engaging Audiences and Facilitating Evaluation Use 

Once the evaluation is complete and recommendations are developed, evaluators should engage with key mission stakeholders to plan actions and next steps. Evaluation staff should encourage evaluation users and other influencers to share lessons from evaluations to improve development effectiveness. Staff at every level, whether senior leadership or rank and file, should seek ways to encourage the use of evaluation results and data within missions and across the development community at large.

USAID’s ADS Chapter 203: Assessing and Learning provides specific guidance on steps to take upon completion of an evaluation:

  • Debrief the mission team or office and discuss results or findings 
  • Review the key findings, conclusions, and recommendations systematically with USAID staff and stakeholders 
  • Determine whether the team accepts/supports each finding, conclusion, or recommendation 
  • Help USAID identify any management or program actions needed, and assign responsibility and timelines for completion of each set of actions.

In addition, USAID should:

  • Determine whether any revision is necessary in the joint country assistance strategy or USAID country development cooperation strategy, results framework, or project, using all available information; and 
  • Share and openly discuss evaluation findings, conclusions, and recommendations with relevant customers, partners, other donors, and stakeholders, unless there are unusual and compelling reasons not to do so. In many cases, the USAID Mission/Office should arrange the translation of the executive summary into the local written language.

Evaluation findings should be not only accessible (where appropriate), but also actively shared to help ensure that the recommendations make it into the hands of key decision-makers.

Following Up

How do we know whether a project or program was adapted in response to an evaluation if we don’t have a system or process to follow up? Evaluators, leadership, and others should follow up with program and project staff to take stock of how an evaluation was used and how it contributed to learning and adaptation for the project, program, or even the mission. A key place to look is USAID mission country development cooperation strategies, which should reflect evaluation findings.

Follow-up questions may include:

  • Did the agreed changes happen? 
  • If so, how did evaluation findings and conclusions get used? 
  • What were the key lessons learned? 
  • How, if at all, did learning lead to course correction in the project or the Mission’s approaches?
  • What planned actions have yet to be taken?
  • What support is needed to see this through?

The Courage to Adapt

It takes courage and a little patience to reflect on evaluation lessons and use them for adaptive management. Often, technical teams have already embarked on planning Phase 2 of a project before the evaluation of Phase 1 has even begun. Adaptive management sometimes means rocking the boat. When possible, time for evaluations and adaptive management should be built into project work plans to avoid barriers to modifying a project. Even so, the evaluation process is always an opportunity to learn, adapt, and integrate new knowledge into project design. Modifications to plans and budgets are a pain, but it would be worse to ignore findings that could dramatically improve results. 

Next Steps

USAID’s Bureau for Policy, Planning and Learning (PPL) is currently conducting an evaluation utilization study, so there will be more on this topic to come.

If you’re interested in championing learning from evaluation and would like to connect with like-minded members of the evaluation community at USAID, consider joining the Evaluation Interest Group on Learning Lab. 

What are your experiences with promoting evaluation lessons? Do you have an example where evaluation findings and recommendations made a difference? USAID wants your feedback! Share your experience here.