Four Takeaways from USAID's First Evaluation Summit
This post was written by Frank Higdon, Evaluation Team Lead, Office of Learning, Evaluation and Research (LER), Bureau for Policy, Planning, and Learning
As USAID reorients how we do business to focus on supporting countries on the Journey to Self-Reliance, how can evaluation support that shift, and how can we best use the lessons of the past to program for the future? In September 2018, more than 180 USAID staff and partners gathered in Washington, DC to begin to answer those questions. USAID’s inaugural Agency-wide Evaluation Summit, titled “The Role of Evaluation on the Journey to Self-Reliance,” provided a timely opportunity for Agency staff and partners to engage on these questions.
The Summit’s objectives were to advance knowledge about evaluation approaches and inform an evaluation action plan for self-reliance; contribute to strengthening USAID evaluation approaches that support countries on their Journey to Self-Reliance, which includes improved understanding of sustainability, local ownership, and local systems; and provide an opportunity for peer-to-peer sharing and collaboration.
The event featured more than 20 sessions and 50 presentations and allowed approximately 100 USAID staff from 50 missions and headquarters to share and learn from each other as they considered the important role that evaluation plays in supporting self-reliance. On the final day of the Summit, about 85 external participants from other federal agencies, firms providing evaluation services, and organizations advocating for strong evaluation practices joined to share their insights and perspectives.
The Bureau for Policy, Planning, and Learning’s Evaluation Team, which hosted the event, distilled discussions during the Summit into four major takeaways:
USAID's body of evaluation evidence could be strengthened through increased collaboration with local stakeholders.
Participants were evenly divided on whether the Agency has a solid body of evaluation evidence to inform strategy and project design for self-reliance outcomes.
The strongest evaluation evidence was noted in these areas: 1) local systems strengthening and capacity development; 2) engaging and strengthening local partnerships; and 3) engaging and learning from local stakeholders.
Collaborating and learning with local stakeholders could strengthen evaluation evidence to inform programming for self-reliance.
Across the board, USAID evaluation questions still have room for improvement, and targeted support could make them stronger.
Participants pointed out that USAID evaluation questions are still not always researchable, appropriately limited in scope, or clearly stated.
Participants identified several areas for increased technical support: refining self-reliance definitions and concepts; making better use of evaluation examples and templates; providing more context-specific evaluation training; and placing greater emphasis on the development of high-quality evaluation statements of work (SOWs).
To meet the evaluation challenges of tomorrow, more emphasis is needed on utilization, adaptability, and collaboration.
Most participants agreed that USAID has a full range of methods, tools, and processes to evaluate strategies, projects, and activities.
To strengthen our evaluation approaches, increased attention should be given to utilization-focused evaluation, developmental evaluation techniques, and evaluation synthesis.
There was also interest in using "co-creation" methods to build evaluation into project design and strengthen stakeholder involvement in developing evaluation recommendations.
Evaluation is still not well integrated throughout the USAID Program Cycle, but broader support for existing best practices could help.
Most participants noted that evaluation is still not fully integrated into the Program Cycle.
Specific ways to strengthen the use of evaluation evidence include: 1) using evaluation synthesis to promote learning within sectors and in cross-cutting areas, 2) promoting co-creation of evaluation recommendations, and 3) encouraging greater access to and use of evidence from evaluations in project design and strategy development.
This input from participants will contribute to LER’s efforts to strengthen evaluation practice and also contribute to the Agency’s Self-Reliance Learning Agenda.
So what’s next?
- Factor input into relevant guidance materials. We will incorporate Summit proceedings into our existing evaluation toolkit and notify staff as updated materials are ready.
- More outreach. We will be reaching back out to participants, presenters, and partners to follow up as the Self-Reliance Learning Agenda takes shape and we determine how evaluation will contribute to it. Please stay tuned!