This Discussion Note outlines key considerations for USAID staff and evaluators when deciding whether to conduct an ex-post evaluation and then planning, designing, implementing, and using findings from one. Those commissioning an ex-post evaluation should consult with an evaluation specialist and consider...
The Summary outlines the ways in which DI's use of collaborating, learning, and adapting (CLA) mindsets and practices contributed to its development achievements.
Reports on data quality for standard indicators of USAID Brazil's Amazon Biodiversity Conservation program
Technical Report on Making Evidence from Evaluations More Accessible to Decision-Makers
In its efforts to advance understanding of how to measure the effects and effectiveness of collaborating, learning, and adapting (CLA) on development results, the CLAIM Learning Network has identified a number of learning questions around assessing CLA's 'plausible contribution' to development outcomes. These include:...
Summary report of findings from the 2016 MEL Platforms Assessment
The link provides guidance on the Evaluation Registry, which contains information on planned, ongoing, and completed evaluations. This material is available to USAID staff only.
This document provides answers to frequently asked questions (FAQs) on the Evaluation Toolkit released in September 2015.
Missions and Washington OUs must ensure that USAID implementing partners submit datasets—and supporting documentation such as code books, data dictionaries, scope, and methodology used to collect and analyze the data—compiled under USAID-funded evaluations to the Development Data Library (see ADS 579 and Frequently...
When asked about barriers to evaluation quality and use, USAID staff often cite instances when the evaluation took place at the wrong time, focused on the wrong questions, or failed to engage stakeholders from the beginning.