The USG Evaluation Forum: Celebrating 2015 as the International Year of Evaluation

2015 Year of Evaluation Forum

Important Dates

Forum
December 2, 2015

During 2015, evaluators in the international community and at the United Nations have gathered to exchange information on effective evaluation methods, tools and approaches to celebrate the International Year of Evaluation. On December 2nd, the U.S. Department of State continued that celebration, bringing together over 15 federal agencies that provide foreign assistance to create the U.S. Government (USG) Evaluation Forum.

With USG, private sector, multilateral development banks, and United Nations participation, the forum looked at the links between policy making and evaluation, strategies for facilitating learning and increasing evaluation use, and ways to do more with less—through partnerships, innovative planning and design, and interagency collaboration. Forum presenters discussed evidence and its uses, evaluability, and new methods for visualizing the data and evidence from evaluations.

The Sustainable Development Goals (SDGs) – Indicator Development and M&E Efforts Overview

On March 6, 2015, the United Nations Statistical Commission created an Inter-agency and Expert Group (IAEG) on SDG Indicators, composed of Member States, with regional and international agencies as observers. After two rounds of open consultation, the IAEG met in Bangkok on October 26-28 to review and discuss the proposed indicators. This session described the process going forward to finalize the list of SDG indicators, associated efforts such as Sustainable Development Data, and how evaluation could complement and strengthen the assessment of progress.

Presenters:

  • Cynthia Clapp-Wincek (Advisor for SDG Follow Up and Review)
  • Indran Naidoo (United Nations Development Programme)

Independent Evaluation at the Development Banks

In this session, the featured speakers discussed the role of independent evaluation at the World Bank Group and the Inter-American Development Bank (IDB). Topics included: the evaluation models and methods used at these institutions; maintaining the balance between accountability and learning; creating incentives and feedback loops to strengthen the use of learning and knowledge in programming; current priorities of the World Bank Group and the IDB’s independent evaluation units; and how these units can better partner with peer evaluators and collaborate with the greater evaluation community, including the USG evaluation network.

Presenters:

  • Caroline Heider (Independent Evaluation Group, World Bank Group)
  • Cheryl Gray (Office of Evaluation and Oversight, IDB)

Moderator:

  • Alexia Latortue (U.S. Department of the Treasury)

Strategies to Facilitate Evaluation Utilization and Learning

In this session, representatives from both within and outside the U.S. Government discussed strategies for facilitating evaluation utilization and learning. Lessons were drawn from evaluation capacity development overseas as well as from U.S. agencies’ efforts to influence policy and the scale-up of evidence-based practices. Evaluators were encouraged to maintain a close connection to policy discussions, build a broad portfolio of evidence, plan evaluations with users in mind, and help their audiences understand the evaluation and its implications.

Presenters:

  • Stephanie Shipman (U.S. Government Accountability Office)
  • Virginia Lamprecht (U.S. Agency for International Development)
  • Richard Lucas (U.S. Department of Agriculture)
  • Lauren Supplee (U.S. Department of Health & Human Services)
  • Tessie Catsambas (EnCompass LLC)

Leveraging Evaluation in an Emergency Response: The President's Emergency Plan for AIDS Relief (PEPFAR)

Evaluations have been an integral component of PEPFAR since its inception. The initiative’s size, complexity, and aggressive scale-up imposed a variety of constraints on the ability to monitor and manage its activities well. A review by the Government Accountability Office (GAO) provided the incentive to respond to these circumstances, resulting in the publication of the PEPFAR Evaluation Standards of Practice, which focus on operational issues (as opposed to policy issues) and draw heavily from standards defined by the American Evaluation Association. This session included a brief description of the PEPFAR initiative as well as presentations by each of PEPFAR’s primary implementing agencies on lessons learned from increasing evaluation and monitoring efforts and from the evaluations themselves.

Presenters:

  • Irum Zaidi (Office of the U.S. Global AIDS Coordinator)
  • Paul Bouey (Office of the U.S. Global AIDS Coordinator, U.S. Department of State)
  • Paulyne Ntube Ngalame (Centers for Disease Control and Prevention)
  • Yamir Salabarria-Pena (Centers for Disease Control and Prevention)
  • Maureen Goodenow (Office of the U.S. Global AIDS Coordinator)
  • Lily Asrat (U.S. Agency for International Development)
  • Vienna Nightingale (Department of Defense)
  • Michael Melchior (Peace Corps)

Ways to Successfully Build Evaluation Capacity at Your Agency

This moderated discussion focused on ways participants could work with their leadership, and within the existing structure of their organization, to build evaluation capacity. The session began with short vignettes from the diverse perspectives of the panelists but quickly expanded to address audience questions on topics that included the need to adapt to current trends and needs in monitoring and evaluation, the place of impact evaluations, the importance of data visualization, how to leverage big data, and data collection solutions for surveys with limited resources.

Panelists:

  • Molly Irwin (U.S. Department of Labor)
  • Delia Welsh (Mathematica Policy Research)
  • Alicia Philips Mandeville (Amida Technology Services)
  • Diana Harbison (U.S. Trade and Development Agency)
  • David Yokum (U.S. General Services Administration)

Moderator:

  • Celeste Tarricone Lemrow (U.S. Department of Labor)

Modalities for Collaborative Evaluation of International Organizations and Their Programs

The Multilateral Organisation Performance Assessment Network (MOPAN) is a group of major donor countries that jointly assess the major multilateral organizations they fund. Representatives from the MOPAN Secretariat discussed the opportunities and challenges of managing collaborative evaluations with 19 donor countries and the introduction of a new methodology in 2015, MOPAN 3.0. This new approach increases the number of organizations evaluated in each assessment cycle, broadens the range of organizations evaluated, and includes more countries where organizations operate. It also includes a greater focus on development effectiveness while still evaluating aspects of organizational effectiveness.

Presenters:

  • Brigitte Malenfant (Multilateral Organisation Performance Assessment Network Secretariat)
  • Katie Vanhala (Multilateral Organisation Performance Assessment Network Secretariat)
  • Celeste Lemrow (U.S. Department of Labor)
  • Lili-Marguerite Stern (U.S. Department of Labor)

Innovations in Funding Evaluation and Research

The success and utility of research and evaluations focused on estimating impact often depend on integrating research and evaluation into project design and on collaboration between researchers and implementers. In this session, presenters discussed innovative ways to fund research and evaluations in international development programs. The Abdul Latif Jameel Poverty Action Lab (J-PAL) provided a best-practice model for matchmaking between researchers and implementers. USAID’s Development Innovation Ventures (DIV) offered a tiered, evidence-based approach to applying for funding that initially requires only the submission of a proof-of-concept plan; DIV also offers technical assistance in monitoring and evaluation, technical solutions, social media, financing, and operations. The World Bank’s Development Impact Evaluation (DIME) group described the technical assistance it provides, from the planning phases forward, to enable program implementers to develop and perform impact evaluations of projects and programs.

Presenters:

  • Ben Jaques-Leslie (J-PAL Global)
  • MC Dinh (U.S. Trade and Development Agency)
  • Arianna Legovini (World Bank DIME)

Moderator:

  • Amy R. Ritualo (U.S. Department of Agriculture)

Working Across Agency Lines: A Roundtable Discussion on Collaborating within the USG

With shrinking resources, demands for greater accountability, and programs and projects with overlapping geographic or sector focus, how can USG agencies collaborate on evaluation? In this roundtable discussion, evaluation leaders from across the USG shared experiences and explored the barriers, opportunities, and potential avenues for collaborating across agency lines.

Presenters:

  • Gregory Larson (U.S. Treasury)
  • Diana Harbison (U.S. Trade and Development Agency)
  • Jack Molyneux (U.S. Department of State)
  • Negar Akhavi (U.S. Agency for International Development)
  • Eileen Cronin (U.S. Department of State)

Moderator:

  • Stephanie Shipman (U.S. Government Accountability Office)

Evidence Works!

Bethanne Barnes, special advisor for evidence-based policy at the Office of Management and Budget, gave an overview of OMB’s development of evaluation-related policies within the U.S. Government. She outlined policies intended to support smart, rigorous evaluations and the integration of evaluation results into improved design and implementation of government programs. The discussion included ways to match needs for evidence with appropriate tools, the design tradeoffs between evaluations that assess effectiveness and those that improve program performance, and principles and practices (such as rigor, relevance, independence, transparency, and ethics) that can strengthen evaluation. Demetra Nightingale, the chief evaluation officer at the U.S. Department of Labor, discussed building and implementing an evaluation program, including models for exchanging and strengthening good practices, such as the Common Evidence Framework working group, and efforts to standardize the language used for evaluation across the government.

Presenters:

  • Bethanne Barnes (Office of Management and Budget)
  • Naomi Goldstein (U.S. Department of Health & Human Services)
  • Demetra Nightingale (U.S. Department of Labor)

Evaluability: Experiences and Applications to Private Sector Development Projects at Millennium Challenge Corporation and Inter-American Development Bank

Presenters:

  • Sixto Aquino (Millennium Challenge Corporation)
  • Yuri Soares (Multilateral Investment Fund)
  • Alejandro Pardo Vegezzi (Inter-American Development Bank)

Data Visualization: Tools and Techniques to Enhance Data Use and Drive Impact

As evaluators, we often face the challenge of effectively communicating data to a broad variety of audiences: policy makers, program teams, project beneficiaries, and taxpayers, to name a few. Data visualization is a powerful tool that can help evaluators convey key findings and foster data use to drive impact in federal foreign assistance programs. This session highlighted the importance of data visualization and a few principles of good design that can be put into practice using widely accessible resources.

The first presentation, delivered by Dave Shellard, Program Chair of the American Evaluation Association's Data Visualization and Reporting Topical Interest Group, introduced key concepts and techniques grounded in the science of how humans understand data. Dave Shellard has been at the forefront of setting the data visualization agenda and strategy for the program evaluation community; his interests focus on the effective communication and presentation of data for evaluation and performance measurement projects in resource-constrained environments.

The second part of the session presented insights from the State Department, the Census Bureau, and the U.S. Trade and Development Agency on the challenges and opportunities for using data visualization more broadly within the U.S. Government foreign assistance community. Amin Vafa from the State Department presented an exciting new tool, the F Interagency Network Databank (FIND). Kendra Kintzi from the U.S. Trade and Development Agency shared insights and techniques for creating visualizations on a shoestring budget to drive internal learning and evaluation use. Meade Turner discussed how the Census Bureau has fostered a growing culture of data visualization and demonstrated more advanced techniques for sharing visualizations with external audiences.

The session was designed to impart a few practical tools and ideas that can help evaluators continually strengthen their evaluation practice and deliver better results to partners around the globe.

Presenters:

  • Dave Shellard (American Evaluation Association)
  • Kendra Kintzi (U.S. Trade and Development Agency)
  • Amin Vafa (U.S. Department of State)
  • Z. Meade Turner (U.S. Census Bureau)

Video Presentations:

  • "Creating Good Data Visualization(#dataviz) in a Resource Constrained Setting” In this opening plenary session Dave Shellard, Program Chair of the AEA’s Data Visualization and Reporting Topical Interest Group, highlighted the importance of data visualization and various principles of good design that can be put into practice using widely accessible resources. He illustrated how evaluators can use visual design to effectively communicate data to a variety of audiences and help convey key findings fostering data use to drive impact in federal foreign assistance programs.
  • “Data Visualization on a Shoestring” In this breakout session Kendra Kintzi from the US Trade and Development Agency shared insights and techniques for creating visualizations on a shoestring budget to drive internal learning and evaluation use.
  • “The F Interagency Network Databank (FIND)” In this breakout session Amin Vafa, Policy Analyst, Office of Foreign Assistance Resources at the U.S. Department of State, presented an exciting new tool, the F Interagency Network Databank (FIND).
  • “Data Visualization for Growing Your Audience” In this breakout session, Meade Turner discussed how the U.S. Census Bureau has fostered a growing culture of data visualization and demonstrated more advanced techniques for sharing visualizations with external audiences. This session was designed to impart a few practical tools and ideas that can help evaluators strengthen skill sets that deliver better results to foreign assistance partners around the globe.