Three Insights on Activity Monitoring, Evaluation & Learning Plans

Aug 26, 2020, by Monalisa Salib
COMMUNITY CONTRIBUTION

Monalisa Salib is Social Impact's Senior Director for Organizational Learning on USAID/Learns.

Under the USAID/Vietnam Learns contract, Social Impact recently updated its Activity Monitoring, Evaluation & Learning Plan (AMELP). The exercise reinforced insights on AMELPs that we have long promoted with USAID and implementing partners (IPs), and that can be applied to a wide variety of Activities.

1. Think of the AMELP as a critical management tool, not just a Monitoring & Evaluation tool.

While the AMELP is indeed a MEL tool (it is in the name, after all), it is more accurate to consider the AMELP a management tool akin to, or at the same level of importance as, the work plan.

This is critical because the work plan often lists interventions without reference to the bigger picture. Without an AMELP, the implementer and USAID do not articulate their higher-level results or how they will know if they have achieved them. By framing and using an AMELP as a management tool, teams ensure they are focused on achieving results and avoid having AMELP development relegated only to MEL staff.

2. There are certain questions we should all be asking ourselves as we implement programs.

In the Learns AMELP, we designed our MEL activities based on key learning questions. Here is our list of learning questions that are relevant to every AMELP and why they are so important:

Learning Priorities for Management and Overall Effectiveness (and why each is important)

Progress towards results: Is the Activity achieving intended results and outcomes? Why or why not?

Because the Activity is designed to create some sort of change, AMELPs need to clearly articulate how partners will track whether that change is occurring. Knowing why (or why not) the change is occurring helps us make adjustments or replicate success.

Negative consequences: Have there been any unintended negative consequences because of the Activity?

Humanitarian assistance professionals are well-versed in the principle of "Do No Harm." In development programs, however, this principle is often applied less deliberately, despite the real possibility of negative consequences. We need to be intentional about finding out whether we are doing harm so we can adapt if necessary.

Context shifts: How are shifts in context affecting the Activity's ability to achieve results or creating new opportunities for impact?

Though many implementers regularly monitor context, they are rarely intentional in planning for it. While everyone has now been forced to monitor context due to COVID-19, there have always been other shifts that affect our ability to achieve results.[1]

Feedback from end users: What feedback do program participants and end users have on our performance?

Our programs are meant to impact other people’s lives, but it is relatively common to review an AMELP that has no approach for receiving feedback from end users. Implementers often get this information informally, but we should aim to be purposeful and have mechanisms to both capture this information and act on it.[2]

 

3. Focus your remaining learning priorities on your theory of change, particularly areas where you are less confident.

Given limited time and resources, IPs should focus their attention on key questions stemming from their theories of change (ToCs) and their relative confidence in the causal links within those ToCs. For example: if you do x, how confident are you that it will work and get you to y? Areas where confidence is lower are ripe for learning. You do not want to wait until a final evaluation to discover that an approach does not work.

We know many partners are currently revisiting their work plans (and, we hope, their AMELPs) with the start of a new fiscal year. I hope these insights help you, and I welcome others' insights on your experiences using AMELPs.



[1] For more on context monitoring, see the USAID Monitoring Toolkit.

[2] As a further resource, Feedback Labs does excellent work on creating and using feedback loops.

 

This post originally appeared on the Social Impact blog.
