We Got Your Feedback: What Evidence is Most Useful to You?
Kat Haugh is a Monitoring, Evaluation, Research and Learning Associate on the USAID LEARN contract.
Recipients of the USAID Learning Lab newsletter may remember that we sent out a poll in July 2017 to align our annual literature review update with USAID Learning Lab users’ interest in and need for evidence. Now that the literature review update has been released, we are ready to share the results of the poll and how they shaped the design and rollout of the update.
Our Questions and Your Responses (n = 207):
1. I am most interested in learning more about the relationship between … (Select up to THREE)
- Monitoring, evaluation, & learning and program performance - 52.66%
- Knowledge management and program performance - 43.96%
- Adaptive management and sustainable development - 41.06%
- Locally-led approaches and implementation success - 39.13%
- Leadership and the creation of a learning organization - 35.27%
- Strategic collaboration and organizational effectiveness - 32.37%
- Reflection and improved performance - 18.36%
- Individual traits and the ability to adapt based on learning - 12.56%
- Trust and team performance - 12.08%
- Other - 10.14%
The “Other” responses included:
- How to communicate learning in a meaningful way to stakeholders
- Local business initiatives development
- Strategic results and budget allocation
- Training of staff and outcomes in the field
- Specifically how knowledge management can improve results-based management
- Designing for adaptiveness in awards/logic models
2. If I were to receive information on the topics selected above, I would use that information to… _________________ (OPEN ENDED)
Most respondents wrote about how they would use the evidence to increase individual and team efficiency and effectiveness. They are looking for practical processes and procedures that they can use to operationalize evidence in support of CLA.
The word cloud above displays the frequency of each word in the responses. The larger the word, the more frequently it was used.
More broadly, respondents said that they would (in order of most frequently mentioned to least):
- Apply to organization/projects (Increase their capacities and skills to improve their office’s learning culture, work relationships, and work environment)
- Apply individually (Apply the evidence themselves to improve performance, grow professionally, and excel in their careers)
- Share with others/advocate for funding (Share with organizational leadership, stakeholders, and program staff; advocate for funding for key factors that lead to development success)
- Update existing OD tools based on new research
Some specific ways they said they would use the evidence to increase individual and team efficiency and effectiveness:
- Drive learning, leadership and knowledge management within my institution and among my network partners.
- Be better able to build trust within the team
- Develop a knowledge management system that incorporates processes to capture and store knowledge across the program. The final part would be producing knowledge products or formats that enable knowledge to be shared across the wider development community.
- Lobby for more evidence-based flexible adaptive programming in funded programmes
- Make my own team more reflective and actually use the M&E data we generate.
- Improve the systematic gathering, retention, and processing of lessons learned from both within and outside of our organization, and second, establish better co-learning, collaborative, and strategic partnerships
- Design additional or better sustainability measures into our programs, use MEL data more effectively, and better utilize the observations coming from reflection to improve program performance.
- Advocate for greater attention to and investments in the factors found to be most supportive of better development outcomes
- Improve my work performance and better help the organization achieve its goals and objectives
These findings shaped the fall 2017 update to our CLA literature review in two key ways. First, while conducting the literature review, we made sure to specifically focus on evidence that you said was most useful to you. We know from the literature and our own experience that evidence-informed decision making is most likely to occur when decision makers themselves demand, define, and interpret the evidence. So, we followed your lead. For example, you shared that you found evidence on the relationship between monitoring, evaluation, learning, and program performance to be most useful. As a result, we prioritized gathering evidence on that topic and others that matter most to you.
Second, when asked how you would use the evidence, you shared that you would apply it individually, apply it within your organization, and share it with others to advocate for CLA approaches and practices. As a result, we aimed to make the evidence as actionable as possible by including specific recommendations from studies, making it easy to understand, and linking the evidence to practices outlined in the CLA Toolkit.
Find the literature review update here and let us know what you think!