Promoting Peace in Burundi through Positive Youth Development

Aug 30, 2021 by Mehreen Farooq
COMMUNITY CONTRIBUTION

Afflicted by nearly two decades of civil war, Burundi, one of the countries with the youngest populations in the world, continues to face challenges in establishing long-term peace and stability within its borders.

In recent months, the security situation has deteriorated, with militants conducting violent ambushes in central Burundi and in the capital city of Bujumbura. A lack of economic opportunities for youth and unequal distribution of resources and wealth are among key drivers of instability. In addition, ethnic divisions between the formerly dominant Tutsi minority and the Hutu majority continue to sow social discord, with young people increasingly at risk of engaging in violence to resolve socio-economic grievances.

To address these challenges, Counterpart International is implementing Turi Kumwe (“We Are Together”), a USAID-funded peer-to-peer peacebuilding activity (2020-2023) grounded in positive youth development (PYD).

In partnership with the local organization Jumelage Jeunesse pour le Bien etre des enfants et des jeunes, we use Village Savings and Loan Associations (VSLAs), a popular mechanism that lets individuals, particularly those without access to formal financial services, save and invest their income in personal or communal activities. To date, 35 VSLAs have been formed with over 1,050 participants in six locations throughout the country.

In addition to learning about income generating activities and savings, our VSLAs are designed to foster social cohesion. Our conflict-sensitive facilitators intentionally include youth from diverse backgrounds so that engagement in the association can foster trust and mutual understanding between different ethnic communities.

Applying our Missing Voice Analysis – Counterpart’s newest political economy analysis tool – we identified historically marginalized populations to include, such as persons living with a disability, youth with albinism, Twa (an indigenous Pygmy ethnic group), single mothers, and internally displaced persons and returnees.

VSLA participants also gain peacebuilding skills such as conflict analysis and management. Through dialogue, participants discover shared values, transform biases, and begin to appreciate differences between identity groups. Over time, the goal is for participants to develop constructive, complex identities that reduce the ‘us-versus-them’ perceptions fueling communal violence.

Within a few short weeks of implementation, the VSLAs are already having a substantial impact, strengthening key PYD domains and giving youth new direction.

“I am unemployed. Before, I would simply wait for my husband to give me some money. With the trainings, I understood I can do something even with a small amount of money. One day, my husband gave me 1,000 BIF (0.51 USD) and instead of buying beer, I bought avocados and resold them for profit on another day when no market takes place, so prices are higher. I did that a few times, then again with sweet potatoes. In one month, my capital increased to 17,000 BIF (8.67 USD). This small activity will help me to find the weekly savings I need for the VSLAs. Then I can take a loan from the VSLA and grow my business even further.”

SPECIOSE NIZIGAMA, 28-YEAR-OLD UNEMPLOYED YOUTH FROM KIREMBA

To ensure that diverse participants – including members of historically marginalized communities – have meaningful engagement in the VSLAs, our approach includes building the capacity of participants prior to key activities. This includes building their confidence and advocacy skills to meet their community’s needs.

Our training includes public speaking modules and hands-on exercises. Over the course of the training, facilitators noted significant improvement in many participants’ ability to speak clearly and coherently compared to the start of the session.

For some youth participants, particularly those affected by trauma, tackling self-doubt was one of the greatest challenges. Left unaddressed, self-doubt can lead young people to withdraw from society and lose confidence in their self-worth and personal ambitions. Our facilitators worked with youth to set personal goals and provided guidance on building their self-esteem.

“After my mother died, my father and mother-in-law started to persecute me. It led me to having bad behaviors and I got pregnant. After I gave birth, my family chased me away and my life became a nightmare. I had no hope and often had suicidal thoughts. When I heard about the Turi Kumwe project, I did not hesitate to register, thinking it was my opportunity to change my life. I was happy to be selected and this training opened my eyes. I understood that my biggest enemy is my own fear. I will now start participating in community meetings and activities and seek advice from those who were successful in their business and life. I am starting to think about my own life project.”

ALINE NIZIGAMA, 23-YEAR-OLD SINGLE MOTHER FROM KIREMBA

Counterpart looks forward to building on these experiences to further strengthen youth competencies so that they can channel their energies into positive actions.

Filed Under: Learning in Action


More Collaboration, Bigger Impact: Digital Co-creation for Women's Empowerment in Egypt

Aug 30, 2021 by Charlotta Sandin and Adriana Abreu-Combs

In essence, international development work aims to effect positive change and achieve impact. To maximize that impact, the United States Agency for International Development (USAID) has been exploring the co-creation method: an approach that emphasizes collaboration among USAID’s many stakeholders throughout the project lifecycle, including at the earliest stages of scoping and planning. The idea represents a new way of doing business that achieves greater impact by intentionally engaging a diverse set of individuals and organizations – especially those with experience on the ground in the spaces and sectors where USAID works – to generate the best solutions to a myriad of development challenges.

USAID’s Mission in Egypt has joined the ranks of other Missions pursuing this high-impact approach, hosting its first-ever co-creation event last month. The session addressed women’s social and economic empowerment in Egypt, which ranks 129th out of 156 countries in the 2021 Global Gender Gap Report across a variety of gender equity measures.

The goals of the co-creation workshop were multifold. It sought to encourage partnerships and alliances among organizations that had submitted concept notes for a proposed USAID-funded Women’s Economic and Social Empowerment activity, so that the agency would be better placed to tackle the multiple dimensions of its work; to facilitate brainstorming and debate on how to achieve scale and impact and to mine the best ideas; and to obtain valuable input from key stakeholders at USAID, the Government of Egypt, and other local organizations with deep knowledge of the Egyptian context.

The workshop broke new ground on many fronts, thanks to the leadership of the Democracy and Governance team at the Mission in close coordination with our QED Egypt team. It was not only the first co-creation event by the Mission, but, due to the constraints of the COVID-19 pandemic, also one that utilized innovative methods of engaging and workshopping virtually. This included hosting participants in virtual speed-dating rooms that broke the ice and allowed participants to connect informally, as well as applying the Shark Tank methodology in which integrated teams that included members from various organizations collaborated to pitch ideas to USAID, government representatives, and participants at large.  

The workshop was a strong demonstration of the Mission at its best – seeking advice from diverse stakeholders, building local capacity, and putting its collaborating, learning, and adapting tenets and adaptive management approach into action to foster buy-in and impact.

The Mission expects final proposals (and the subsequent award) to be strengthened by the findings and discussions of the event. As one participant noted: the co-creation process meant “inspiring ideas and expertise were shared among partners.”

If anything is required for bigger impact in global development, it is inspired approaches guided by deep, local expertise.

Using Digital Survey Tools to Listen to Women

Aug 25, 2021 by Kalkidan Lakew and Eric Kaduru

In the COVID-19 context, where face-to-face surveys are no longer a safe option, digital surveys offer a timely and effective solution. At CARE, we've used mobile surveys, developed new digital qualitative data collection tools, and applied Interactive Voice Response (IVR), a remote survey and messaging mechanism that enables us to poll community members, particularly women, and share critical messages about COVID-19, gender, and safety during the pandemic. Through CARE's Women Respond Initiative, we have reached targeted communities, mostly women, to listen to their experiences in the COVID-19 context.

Below are some of our key lessons and tips for better reaching communities, particularly women, based on our experience in Malawi, Tanzania, Nigeria, Burundi, and Uganda.

 

  1. Be clear about your survey goals: A digital survey cannot accommodate a long questionnaire, so be very clear and specific about what is important for your work and whom you are targeting. Clarity on your survey goal helps you define what platform to use, how to reach your target group, and when and how to execute the survey. For example, if you are trying to understand what women need in low-bandwidth contexts, voice or SMS methods are more likely to be useful than online surveys.
  2. Identify a platform that works for your context: With growing digital service providers, the good news is you have options, but not all of them are a good fit. CARE partnered with Viamo to use interactive voice response in some places, and SMS surveys in others. CARE also developed two innovative digital tools for qualitative data collection called Fatima and Voice App, which allow researchers to conduct, tag, and analyze key informant interviews over the phone and in-person (respectively) in order to more efficiently assess community needs and adapt programs more quickly.  These can work in high- and low-connectivity areas and are free for the respondents. When you are working with women, you need to put them at the center – keep asking if this works for them. Do they own a phone, can they easily follow the instructions, do they have the safety and space to respond? There is no perfect solution for this; you need to be aware of your context and your survey target population when choosing a platform.
  3. Keep your survey short and pretest it: Women take time from their busy day to respond; make sure the survey is not another burden. Unlike a face-to-face survey, it is easy for respondents to hang up or drop off a digital or phone survey. Keep your survey under 15 minutes and between 7 and 10 questions. If your survey is multiple-choice, keep the list of choices limited; from experience, we keep our options precise, usually between six and eight. Don’t just develop a survey and run with it – test it with selected colleagues and community members and get feedback. Is it clear? Is it too long? How can it be improved? You might be in a hurry to launch your survey, but pretesting – including translation – is not a task you want to skip.
  4. Check your database: Women often don’t own their phones. In our surveys, we learned that some savings group members registered the same phone number multiple times; another critical issue arises when a woman registers a phone number owned by her husband or the ‘man of the household.’ Work with your team and your community agents to double-check phone numbers. Ask respondents ahead of time if they have alternative numbers they want to use, or set up a system – such as agreeing on a code when scheduling – to ensure you are speaking to the intended person at a convenient and safe time.
  5. Train, sensitize, and check respondent preferences: Train your staff, partners, and data collectors. Women community leaders are critical in the sensitization process; they reach out to groups and communities to explain the objective of the survey, share their experience, and say when to expect calls. In our surveys in Nigeria, Burundi, Tanzania, Malawi, and Uganda, community agents were vital in informing women about the survey. In Uganda and Tanzania, our team called respondents ahead of the survey, explained the process and objective, and asked whether they wanted to receive a survey. In Burundi, a short SMS was pushed before the actual survey to ask whether respondents wanted to receive and participate in it. For Fatima, phone calls were made in advance to test phone numbers, explain the research, and (if respondents chose to participate) schedule a follow-up call for interviewing.
  6. Prioritize participants’ safety and consent: Participants’ consent is a must; ensure you provide an adequate explanation of the survey’s objective and explain their rights as participants. Participants can choose not to participate initially, and even those who consented at the beginning must have the option to opt out at any point. With remote surveys it is impossible to verify a respondent’s surroundings and safety, but you can take a few steps to make the situation safer. We give people a chance to hang up, call back free of charge, and continue where they left off. At the beginning of all phone surveys, we ask if respondents are alone. We also avoid asking sensitive questions over digital or phone surveys; for instance, questions focusing on Gender-Based Violence (GBV) are not included. At the end of an interview using Fatima, respondents also have the opportunity to reconfirm consent on all or select parts of what was shared.
  7. You need time – be patient but also quick to adapt: If you have a short deadline, a phone survey might be more effective in terms of response rate than a digital survey. In our experience, respondents were keener and more likely to respond to a direct call from our staff, partners, and consultants than to complete a digital survey. We received strong response rates in Burundi, Tanzania, and Uganda, ranging from 30% to 60%, because of detailed planning and intense sensitization. In Nigeria, where we had a 10% response rate to our digital survey, we switched to a phone survey and our response rate rose to 100%. To increase responsiveness to digital surveys, we often pushed the survey multiple times and called respondents to prompt a response, but respondents, particularly women, told us that receiving multiple messages per day does not encourage them to respond. Sending a survey at the right time is more effective than sending it often, so check respondents’ preferred time and willingness to receive a survey – this makes a difference in response rate.
  8. Report back to the women and community: Once you have the data, don’t forget about the women. Share the learnings and findings with them to support their own actions. We have seen positive reactions from women who are part of the sharing process and lead the community dissemination. They have told us that the information enabled them to understand the COVID-19 context, and they are becoming leaders in their communities and groups by sharing learnings. Make sure your survey process is not extractive of the women and community – they want to be part of the process and know the learnings, so go back and share. This is also an opportunity for you and your team to hear more from the women and continue building trust, strengthening relationships, and improving your survey process.
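As an illustration of tips 3 and 4 above, here is a minimal Python sketch of a pre-launch check that warns when a survey runs long and flags phone numbers registered more than once. All function names and thresholds are our own illustrative choices, not part of CARE's or Viamo's actual tooling.

```python
# Illustrative pre-launch checks for a digital survey (hypothetical helper
# code, not any platform's real API).
from collections import Counter

MAX_QUESTIONS = 10   # tip 3: 7-10 questions
MAX_CHOICES = 8      # tip 3: roughly six to eight multiple-choice options
MAX_MINUTES = 15     # tip 3: keep the survey under 15 minutes

def check_survey(questions, seconds_per_question=80):
    """Return warnings for a survey given as [(question_text, [choices]), ...].

    seconds_per_question is an assumed average answer time for estimating
    total duration.
    """
    warnings = []
    if len(questions) > MAX_QUESTIONS:
        warnings.append(
            f"{len(questions)} questions; aim for {MAX_QUESTIONS} or fewer")
    for text, choices in questions:
        if len(choices) > MAX_CHOICES:
            warnings.append(
                f"'{text[:30]}' has {len(choices)} options; trim to {MAX_CHOICES}")
    est_minutes = len(questions) * seconds_per_question / 60
    if est_minutes > MAX_MINUTES:
        warnings.append(
            f"estimated {est_minutes:.0f} min; keep under {MAX_MINUTES}")
    return warnings

def flag_shared_numbers(registrations):
    """Tip 4: flag phone numbers registered by more than one participant,
    so community agents can follow up before launch.

    registrations is a list of (participant_name, phone_number) pairs.
    """
    counts = Counter(phone for _, phone in registrations)
    return {phone: n for phone, n in counts.items() if n > 1}
```

Running `check_survey` on a draft questionnaire before pretesting, and `flag_shared_numbers` on the registration database, turns two of the tips above into a repeatable step rather than a one-off manual review.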

 

Conclusion:

The lessons and tips above do not cover every issue to consider when launching a digital survey that best targets and reaches women. As we continue to gather learnings, we will share additional tips.

 

 

What Can We Learn from Collaborative Redesigns? An Interview with the Cooperative Development Program on their Pilot Approach

Aug 18, 2021 by Emily Varga and Katherine Doyle

In this interview with Emily Varga, Learning Lab learns more about a pilot approach from USAID’s Local, Faith and Transformative Partnerships (LFT) Hub that seeks to identify early learning from implementation to inform adaptive management. The pilot approach, called collaborative redesign, integrates the practice of pause and reflect and uniquely engages local partners as key participants who both contribute to and learn from the pause and reflect and redesign process.

 

This interview accompanies a resource, Collaborative Activity Redesign: Putting Collaborating, Learning, and Adapting (CLA) into Practice, which provides further background on lessons learned through the pilot collaborative redesign process, tips for undertaking this approach, and a sample agenda and timeline. The resource was produced by what was then the Office of Local Sustainability.

 

Emily Varga is the Acting Engagement, Evidence and Learning Team Lead and a Cooperative Business Specialist in the LFT Hub within the Bureau for Development, Democracy and Innovation (DDI). Additionally, Emily serves as the Agreement Officer Representative (AOR) for USAID’s Cooperative Development Program.

 

--------

 

Katherine Doyle, Learning Lab manager: Tell me a little bit about the Cooperative Development Program.

 

Emily Varga: The Cooperative Development Program, or CDP, is a global effort to build and strengthen community-owned and operated businesses: cooperatives and credit unions. Activities address challenges in governance, member equity and capitalization, financial management, market performance, legal and regulatory reform, gender inclusion, and youth engagement. The CDP also emphasizes research, learning, and piloting innovative approaches to development. The program currently operates in 17 countries, in partnership with over 100 cooperatives and credit unions, across the agriculture, finance, health, and energy sectors.

 

Katherine: I understand that the CDP piloted a new approach, called collaborative redesign, to adapt based on early learning from activity implementation. Can you describe that approach, why it was initiated, and how it differs from a mid-term evaluation?

 

Emily: We modeled our entire program through a CLA lens. First, we co-designed all of our activities with U.S. and local cooperative businesses. This enabled us to receive buy-in early in the program cycle and set the tone for shared ownership of the program. Additionally, I believe it shifted the posture of our partnerships with local organizations to create greater equity in an otherwise challenging power dynamic. Following the co-design of activities, we wanted to build in an opportunity to ‘pause and reflect’ early in implementation.

 

We had noted that in past five-year programs, mid-term evaluations were often conducted in the third year, yielding findings and recommendations in the fourth year, with adaptations implemented only in the fifth and final year. At that point, it was too late to make any significant shifts. There was also a need to validate initial assumptions in the activity work plan following start-up, learning from the baseline evaluation, and the signing of MOUs with local partners. So, 1.5 years into our five-year awards, we collaboratively redesigned activities with our nine implementing partners, which included country-level work planning and focus groups with our local cooperatives and credit unions.

 

Katherine: The collaborative redesign process that CDP undertook from February to September in 2020 entailed pausing and reflecting. Where did this take place? What are examples of some of the activities that were assessed? Why was this set of activities a particularly good fit to pilot the collaborative redesign process?

 

Emily: The project was primed well for a collaborative redesign process. Since all our activities were originally co-designed with relevant stakeholders, during our start-up workshop we asked all partners to respond to a series of forward-looking questions that we captured in 1-Year Look Back documents, with the intention of adapting activities. The first phase of the collaborative redesign process actually occurred virtually. This way, we were able to more easily bring partners from different time zones into one “room.” Through nine two-hour sessions, we redesigned activities in 15 countries. We asked all the organizations the same questions: a) what did you learn from the baseline and gender assessments; b) what did you learn from your work planning process; c) what have you learned from Mission engagement; d) what have you learned from local stakeholder engagement; e) have you identified areas for improvement in your original problem statement, your geographic focus, your technical approach, your targets, or your partner selection?

 

The activities for which we conducted collaborative redesign were a great fit because they are under five-year assistance programming and are implemented in lockstep with the evolving needs of local cooperatives and credit unions. For example, one activity builds the capacity of credit unions in Eastern Africa to extend financial products and services to underserved small businesses. Another seeks to modernize processing facilities of agricultural cooperatives in South Asia so that their products can be food safe and ready for export. In both of these examples, partners saw the value of a pause and reflect to ensure that activities were on track with larger, long-term program objectives and to integrate what they had heard from local partners and stakeholders into their work plans, activity monitoring, evaluation and learning (MEL) plans, and other implementation processes. 

 

Katherine: How was pausing and reflecting structured throughout the collaborative redesign process?

 

Emily: We tried to bring together the right people at the right time, and each process looked different based on what the implementing and local partners needed. In the planning phase, we asked our partners who they wanted to be at the table. We asked them to review and reflect on their original Program Description and 1-Year Look Back documents. USAID was often in listening mode for the initial session and played more of a facilitative role in the subsequent sessions to inform adaptation. This process began with an informational session with our implementing partners and their technical staff (6-8 people) who then went on to validate that information with local cooperatives and credit unions (10-15 people). This culminated in a final virtual workshop with the implementing partners and their management teams (5-10 people) in which they finalized their Program Descriptions, revised MEL plans, targets and in some cases, realigned their budgets. 

 

With nine organizations, working in 17 countries, this was a lot to coordinate. As much as we wanted to have the right people in the room, we had to rely on our implementing partners to facilitate offline discussions with their local clients. But, because we have built a culture of learning and trust, we believe that these sessions were held with intentionality and integrity to shape efforts moving forward.  

 

Katherine: Why was it important to CDP to include local partners and other stakeholders in the pause and reflect and collaborative redesign processes? 

 

Emily: Unfortunately, proposals are often drafted in the United States by business development teams, far from the communities that our activities serve. There is a real need to validate our initial plans and assumptions with those closest to the program – whether field staff or the local communities – and to ensure that their perspectives (and realities) are meaningfully reflected in our programming. At the time, the CDP was housed in the Office of Local Sustainability; it is now located in the Local, Faith, and Transformative Partnerships Hub. It is not just good development practice for USAID to take a facilitative role and create space for local partners, clients, and end users to inform the agenda; this approach is also aligned with the vision of our office.

 

Katherine: Were there particularly challenging aspects of bringing many stakeholders ‘into the room’ during the pause and reflect sessions? Did the team make adaptations during the process to address these challenges? If so, what were they?

 

Emily: There were the typical technical difficulties with running virtual sessions, despite conducting dry runs. This required both patience and grace as we identified other means, such as cell phones and WhatsApp, to bring people into these critical discussions. 

 

Katherine: Do you think the pause and reflect approach itself that informed the overall collaborative redesign effort was successful? Did USAID and/or partners get ‘out’ of these sessions or events what they hoped to?

 

Emily: Following the collaborative redesign, we did an After Action Review with our implementing partners. We learned that they found the process to be extremely useful to revisit their original program descriptions and validate their approaches. All partners did make adjustments to their activities in one way or another. Some positive feedback we received from an implementing partner on the collaborative redesign process included: 

 

Our team highly valued the invitation to pause and reflect about our program description, work plans and objectives, resulting not so much in a pivot away from the original but rather a deepening of our commitment to restoring original elements that had gotten lost and some new ideas about how to achieve the original goals. The timing also worked well--late enough in the project so we had good information about what was working and not working, but early enough to be able to make some meaningful mid-course corrections. The extended timeline also allowed us to take time for reflection without having to churn out a revised budget and program description at the same time. One key element of the process was that we felt a lot of trust in the USAID team so we were able to share very transparently. The mutual respect and sense of shared goals allowed for a genuine exchange.

 

We also learned that we could have done a better job communicating expectations at the beginning of the process. Since all of our implementing and local partners were new to this process, many of them wanted to know exactly what it was going to look like. Some constructive feedback we received from an implementing partner on the collaborative redesign process included: 

 

It would have been helpful to have clearer expectations on PowerPoints and additional information on what exactly needed to be provided. The agenda seemed to be over-complicated and we also had a challenge with technology and ended up connecting via cell phones.

 

Katherine: Can you share an example of an applied outcome of the pause and reflect events? Was there a lesson or an outcome from the pause & reflect that was unexpected?

 

Emily: All nine cooperative agreements were modified to reflect the shifts determined in the co-redesign sessions. This meant revising the program descriptions and in some cases, the budgets, to better reflect the direction of the activities moving forward. 

 

Katherine: Is CDP planning to conduct additional collaborative redesign processes in the near future? If so, what are some of the lessons learned that the team hopes to apply?

 

Emily: On a bi-monthly basis, the CDP facilitates learning sessions, known as the CDP 30for30 webinar series, on themes of interest to our implementing partners. Over the course of five years, the CDP will host 30 thirty-minute sessions – half USAID learning and half partner learning. USAID topics have included Mission engagement, gender integration, and subaward compliance; partner topics have included cooperative governance, youth engagement, and cooperative development research. These sessions are meant to be intentional pause and reflect opportunities for our implementing partners so that learning and dissemination are well integrated throughout the life of the project. USAID’s Cooperative Development Program intends to apply the collaborative redesign approach in future activities, and the DDI/LFT Hub often advocates for intentional pause and reflect opportunities in the implementation of USAID assistance activities.

 

Piloting the Locally Led Development Checklist as a Pause & Reflect Tool in Armenia

Aug 18, 2021 by Lusine Hakobyan, Liana Poghosyan, and Katherine Doyle

In this interview, Learning Lab gets the inside scoop from Lusine Hakobyan at USAID/Armenia and Liana Poghosyan from local partner Prisma who, in September 2020, piloted a new pause and reflect tool developed by USAID’s Local, Faith, and Transformative Partnerships (LFT) Hub. The tool, the Locally Led Development Checklist, is “designed to help USAID Missions and partners consider and adopt locally led approaches at every stage of the development process.”

 

USAID/Armenia was interested in how local partners perceived themselves to be empowered throughout the implementation of a research activity informing program design under the Mission’s Local Works program. In this activity, three local organizations conducted rapid assessments to measure local government transparency, responsiveness, and levels of citizen participation in local decision-making in ten Armenian communities. Two grassroots organizations carried out the data collection in the ten communities. Prisma, a local research organization, developed the methodology and analyzed the collected data.

 

To reflect on the depth and extent of local organizations’ engagement, the Mission held a series of 1- to 1.5-hour pause and reflect sessions with each of the three local partners. The Locally Led Development Checklist was used to frame and learn from these sessions. Prisma was involved in the first session, which was facilitated by LFT Hub staff and attended by the USAID/Armenia COR and alternate COR.

 

Lusine Hakobyan is a Development Program Specialist with the Armenia Mission and the Civil Society Activities Lead, as well as the point of contact for the Mission’s Local Works Program. Liana Poghosyan is the Executive Director at Prisma, a local partner and research organization in Armenia. In the following conversation, they reflect on that first pause and reflect session.

 

------

 

Interviewer (Katherine Doyle, Learning Lab Manager): What were the benefits of using the Locally Led Development Checklist tool during the pause and reflect sessions?

 

Liana: The tool was effective for facilitating discussion on the extent of our engagement with USAID throughout the performance of the contract. We reflected on the process of developing and adapting the methodology, and on the challenges, but also the advantages, of engaging local grassroots organizations for data collection.

 

Lusine: The tool triggers focused conversation on the processes, incentives, stakeholders, and different levels of engagement of the actors involved in the process. It also provides some space and time to reflect on things that might have been done differently throughout the implementation.

 

Interviewer: Were there any challenges that the team encountered in using the Checklist? Are there aspects of the pause and reflect series that you would have done differently?

 

Lusine: This is a new tool, and as such, there was a need to clarify some aspects of the questions and to allow time for reflection. The dynamics of the conversation were interesting: slower at the beginning, then more in-depth and extended toward the end of the set time. All three meetings were held online due to COVID-19 restrictions in the country. In general, the tool sets an excellent framework for reflection and for revisiting some of our approaches. The duration of the sessions was adjusted after the first one: more time was allocated for the rest of the sessions.

 

Liana: We didn’t face any challenges: the facilitation was smooth and the discussion open. All parties were willing to talk about challenges and learn from the process; this was a safe space for reflection and insight.

 

Interviewer: How was space created to be inclusive of the various voices “in the room” at the pause and reflect sessions? 

 

Lusine: The facilitators asked the questions, clarified them when needed, and then provided sufficient time for reflection and discussion, in line with best facilitation practices. I offered my perspective only after the partners had shared theirs, and this generated more discussion. During the sessions with the grassroots organizations, I opened the meeting with an introduction and the purpose of the session, setting the tone by saying that there were no “right or wrong” answers, that we treated this as a learning exercise, and that we were interested in doing things differently by engaging them more. I facilitated the sessions and, following the Washington colleagues’ style, provided more time and space for the organizations to reflect and express themselves.

 

Liana: It was a while ago, but I remember a very open and friendly discussion. We were free to express our views about the process and it felt good knowing that apart from delivering our professional services, we are also contributing to a learning process. 

 

Interviewer: We often hear that there are “power dynamics” at play when USAID staff and partner staff convene. Partner staff may feel less empowered to share their perspectives or opinions openly, or USAID staff may - intentionally or unintentionally - drive the conversation in a specific direction. Were power dynamics addressed to alleviate such potential challenges? If so, did this work well/not well?

 

Liana: There was absolutely no pressure on what to say or not to say before, during, or after the sessions. It was actually empowering to know that the opinion of a research company contracted for an assessment matters for the organizational learning of an organization like USAID.

 

Lusine: Because the modality of working with the local partners had been very cooperative throughout the process, there was no pressure during the session either. We “agreed to disagree” (while staying, of course, within the frame of the contract provisions) and set the tone of working collaboratively during the development of the methodology and training.

 

Interviewer: What key takeaways did using the Locally Led Development Checklist reveal to both USAID staff and partner staff?

 

Liana: The contract for methodology development and analysis for the Rapid Assessment in the communities was Prisma’s first experience working directly with USAID. It was interesting to locate our engagement on the Locally Led Development Spectrum and visualize how closely engaged we were throughout our work. This was essential considering the tough time we were operating in: first the COVID-19 pandemic forced us to redesign the assessment, then the war caused delays and created barriers to phone interviews. But collaboratively we were able to find solutions to all of these challenges.

 

Lusine: Intentional and early engagement with implementing partners takes time but yields better development outcomes: partners know the local context and bring a nuanced understanding of the processes, actors, and relationships involved, which helps USAID plan accordingly and adapt easily, while the partners themselves feel engaged and gain a sense of ownership.

 

Different implementing mechanisms allow for different levels of engagement, so this should be considered while planning and implementing locally led development activities.

 

Interviewer: What were the outcomes of this series of pause and reflect sessions? What lessons did the team learn and what adaptations, if any, were identified?

 

Lusine: The takeaway was that USAID had pretty close engagement with all local organizations and stakeholders at every level, depending on their role in the process. All lessons learned, including early engagement with partners, co-creation opportunities, and the mechanisms used, will be incorporated into the upcoming activities of Local Works programming in Armenia. We couldn’t change anything at that point, as the activities were over. Another takeaway was that partnerships formed regardless of the mechanism used (including purchase orders, which may not seem to provide much space for collaboration). Local organizations and companies are very open to learning. When it’s time for the core Local Works activity, we will plan and budget for pause and reflect sessions so we can incorporate lessons learned during the implementation of the activity.

 

Liana: There is always some place and time for closer engagement and learning with USAID, regardless of the mechanism. 

 

Interviewer: How did the conditions of COVID-19 change how the pause and reflect sessions were conducted and/or how the LLD Checklist was used? There are a lot of challenges to doing collaborative work remotely, but were there any surprise benefits that came out of adapting to the situation?

 

Lusine: The session was conducted in September 2020, so all participants were used to working online, knew the “code of conduct,” and had experience with online events and the tools at their disposal. So no surprises here. In fact, these meetings might have been delayed if not for the COVID restrictions and the online tools and skills they pushed us to adopt for engaging with partners in different countries.

 

Liana: All participants had already mastered the shift to the online space, and in general, the online modus operandi provides an opportunity to reach beyond borders, as was the case for this pause and reflect session.

 

Interviewer: Would you recommend the Locally Led Development Checklist to colleagues at USAID and partner staff? Why or why not? Are there any tips you would suggest to others using it for the first time?

 

Liana: The reflection sessions are also important for partners and service providers. During the session I remember noting that, due to the specific arrangement made between the partners for Local Works in Armenia (in particular, the work was divided between one for-profit specialized organization and two grassroots NGOs), all sides enriched their experience of partnership and created linkages that could be utilized in the future as well. Soon after this session, another project opportunity arose, and Prisma was invited to join a consortium by one of the data collection organizations we met through Local Works.

 

Lusine: Definitely, I would recommend it as a pause and reflect tool. These sessions were important and helpful. Very often we need to deliver and rush for results, and we lack the time and opportunity to stop and reflect on our path and approaches. Tips: be patient and, depending on the scope of the activity, allow some time for reflection. Consider doing it at the midpoint of a project so you will have time to adapt.


Interested in trying out the Locally Led Development Checklist? Find the Fact Sheet, Facilitation Guide, and Worksheet here!

Knowledge Lives in People: Using 'Connect and Reflect' to Learn From Each Other

Aug 9, 2021 by Laura Ahearn and Eric Keys, USAID Digital Strategy Team Comments (0)
COMMUNITY CONTRIBUTION

Effective knowledge sharing is critical to the successful functioning of a complex organization like USAID. In the Innovation, Technology, and Research (ITR) Hub (formerly the Global Development Lab) in the Development, Democracy, and Innovation (DDI) Bureau, we strive to be innovative and agile, and like all of our Agency colleagues, we are highly motivated to achieve successful development outcomes. Innovation, agility, and evidence-driven change make it essential that knowledge be managed and shared in a timely, efficient way. 

In this essay we describe Connect & Reflect, an activity that has been both easy to organize and extremely helpful in ensuring that knowledge is shared across ITR’s Technology (T) Division as staff members implement USAID’s Digital Strategy. Launched in April 2020, the Strategy features 16 initiatives, including both inward- and outward-facing digital development efforts. Between 55 and 60 staff members work on implementing various aspects of the Digital Strategy, primarily, though not wholly, based in ITR/T, and they are linked to others in the Agency and beyond. All of these people possess the technical, institutional, and programmatic knowledge that enables them to implement the Strategy, and this knowledge, when shared, can be very useful for others’ work as well.

In essence, useful knowledge lives inside the heads of the staff implementing the Digital Strategy, posing a serious knowledge management challenge.

Recognizing this and other challenges, the ITR/T leadership invested in a developmental evaluation (DE) through the Developmental Evaluation Pilot Activity (DEPA-MERL) mechanism to support collaborating, learning, and adapting (CLA) as the Digital Strategy is implemented. To improve knowledge sharing, the DE instituted weekly 30-minute Connect & Reflect sessions for Digital Strategy initiative leads, staff, and other interested parties. (A list of similar approaches can be found on Learning Lab.) The assumption underlying these sessions is that USAID is an organization in which knowledge is transmitted through social networks. Even in (or especially in) a remote context it is important to recognize the social nature of information sharing. The Connect & Reflect sessions provide a venue for Strategy actors to come together to share knowledge and identify collaborative opportunities.

Central to the success of these sessions has been (1) their light lift (no preparation required for participants, and very little required for session leaders); (2) their focus on interactivity through brainstorming activities on virtual whiteboards such as Jamboard, Miro, and Mural; and (3) the short (30-minute) but regular cadence of the sessions. Each week, a Digital Strategy lead chooses the session’s topic and objective and facilitates a discussion with peers. Connect & Reflect sessions have included status updates, peer assists, Ask Me Anything segments, and collective tasks, but as varied as they have been, most have used the same format:

  • The developmental evaluator introduces the leader for each session and invites everyone to stay on afterward if they would like to participate in a mini-After Action Review (mini-AAR) on the session (1-2 minutes).

  • The leader of the session presents an issue, topic, update, or challenge for the group to respond to (5-8 minutes).

  • The group uses Jamboard, Miro, or another virtual whiteboard to answer the facilitator’s questions or provide the requested feedback silently (10 minutes).

  • The facilitator reconvenes the group to respond to some of the written comments and arrive at next steps (6-8 minutes).

  • The developmental evaluator closes the session and previews the next week’s topic (1-2 minutes).

  • The session leader, facilitator, and any other participants who are interested meet to confirm the next steps and conduct a mini-AAR.

 

While the Connect & Reflect sessions occupy only 30 minutes of individual staff time, they are effort multipliers. Typically between 20 and 30 staff attend the sessions, resulting in 10 to 15 total hours of effort put into sharing information, solving problems, and socializing work. The Connect & Reflect sessions are thus intense spaces for knowledge sharing and collaboration. Taking place on Friday mornings, before most staff have begun to carry out their more intensive work for the day, Connect & Reflect sessions uncover opportunities to collaborate and draw on the brain power of everyone assembled. 

These sessions have been well received and have resulted in crowd-sourced, actionable advice and input. For example:

  • The session on the new Digital Technology and Science & Innovation Key Issue Narratives resulted in an analysis of the narratives that generated insights useful to many ITR/T staff and even to leadership beyond the Agency.

  • The session on plans to constitute a Youth Digital Leadership Council resulted in valuable input from attendees on the content of the Council sessions and volunteers from across the T Division to co-facilitate them.

  • The session on the review process for new Digital Development training resulted in a revamped communication plan to keep Digital Strategy staff updated on the progress of different modules and expectations for timely feedback.

     

Knowledge sharing is a challenge across USAID. With very little time or effort, Connect & Reflect enables the Technology Division to learn about one another’s work, collaborate more effectively, and draw on the technical expertise of others, thereby socializing knowledge across the Division and beyond. 

 

-----

 

Laura Ahearn is a Technical Director with Social Impact. She is currently conducting a developmental evaluation of the implementation of USAID's Digital Strategy. Previously, she was part of the LEARN contract and worked on the DRG Learning Agenda, the Program Cycle Learning Agenda, and the Self-Reliance Learning Agenda. She has a Ph.D. in Anthropology from the University of Michigan and was a tenured faculty member at Rutgers University.

 

Eric Keys is the Digital Strategy Monitoring, Evaluation, and Learning Specialist. Prior to joining USAID he carried out various strategic and evaluation duties at the Department of State, the Department of Defense, and the National Science Foundation, and worked as an applied researcher in Africa and Latin America on agricultural, climate change, and natural resource management interactions. He holds a PhD in Geography from Clark University and was a faculty member at the University of Florida and Arizona State University. 

Strengthening Knowledge Retention and Transfer During Staff Transitions

Aug 5, 2021 by Adrián Rivera-Reyes Comments (0)
COMMUNITY CONTRIBUTION

USAID’s workforce is highly mobile, with Foreign Service staff in particular transitioning fairly frequently from post to post. Yet the Agency lacks a systematic process for capturing and transferring knowledge from outgoing staff to incoming staff. Consequently, during staff transitions, knowledge of our programs and their histories, underlying rationales, and relevant relationships is lost, often leading to systems being reinvented and decisions overturned. The result is a loss of programmatic momentum, disruption of key relationships, and confusion and anxiety among staff, particularly Foreign Service Nationals, whose deep technical and contextual knowledge is often overlooked. 

 

In late 2019, USAID/South Sudan reached out to the new Agency Knowledge Management and Organizational Learning (KMOL) Team to collaborate on developing a pilot model to address this problem. Throughout 2020, the KMOL Team worked closely with USAID/South Sudan to co-create processes and tools for knowledge retention and transfer that ease staff transitions and retain the knowledge that is one of the Agency’s most valuable assets. The Knowledge Retention and Transfer model includes processes, tools, and implementation plans (organized by function and against the mission calendar) to support the efficient and effective transfer of knowledge from outgoing staff to their successors, and from Foreign Service Nationals to newcomers. The model aims to help USAID missions institutionalize effective knowledge retention throughout the cycle of staff transition and, in turn, improve productivity and momentum, reduce stress, and give incoming staff and new hires the tools and information they need to start contributing to Mission objectives right away. 

 

Following the co-creation process with the South Sudan mission, the KMOL team convened a Peer Network of nine USAID missions to beta test the model. Network members include: Armenia, Azerbaijan, Egypt, Ethiopia, Guatemala, Kenya and East Africa, South Sudan, Southern Africa Regional, and Vietnam. Members of this peer network are currently testing and refining the knowledge retention and transfer model and toolkit in their respective missions, sharing their learning with each other, and informing revisions that the KMOL Team will make to the overall model and toolkit.

 

Through supporting the Peer Network, the KMOL Team has learned that while each mission faces circumstances specific to its own context, the overarching issues across missions are very similar. Peer Network member missions participate in monthly peer learning events facilitated by the Agency KMOL Team and designed to help mission staff customize and apply the model in their contexts. The knowledge sharing that takes place within the network speeds iterative improvements in the model’s tools and processes, as missions share their experiences and ideas, and help each other troubleshoot and adapt.

 

The Peer Network will continue to meet and further test and refine the model until its conclusion in November 2021. Please stay tuned for more information on next steps in scaling this effort in 2022. 

--------

To connect with us, please email [email protected]

A New Study Highlights 10 Years of Developmental Evaluation at USAID

Aug 5, 2021 by Chris Thompson Comments (0)
COMMUNITY CONTRIBUTION

The spread of new monitoring, evaluation, and learning (MEL) approaches reflects the uptake process for research findings in general: it takes time, intention, and effort for knowledge to be disseminated, adopted, and used.

A new study by USAID/Indonesia’s Developmental Evaluation (DE) for USAID Jalin highlights the Agency’s own decade-long process of applying DE, a relatively new approach to utilization-focused evaluation.

This study found that, between Dr. Michael Quinn Patton’s publication of his landmark book Developmental Evaluation in 2010[1] and USAID’s inclusion of DE as a type of performance evaluation in its Operational Policy for the Program Cycle (ADS 201) in 2020, the Agency conducted 14 DEs at a cumulative cost of approximately $8-10 million.[2]

While the study elaborates on these DEs’ duration, budget, structure, and general trends, four overarching themes are apparent:

1.     USAID is using DE to operationalize CLA. Because DE strengthens programs by enhancing adaptation, in part by embedding evaluators within projects, it can serve as a practical approach to implementing CLA. This is especially true for improving internal and external collaboration, expanding a technical evidence base, and informing decisions about programmatic adaptations.

This study suggests that both Missions and Washington-based Operating Units are turning to DE to operationalize these CLA concepts in a variety of contexts. In fact, the 14 USAID DEs since 2010 supported eight sectors. Moreover, seven DEs were country-specific, four were Washington-based, and three were global.

2.     USAID’s use of DE has increased over time. While USAID conducted two DEs between 2011 and 2015, it conducted 12 between 2016 and 2020. The study found that this increase coincides with a rising interest in this new MEL approach within the Agency, as evidenced by the commencement of DEPA-MERL in 2015 and other promotional efforts. It also reflects the uptake process for new ideas and practices in general as they diffuse and gain buy-in over time.

3.     DE design options exist to overcome common challenges. The study found that two challenges to USAID conducting DE were their perceived cost and overlap with other activities’ MEL. Yet, this need not be the case. The Innovation for Change DE was one of the longest and also one of the cheapest because it used a part-time embedded evaluator. Furthermore, during the DE for USAID Jalin, USAID, the Jalin project, and the DE divvied up MEL responsibilities to avoid duplicating efforts. These examples suggest that use of DE at USAID can grow if DEs are designed responsively to their objectives, contexts, stakeholders, and available resources.

4.     The evidence base on DE in USAID is growing. This study gathered data by surveying and interviewing 75 people at USAID and implementing partners, and by querying three listservs (two at USAID and Dr. Quinn Patton’s Blue Marble Network) with a combined 1,920 members. It also reviewed resources at the USAID Development Experience Clearinghouse and the American Evaluation Association. Taken altogether, these knowledgeable practitioners and accessible resources constitute a robust body of evidence from which to continue growing DE use at USAID.

Because DE is a utilization-focused approach to evaluation, USAID’s ongoing uptake is an important means for the Agency to meet its performance goals (PG), such as PG 4.1.1: Increase the Use of Evidence to Inform Decisions.

And, though this study found positive trends in USAID’s use of DE, room for growth exists. USAID commissions an average of 200 evaluations per year, totaling more than 1,100 evaluations since 2011. Only a tiny portion of these have been DEs, which underscores the study’s greatest value: raising awareness of USAID’s own journey of adopting and applying a new MEL option that operationalizes CLA. After all, to get to where you want to go, you need to know where you are and where you came from.  

 

------

 

About the Author: Chris Thompson is Chief of Party of Social Impact’s DE for USAID Jalin. His current work focuses on informing decision-making processes to accelerate maternal and neonatal mortality reduction in Indonesia.

Acknowledgments: This study would not have been possible without the support of USAID/Indonesia; the hard work of USAID DEPA-MERL, especially Danielle de Garcia, Sierra Frischknecht, Dominick Margiotta, and Felipe Rangel; the involvement of Dr. Michael Quinn Patton, Charmagne Campbell Patton, and the Blue Marble Evaluation Group; and the information shared by all those who participated.



[1] While the 2010 publication of Developmental Evaluation represented DE’s public debut, evaluators had been discussing and implementing DE at least as early as Dr. Quinn Patton’s article “Developmental Evaluation” in the journal Evaluation Practice in 1994. Furthermore, it is quite possible that evaluators at USAID acted developmentally before 2010, and even before 1994, even if they did not call their evaluations “DE.”

[2] This assessment identified an evaluation as a DE if: 1) the evaluator is embedded or closely connected with a program team; 2) the scope supports adaptive management and addresses complexity; and 3) the scope is not tied to summative evaluation questions. However, DE can be called adaptive evaluation, real-time evaluation, or emergent evaluation, and it is possible that this assessment may have omitted some DEs due to its limited time and resources. We welcome you sharing your DE or similar initiative by contacting [email protected].

Introducing Learning Lab's August Theme: Pause & Reflect!

Aug 2, 2021 by Learning Lab Team Comments (0)
COMMUNITY CONTRIBUTION

The second themed month of our Month and Learn series at Learning Lab is Pause and Reflect. This is an intriguing phrase: action-oriented and, as one interviewee in our new podcast on the topic suggests, “kind of intuitive.” Yet it is also wide open to interpretation, flexible enough to include any number of actions.

 

Pause and reflect might be recognizable by other names, such as organizational stocktaking, reflective learning, big picture reflection, progress check, annual check-in, etc. In general terms, it can be thought of as intentionally taking time to stop and think about a practice, a process, an activity, or a dynamic and then apply the takeaways to adapt going forward. 

 

Whether a pause and reflect moment is a solo or group effort, how it is informally or formally structured, how feedback is solicited, and what decisions are reached are all adaptable variables that can be adjusted to fit your particular needs. We recommend listening to this new introductory podcast, which provides a broad look, from many perspectives, at what pausing and reflecting means, why it is important, and how to make it effective.

 

 

As Fall approaches, we know we can look forward to portfolio reviews and midcourse stocktakings, end-of-fiscal-year reviews, perhaps even the integration of new team members - these are all perfect examples of opportunities for us to stop, think, and apply new knowledge to improve outcomes. At Learning Lab, we are looking at ways to incorporate more pause and reflect practices into our work. After significant launches or campaign wrap-ups, we host after action reviews to monitor our progress and advance our collective efforts the next time around. The rapid after action review template is now a go-to resource to help us structure our sessions and document feedback. 

To inspire our colleagues and partners to commit to pause and reflect practices in the coming months and all year round, we are sharing exciting new content and revisiting classics on Learning Lab. To start, we expanded the CLA Toolkit to include a brand new cluster on Pause and Reflect, which includes resources and tools related to the topic!

A roadmap of the rest of the month looks like this:

 

Week 1: Introduction to Pause & Reflect

Week 2: Pause & Reflect in Practice

Week 3: Inclusive Pause & Reflect

Week 4: Pause & Reflect Driving Change

 

Any exploration of what Pause and Reflect is in the USAID context should avoid prescription while simultaneously deepening our appreciation for what it can be. In this vein, we’re especially excited to collaborate with colleagues in USAID’s Local, Faith, and Transformative Partnerships (LFT) Hub in the Bureau for Development, Democracy and Innovation (DDI) during Week Three to showcase new content around inclusive pause and reflect and how local stakeholders’ and partners’ knowledge and feedback can be incorporated into all stages of development programming.

Check out the new CLA Toolkit cluster, our podcast, Fresh Perspectives: An Introduction to Pause and Reflect at USAID, and follow us on Twitter at @USAIDLearning as we highlight new and classic content throughout the month. As always, reach out to [email protected] to share your thoughts, ideas, resources and questions.

 

Piloting CLA Awards in The Philippines

Jul 26, 2021 by Martin Nañawa, Dr. Mary Ann Lansang, Reno Nalda Comments (0)
COMMUNITY CONTRIBUTION

Pilot builds on CLA best practices

Panagora Group holds the USAID CLAimHealth contract, which supports the Agency in effectively implementing its health portfolio in the Philippines through high-quality monitoring and evaluation data, continuous learning, and adaptive management.

Following four years of robust activities to raise awareness and build capacity on approaches to integrating CLA among health portfolio IPs in the Philippines, the CLAimHealth team supported the USAID/Philippines Office of Health (OH) in launching the Mission’s first CLA Awards competition. The competition both stimulated the capture of local approaches to improving development programming through collaborating, learning, and adapting, and served as a barometer of CLA internalization and of the OH and CLAimHealth team’s efforts to elevate CLA culture. Thus, the Philippines CLA Awards program was born. The team solicited and reviewed entries between October 2020 and January 2021, and a panel of judges selected finalists and an award winner in April 2021.

Award results show room for growth in CLA in the Philippines

There were seven entries in total, with one implementing partner (IP) submitting four of them. This suggests that a few partners have more confidence in their CLA efforts and/or the absorptive capacity to develop entries on top of their daily remit for the Office of Health. The partner that submitted multiple entries had planned to resubmit them to the global CLA Case Competition, another indicator of strong commitment to CLA. This tells us that some IPs are either more advanced in their internalization of CLA or keen to establish a leadership position on CLA within the IP community.

The USAID/Philippines OH and the CLAimHealth team expect to see more entries as we gear up to repeat this activity in the fall of 2021 as a means to reinforce institutional learning and overall project performance. Overall, the annual recurrence of the CLA Awards should contribute, if indirectly, to advancing USAID Health Project goals.

Top five tips for others trying something similar

While we are still learning from this exercise, we put together the following lessons learned for others to follow in the future:

  1. Secure buy-in from USAID. Since our team conducted this activity on USAID’s behalf, we found it important to have a shared vision with the AORs on the goals of the CLA case competition and how it supports their own objectives. Half the panel of judges came from USAID, while the other half came from our roster of CLA Champions, a group of prominent health professionals we assembled as a resource for USAID CLA activities. Their prominence as academics or former health secretaries strengthens the profile of the activities we host for USAID. We also invited a judge from Panagora to represent the organization’s leadership role in CLA practice.
  2. Clearly communicate information. Our team prepared a comprehensive digital leaflet, which we disseminated via the OH’s Knowledge Library, a knowledge portal for USAID and IPs, and “e-cards” to remind partners of key dates.
  3. Help partners with their learning curve. Like us, many will be doing this for the first time. To prepare IPs for the competition and encourage participation, we invested in a learning session in December 2020, which provided resources and examples to build confidence among partners.
  4. Devise a way to track participation. All the entries for the CLA Awards were submitted on the final day before the deadline. As a result, we were unsure just how many entries (if any) would come in until the contest closed. In hindsight, requiring registration to join the contest would have been a simple way to estimate participation and prepare for next steps.
  5. Develop a visibility package for the winner. Visibility and prestige are the main prizes for CLA case competition participants. For these CLA Awards, our team worked with the Office of Health to secure commitments to publish CLA Awards content across its various visibility channels. We then produced the content for USAID and submitted the package for publication. One challenge to anticipate is that your activity may not be able to guarantee a publication schedule once materials have been handed over to the client; even the Office of Health has to follow a content schedule set by USAID’s digital and social media managers.

 Summary of learnings

The USAID/Philippines Office of Health and the CLAimHealth team took a risk by replicating USAID’s CLA Case Competition locally. As a learning organization, however, we have also reflected on lessons we can carry forward into the next CLA Awards competition, such as identifying contingencies in case of extremely low participation or needed improvements to the mechanics. We look to innovate the parameters of the CLA Awards next year to provide new challenges and to home in on USAID’s ultimate goal of moving the needle on Health Project investments.

Have you experimented with CLA or other case competitions in your Mission or bureau? What lessons are you carrying forward? 
