How do Organizations Integrate Learning into their Daily Work?

Jul 10, 2018 by Piers Bocock, Stacey Young

Episode 6 of Leaders in Learning focuses on how organizations are integrating intentional learning into their day-to-day work. The three leaders contributing to this episode are:

  • Karen Mokate, Chief of Knowledge Management at the InterAmerican Development Bank (IDB)
  • Clive Martlew, Lead for Leadership and Learning at the UK’s Department for International Development (DFID)
  • Alison Evans, Chief Commissioner for the UK’s Independent Commission for Aid Impact (ICAI)

The three themes that emerged during the conversation are:

  1. Approaches need to be systematic, intentional and resourced.
  2. Learning and KM approaches will only become sustainably integrated if there is a supportive culture. 
  3. Sustainability depends on clear and effective processes and tools.

As always, after listening to this episode, we invite you to use the comments section below to share your experiences and examples.

And be sure to tune into the final episode of the series, in which we talk about the future of organizational learning in international development.

You can stream new episodes here on USAID Learning Lab or search for “USAID Learning Lab” wherever you listen to podcasts.

What is the Role of Formal and Informal Leadership in Organizational Learning?

Jul 3, 2018 by Piers Bocock, Stacey Young

The focus of the fifth episode in the Leaders in Learning series is one of the most challenging questions we routinely face: what is the role of formal and informal leadership in creating a learning organization? This episode builds on earlier ones by connecting leadership to effective learning organizations, a link we see consistently.

The Leaders in Learning featured in this episode are:

  • Thom Sinclair, Gateway Academy Team Lead at the Consultative Group to Assist the Poor (CGAP), housed at the World Bank
  • Rob Cartridge, Head of Global Knowledge at Practical Action
  • Chris Collison, a leading independent KM and OL consultant

The themes we discuss in this episode are:

  1. Formal leadership support
  2. Informal leadership support
  3. Characteristics of supportive learning leaders

Tune in to episode 6, when we discuss the integration of learning practices and approaches into day-to-day work processes, with Karen Mokate, Clive Martlew, and Alison Evans.

You can stream new episodes here on USAID Learning Lab or search for “USAID Learning Lab” wherever you listen to podcasts.

Special Announcement: New Multi-donor Partnership on Organizational Learning for Development

Jun 27, 2018 by Clive Martlew, Piers Bocock

On Friday, June 15, at the National Press Club, Darren Welch, Director of Strategy at the UK’s Department for International Development (DFID), and Susan Fine, USAID’s Assistant to the Administrator for Policy, Planning and Learning, announced the launch of a new multi-donor partnership on organizational learning for improved development impact.

The announcement was a part of USAID’s Moving the Needle event, which convened decision makers, thought leaders, donors and implementers around how to leverage systematic, intentional and resourced collaborating, learning and adapting to support the journey to self-reliance.

Following the announcement, a panel discussion, facilitated by Stacey Young, USAID’s Collaborating, Learning and Adapting Team Lead, featured high-level representatives from the founding members of this new partnership: USAID, DFID, the World Bank, the Inter-American Development Bank, and UNICEF. Click here to watch the webcast of the panel.

As members determine next steps for the Learning Partnership, here are responses to some of the most frequently asked questions, and information about who to contact to get involved:

What is the purpose of this donor partnership, and how does it differ from other donor collaborations?

The Multi-Donor Partnership on Learning for Development Impact (the Learning Partnership) is envisioned as a global Community of Practice consisting of high-level decision-makers representing development funding organizations who see the value of intentional, systematic and resourced organizational learning efforts. The partnership will focus on sharing experience, tools, approaches and challenges related to the connection between intentional organizational learning efforts and improved development results. It is designed to enable its members to learn more about how development funders (and their partners) apply evidence, share knowledge and experience, and apply systematic learning processes in order to increase aid impact and improve organizational efficiency and effectiveness. The members seek to understand how organizations have used learning to address challenges in international development, how they embed a culture of learning in their own organizations, and how they can support each other in integrating those processes.

What do you hope to achieve?

Through the Partnership, we hope to help member organizations identify effective Organizational Learning approaches to common development challenges, leveraging promising approaches from each other and sharing back about progress. Compiling and sharing challenges, resources, tools, and learning in an openly accessible virtual platform will in itself be a service to the development sector. Through more intentional knowledge sharing about Organizational Learning, we also hope to lay the groundwork for more on-the-ground collaboration where donors are working in the same countries, regions and/or technical sectors. Similarly, we hope to have a more coordinated approach to learning from our partners about what works and what doesn’t. Improved sharing of data, knowledge, contextual monitoring and approaches will, we believe, lead to more efficient and coordinated development delivery, and ultimately improved capacity of our partner countries and those supporting them.

Where did the idea come from?

In October and November 2017, DFID used its convening power to facilitate two multi-stakeholder workshops designed to highlight a variety of knowledge management and organizational learning initiatives being led by donors, implementing partners, and other stakeholders. As a result, it became clear that many different organizations (donor agencies and partners/implementers/suppliers) are investing in strengthening organizational learning, knowledge management, organizational development and adaptive management as routes to more effective development assistance. No one agency or organization has fully integrated a comprehensive learning approach, but some are attempting to do so. Many have expertise and tools in specific aspects of organizational learning that they are willing to share with their peers. Further, every participant organization recognized the immense value of coming together to share what is working and where they are struggling, to engage in “real talk,” and to serve a more unified effort to improve understanding of what works in development. And, perhaps the tipping point for this group, we heard clearly that if development donors don’t prioritize organizational learning, we cannot expect their partners to do so either.

Who are the current members?

The founding members include USAID and DFID (initial co-chairs), the World Bank, the Inter-American Development Bank, and UNICEF. Sida (Sweden) and GIZ (Germany) have also been involved in the early conversations. Membership in the Partnership is by invitation, but there is no intention to be exclusive. The main requirement for membership is a commitment to engaging with, and contributing to, the collective body of promising organizational learning practices. Development funders interested in learning more about the Partnership should contact the individuals identified at the end of this blog.

Are there Terms of Reference for the Partnership, and will there be a chance for Implementing Partners to weigh in?

The founding members, along with Sida and GIZ, are in the process of drafting clear and bounded Terms of Reference (TORs) to ensure that this initiative has a clear purpose, outputs and agreed-upon outcomes. The initial term of the group will be no more than 2 years, at which point the Partnership will assess its value and purpose. When the initial draft is complete (by September 2018), we will open up a comment period to secure input from a variety of stakeholders.  

Who is funding the Partnership?  

Currently, no additional funding has been set aside for the Partnership. It is expected that members will cover the costs of the time (and travel when necessary) required to participate in calls and meetings, though we recognize that there may be additional costs for which we might need to collectively seek funding.

What are your next steps?

Next steps for the Partnership include:

  • Finalize the draft Terms of Reference, along with a process for sharing them with various stakeholders.
  • Discuss and establish dates and a location for a September 2018 in-person meeting.
  • Prior to the September meeting, identify and begin discussing key organizational and operational principles for consideration at the September gathering.
  • Identify priority focal issues for our collaborative work.

For more information

While the initial partners work on the Terms of Reference, they welcome your questions, suggestions, or thoughts in the comments section below.

Statement from USAID Administrator Mark Green on Collaborating, Learning and Adapting

Jun 19, 2018 by USAID Learning Lab

USAID Administrator Mark Green was scheduled to deliver the keynote address at Moving the Needle 2018 but was unable to attend due to a scheduling conflict. Deputy Administrator David Moore spoke at the event in his stead. Below is the Administrator’s letter, which David Moore read at the event.

Dear Colleagues and Partners:

Welcome to the third Moving the Needle Conference on Collaborating, Learning and Adapting, the U.S. Agency for International Development’s framework for more effective development. These themes are central to how we become a more effective learning organization, and how we will realize the vision of self-reliance.

As we embrace more efficient ways to support developing countries on their journey to self-reliance, collaborating, learning and adapting will help ensure we are designing and implementing programs that respond to local priorities, generate local resources, and strengthen local actors. This, in turn, will make the results we and our partners achieve more durable and sustainable.

It is essential to listen more intently to what the recipients of our assistance say about their experiences and their priorities. Many of you are already doing this by building strong learning and evidence agendas into your programs, and by developing your organizations’ learning culture. However we approach collaborating, learning and adapting, we can all effectively use our knowledge to mobilize learning and use our innovative funding to raise local resources in our partner countries. Thank you for your participation in this important event.

Sincerely,
Mark Green
Administrator of the United States Agency for International Development

Learning About Learning

Jun 7, 2018 by Rebecca Flueckiger

Rebecca Flueckiger is a MERLA Operations Research Specialist with RTI International.

RTI’s Monitoring, Evaluation, Research, Learning and Adapting (MERLA) Community of Practice hosted a panel of distinguished guests to discuss USAID's Collaborating, Learning, and Adapting (CLA) approach at an event titled “From learning to adapting: How do we get to learning, and where do we go from there?”  Participants included Heidi Reynolds (MEASURE Evaluation), Easter Dasmariñas (USAID/Philippines, LuzonHealth Project), Tara Sullivan (K4Health), and Stacey Young (USAID/PPL). The event brought together an audience of nearly 250 individuals representing over 40 organizations (both in-person and online) who contributed to a rich and engaging discussion on CLA. If you were unable to attend or wish to review the discussion, the full recording can be found here.    

Here are a few of our top takeaways from the event:  

BUILD IT INTO WHAT YOU’RE ALREADY DOING

It is essential to build CLA into the intervention in a thoughtful and rigorous manner at the beginning of implementation to achieve better development outcomes. When putting together M&E plans, program implementation plans and work plans, practitioners should infuse them with learning and build a learning agenda that links all elements together. Heidi Reynolds described the challenge of “building the ship as we sail” and advised us not to start from scratch but to build on the work of those who came before us and to think forward about how others can build upon our work. Stacey Young’s words particularly resonated: “CLA is not a hobby you do on the weekends; it has to be an integral part of day-to-day programming.”
 
MOVE AWAY FROM LINEAR APPROACHES

We must shift away from traditional linear approaches toward adaptive management and systems approaches and remember that CLA is informed by AND informs adaptive management. Easter Dasmariñas explained how the USAID-supported LuzonHealth team became champions of CLA. While gathering and learning from their data, the LuzonHealth team realized that they were not doing intentional learning and adapting. They then changed gears and took a holistic systemic approach (MERLA). Through the application of MERLA, they augmented existing programmatic M&E with operations research and learning best practices and approaches. They further incorporated USAID’s CLA approaches and tools to ensure that they were not only synthesizing program learning, but also using it to inform programmatic adaptations, policy decisions, and communications and dissemination, both internally and externally. Easter explained: “Adaptive management requires openness and flexibility with local stakeholders. Having an open venue for exchange can help address issues of underperformance.”  

MEASURE YOUR WORK AND LINK IT TO VALUE

Rigorous measurement of program implementation is key to providing data for programmatic decision making. Equally important is rigorous measurement of how CLA approaches and tools lead to enhanced program implementation and adaptive management. We need good knowledge management practices, tools and metrics to enable us to quantify and qualify learning, information and evidence related to program implementation and CLA. However, many implementers struggle with how to incorporate knowledge management best practices into their work. Tara Sullivan shared tools, developed by K4Health, to make this measurement easier and more standardized. Tara started off her discussion with the following perspective on knowledge management and how it links to CLA: "Knowledge management is really that enabler that we need for our learning and adaptation work."

KEEP THE FOCUS ON THE "C" IN CLA
 
While it is easy to take the C in CLA for granted, it is important to recognize the power and value of Collaborating to ensure that Learning can proceed to Adapting. Stacey Young advised us to start small and pilot our work, identify and engage with natural champions, start with low-hanging opportunities, build on early wins, encourage and enable partners to own wins, link with leadership to highlight the value of CLA, and be inclusive. We can have the best systems, research, and learning, but learning may not turn into adapting if we do not also have collaboration as the glue that ensures stakeholders own the learning and are part of the decision making.
 
Through the discussions and Q&A, many other points emerged that deserve further exploration, including streamlining learning deliverables, addressing the complexity of learning agendas, implementing user-friendly systems and adaptive learning loops, and building and fostering learning in the field. Stay tuned for more on this from the RTI MERLA Community of Practice (MERLA@rti.org), #RTILearns.

Ways to Integrate a CLA Approach in the Performance Management of a Project/Activity

May 18, 2018 by Motasim Billah

Motasim Billah is a Monitoring and Evaluation Specialist at USAID/Bangladesh.

Since the revision of ADS 201 mandated that we include Collaborating, Learning and Adapting (CLA) in our work, the M&E team in Bangladesh has received growing demand from A/CORs and Implementing Partners for a practical guide to integrating CLA approaches into the performance management or monitoring and evaluation of projects/activities. As the operationalization of CLA has been evolving for the Agency itself, there have been more opportunities for M&E practitioners like us to share reflections from our field experience, which could ultimately contribute to a comprehensive guide in this area. My desire to engage more deeply in this conversation became a reality when I secured the Program Cycle Fellowship with the Bureau for Policy, Planning and Learning! During my Fellowship, I was based in the Office of Learning, Evaluation and Research, where I focused intensively on CLA.

The Fellowship provided me opportunities to gain cutting-edge knowledge on CLA through my involvement in different CLA-related processes, such as the Program Cycle Learning Agenda, the CLA Community of Practice, the adaptive management workstream, and CLA Toolkit development. It also gave me access to a wide range of resources, including different Missions' experiences with CLA and experts' opinions on integrating CLA into monitoring and evaluation. My time with PPL helped me organize my thoughts on CLA and write these reflections. The write-up is divided into three sections that shed light on ways to integrate CLA into performance management.

The first section will spell out CLA in logic model development and indicator framing. The second section will show how to integrate CLA into the MEL plan, data quality assessment and evaluation. The final section will demonstrate using CLA in tracking critical assumptions/game changers, establishing feedback loops, and Mission-wide practices for instituting CLA in the performance management of projects/activities.

Integrating CLA When Developing Logic Models and Indicators

Robust Logic Model

The development of a robust logic model is critical to enable the performance management system of an intervention (that is, a project or an activity) to function and to capture performance in a complex environment. Constructing a robust logic model requires analyzing a development problem from different perspectives, identifying the root causes of the problem and its linkage with other contingent problems, and tailoring solutions suitable for a particular context.  Building a rigorous logic model requires designers to invest a significant amount of time and ensure active participation of stakeholders in the construction phase. In this respect, adopting a CLA approach is useful throughout the process of developing a logic model. 

At the onset, a project or activity should identify stakeholders who can provide substantial insights in crafting the logic model. Once stakeholders are identified, experts in CLA or design could facilitate a logic model workshop to surface the best knowledge, expertise and experience of stakeholders. A robust logic model may involve multidirectional causal pathways of solutions to a particular problem, developed using a two-stage process:

  • First, it identifies the core results where an intervention will directly intervene, based on resources and manageable interest.
  • Second, it uncovers other potential results that are also critical for the achievement and sustainability of the core results, where the project or activity will leverage the presence of interventions by different development agencies, NGOs, the private sector and governments.

USAID/FFP has been using robust theories of change and logic models in its programs in different countries, which can be a useful guide for other USAID programs.

Designers can also use the collaboration mapping tool (learn more here), developed by USAID/Rwanda and refined by PPL/LER, to unearth additional actors operating in the targeted geographic areas. The tool can then be used to rank agencies and their respective interventions in terms of the benefits they offer and their effectiveness in achieving and sustaining our results. For example, in Bangladesh a USAID environment activity partnered with the Government, which allowed the activity to set up its district/sub-district level offices within the premises of the Government Fisheries Agency. This substantially reduced the activity's logistical costs and strengthened the partnership with the Government. Other examples include joint project development, such as when USAID and DFID collaborated on a major NGO health service delivery project in Bangladesh. A designer can also do a beneficiary mapping exercise to reduce overlaps with other interventions in the same geographic region and thus maximize developmental gains for the target population. To document plans and efforts to develop partnerships, designers could include any collaboration map and stakeholder engagement strategy as an annex to a Project or Activity MEL plan.

Collaboration on Indicators 

The logic model workshop can also be used to extract a set of illustrative indicators to measure the result statements in the logic model. The illustrative indicators will subsequently guide the development of intervention-specific indicators to be documented in the Project/Activity MEL plans. Once an activity starts rolling out, the Agreement/Contracting Officer's Representative (AOR/COR) could periodically (e.g., quarterly) hold indicator review meetings with Implementing Partners and other relevant stakeholders to assess the effectiveness of indicators in capturing performance and other factors influencing the activity. In this regard, data quality assessments conducted by both USAID and Implementing Partners can be good occasions to review indicators. At the Project level, Project Managers could organize similar indicator review meetings with AOR/CORs to learn about the status of indicators and their effectiveness. The participation of the Program Office in project-level indicator review meetings is critical, as that would later help align strategy-level indicators with projects as needed. If a project or activity needs to revise its indicators, the revision should be adequately reflected in the MEL plan.

A CLA Approach in MEL Plans, DQAs, and Evaluations

Including a learning agenda in MEL plans

A project/activity MEL plan should devote a section to learning that would essentially include a learning agenda at the project/activity level. A learning agenda generally entails a set of prioritized questions addressing critical knowledge gaps. In terms of scope, the questions can address short-, medium- and long-term issues that are critical for the achievement of an intervention's results. In this respect, a project-level learning agenda can guide activity-level learning questions, and a Mission-wide learning agenda can guide project-level learning questions. For example, the Senegal Mission recently developed a learning plan as part of its Performance Management Plan (PMP) that can help projects and activities articulate learning questions in their respective contexts. The learning section should include the learning activities that will be employed to answer each learning question. It should also include target audiences; the learning products (dissemination tools) that will be used to share learning outcomes; roles and responsibilities of different actors; timelines; resources; and next steps.

Data Quality Assessment

The periodic data quality assessment (DQA) is an important reflection tool for USAID and implementing partners to learn about data quality, gaps in existing data collection processes, data storage and overall data management. A CLA approach can be very effective in conducting DQAs involving USAID, Implementing Partners (IPs), and the local NGOs who are often partners of IPs. Based on DQA findings, periodic (quarterly/bi-annual) reflection sessions could be organized at the activity level involving all sub-partners of IPs, providing opportunities to take course-correction measures while identifying data strengths and areas for improvement. At the project level, a pause-and-reflect session on 'learning from DQAs' could be organized at the regular Implementing Partners' meetings. The session would help both USAID and IPs learn from each other's experiences in managing data in order to strengthen the Mission-level performance information management system. In this regard, it would often be useful for the DQA section in the MEL plan to clearly describe the collaborative practices/activities that will be undertaken to conduct DQAs and share the practices.

Evaluation

Evaluation is an effective tool for capturing systemic learning from the grassroots level. A collaborative approach involving the Program Office, Technical Offices, and relevant stakeholders in developing evaluation scopes of work can be instrumental in uncovering the most pressing issues connected to implementation and management. In this regard, Project Managers and AOR/CORs should take the lead in consulting beneficiaries, implementing partners and relevant stakeholders in order to frame 'good evaluation questions.' While framing evaluation questions, it is helpful to explain how they relate to, or contribute to answering, at least one learning question on broader issues: for example, questions that test the development hypothesis or critical assumptions, or that inquire about external factors, such as local contexts or national/local policies, which might influence interventions. The Bangladesh Mission has recently started the practice of including learning questions in its evaluation scopes of work. The evaluation section in MEL plans could explicitly describe how evaluations to be conducted in the life of a project or activity contribute to answering learning questions.

The dissemination of evaluation findings should extend beyond out-briefs of the evaluation team and uploading the document to the Development Experience Clearinghouse (DEC). In this regard, innovative approaches can be followed to share the learning with pertinent stakeholders. At the Mission level, project-wide evaluation dissemination sessions can be organized to share learning with senior management and technical teams. The Program Office can facilitate this session in consultation with Project Managers or AOR/CORs and Technical Offices. This type of session would be another platform for project/activity level decision making, as important insights might come out of discussions which could be useful for existing and new projects/activities. 

Evaluation recommendation tracker: A collaborative approach should be in place between the Program Office and Technical Offices to ensure that the recommendation tracker functions in an effective and timely manner. The Program Office can nominate a staff member as a Point of Contact (POC) for a particular evaluation's recommendation tracker to work closely with the AOR/CORs or Technical Offices to follow up on the actions suggested in the tracker and agreed to by Technical Offices.

CLA Approach in Critical Assumptions, Feedback Loops, and Institutional Processes

Tracking Critical Assumptions/Risks/Game Changers

Many Project/Activity MEL plans could benefit from including a set of context indicators or complexity-aware monitoring tools to ensure that the overall contextual landscape of the Project/Activity is monitored. This would help us track our critical assumptions and risks periodically, as well as capture any game changers that could have unintended consequences for outcomes. In this respect, Project Managers, AOR/CORs, and Implementing Partners can employ different tools, such as regular field visits, focus group discussions, before- or after-action reviews, and other pause-and-reflect methodologies, to collect qualitative stories. Project Managers, AOR/CORs and Implementing Partners could organize grassroots-level stakeholder meetings with beneficiaries, teachers, local leaders, journalists, etc. (as relevant to the sector) at least quarterly to understand changes in context. In 2016, CARE presented a participatory performance tracker at a conference organized by USAID that can guide the development of context-specific community tools to gather contextual knowledge. The outcomes of these meetings, and the qualitative stories related to context monitoring, can be reflected in quarterly and annual reports. Moreover, at the activity level, AOR/CORs can also hold regular learning and sharing meetings with other donors with which the project or activity is collaborating. These learning meetings can shed light on the status of ongoing collaboration, including the challenges faced as well as opportunities to expand the existing collaboration. At the project level, the Mission can hold quarterly project learning meetings where Project Managers and AOR/CORs discuss issues related to performance, including theories of change, critical assumptions, and overall implementation and management.

Establishing Feedback Loops: A Tool for Learning and Adaptive Management

Establishing strong feedback loops is important to capture systemic learning. It is helpful for Project and Activity MEL plans to explain how the feedback loops will be connected to overall performance management. In this regard, the feedback loops can be highlighted in any diagram of MEL activities and data collection flow charts to demonstrate how they would continuously provide information that contributes to performance and data management. MEL experts can also set up digital feedback loops, such as online platforms, or manual feedback loops, such as feedback collected after a training or intervention. Feedback can also be anonymous, for example by setting up 'feedback boxes' at different hotspots in the field so that stakeholders can provide feedback freely. It is important to put mechanisms in place to ensure that relevant feedback flows to decision makers at the Implementing Partners, AOR/CORs and Project Managers. In this connection, USAID Missions can learn from USAID/Uganda's feedback loop for real-time adaptation.

CLA Practices in Monitoring, Evaluation and Learning Mission Orders and the MEL Working Group

Institutionalizing CLA practices in performance management requires reflecting them adequately in any Monitoring, Evaluation and Learning Mission Orders, along with Project- and Activity-level MEL plans. The Mission Order would include overarching common principles on CLA practices that would in turn guide Project Managers and AOR/CORs in integrating CLA approaches into their Projects and Activities and the respective MEL plans. In this respect, a Mission-wide working group on MEL can be formed to help implement the Mission Order and sustain good CLA practices in monitoring and evaluation. Currently, the Bangladesh Mission has a functional MEL working group composed of M&E professionals from Technical Offices and the Program Office. The working group provides a platform for discussing M&E-related issues. The working group plans to incorporate a strong 'L' component in its work by revising the existing M&E Mission Order and finding CLA Champions in the Mission.

Conclusions

Incorporating CLA practices into performance management is an evolving process. Many of the recommendations provided in my three blog pieces might not work in all contexts, each of which might have realities requiring a different set of practices. I hope these blog posts will stimulate further discussion on CLA in performance management and enable us to learn from each other's experiences and apply them in our respective contexts.

There is no such thing as a dumb question!

May 3, 2018 by Guy Sharrock, Jenny Haddle, Dane Fredenburg

According to Carl Sagan, in his 1997 book The Demon-Haunted World: Science as a Candle in the Dark, “There are naive questions, tedious questions, ill-phrased questions, questions put after inadequate self-criticism. But every question is a cry to understand the world. There is no such thing as a dumb question.”

In our earlier blog (see Adapting: Why Not Now, Just Do It!) we described how one multi-year Development Food Assistance Project, United in Building and Advancing Life Expectations (UBALE), was finding ways, with support from USAID/Food for Peace (USAID/FFP), to implement the notion of ‘adapting’. In conjunction with implementing partners Save the Children, CARE and CADECOM, Catholic Relief Services (CRS) aims to deliver support to 250,000 households struggling to sustain their livelihoods in the most food-insecure region of Malawi.

Asking questions: a fundamental skill

Asking questions and seeking answers is vital for learning, accountability and high performance. It seems to us, through our work with UBALE on evaluative thinking, that asking thoughtful questions is a fundamental skill required of everyone engaged in CLA.

There are three elements that seem worthy of note (probably many more, but three will do for now!):

  1. Feeling safe enough to speak up and ask questions
  2. Developing and sustaining the habit of respectfully asking questions
  3. Ensuring there are processes to address questions

In this blog, we will address the first two elements; in a related blog, Adam Yahyaoui and Mona Lisa Bandawe will describe a process that UBALE has recently undertaken to refine and package some critical learning questions that will be advanced over the course of this year.

It’s okay to ask questions

Evaluative Thinking: critical thinking applied in the context of monitoring, evaluation, accountability and learning.

As individuals, we sometimes feel that if we ask questions, our supervisor, colleagues and peers may consider us negative or intrusive, or worse still, ignorant or incompetent. This stops us from flagging concerns about our program performance, or from allowing ourselves to hold a different opinion from the majority view. Let’s be frank: it’s just easier not to ‘rock the boat’.

This challenge is not confined to a specific project or program, country, region or culture, nor to any work setting, whether it be government, non-profit or private sector. It is not even a novel concern: according to Kofi Kisse Dompere, there is a traditional African thought suggesting that, “No one is without knowledge except he who asks no questions.”

So, too often the so-called enabling environment for those who wish to ask questions can feel disabling or, at the very least, not hugely supportive. In her excellent TEDx video, Professor Amy Edmondson opens with three vignettes illustrating scenarios in which an individual’s desire not to look dumb overcame the need to ask a question. She suggests that this matters because “it robs us, and our colleagues, of small moments of learning.” She proposes three things that can help to build a ‘psychologically safe’ office climate:

  1. Frame the work as a learning opportunity, not merely an activity to be completed. In a complex setting, such as the one in which UBALE is intervening, there are many interventions for which it is not possible to know in advance what the outcome will be, nor what unintended consequences, good or bad, may arise, at least not with absolute certainty. It is this uncertainty, and the systemic nature of the setting, that justifies those involved in seeing each activity as a learning event. In Edmondson’s words, this “creates the rationale for speaking up.”
  2. Admit to your own shortcomings, as you surely can’t have a monopoly on wisdom! You cannot know everything in advance; you will miss things, particularly when operating in a complex setting with so many moving parts. So, for the task or activity to be performed to a high standard, you need the help of your colleagues and partners. This “creates more safety for speaking up,” according to Edmondson.
  3. Encourage lots of questions by modeling this yourself, and encouraging others similarly. This makes it essential for staff to speak up.

Developing the ‘questions’ habit

While it is a critical element, ensuring that the working environment is ‘psychologically safe’ is, on its own, insufficient to achieve high-quality CLA. It is equally important that staff know how and when to ask questions in a respectful manner.

Let’s assume senior managers have bought in to the importance of psychological safety and start asking lots of questions; their aim is to encourage their colleagues and subordinates to follow suit. But this may not come naturally or easily to those whose behavior they are seeking to change. Among our evaluative thinking resources, we suggest types of questions that help you know when evaluative thinking and learning are happening:

  • Why are we assuming X? What evidence do we have?
  • What is the thinking behind the way we do Y? Why are we not achieving Y as expected?
  • Which stakeholders should we consult to get different perspectives on X? and so on.

In the early part of our capacity strengthening work with UBALE, a good amount of time was spent on this topic, both on generating questions and on practicing asking them. It was apparent that some colleagues found it easier than others to acquire and apply the skill; however, with time and practice, UBALE staff demonstrated that everyone has the capacity to ask questions that contribute to improved project learning. Our implementation intention should be to make it a habit!

We are planning to trial a couple of ideas arising from our recent work with UBALE to instill a habit of question asking, especially in field staff:

  • Working with staff to develop portable ‘flash cards’, each containing a question that can unlock a new line of inquiry, and
  • Bringing greater intentionality and becoming more systematic by developing checklists or question prompt lists that will help staff avoid unwitting blind spots as they develop the ‘asking questions’ habit.

Three key CLA lessons

  • Asking questions implies organizational change. Things are different with CLA, or at least they should be. Adopting a CLA approach implies that an organization is committed to becoming a true learning organization in which processes for asking and discussing questions are embedded in all operations. This requires the right kind of enabling conditions.
  • Asking questions is critical to CLA. The Nobel laureate physicist Richard Feynman wrote, “I would rather have questions that can't be answered than answers that can't be questioned.” If monitoring data appear to suggest some variance between expected and actual achievements, it is important to ask why, and what the implications are for project activity. This necessitates asking questions to deepen understanding of what is happening, and an openness to adapting earlier thinking. This requires appropriate processes and tools.
  • Asking questions requires a certain kind of staff. CLA necessitates staff who are, in the words of David Garvin and Amy Edmondson, “tough-minded enough to brutally confront the facts; to talk directly about what works, and what doesn’t work. It’s about being straightforward.” This must be done in a way that respects other people and their perspectives. This requires new staff skills.

Learning & Adapting to Combat HIV/AIDS in Uganda

Apr 30, 2018 by Maribel Diaz

This blog post has been cross-posted from Social Impact's blog. Maribel Diaz is a Technical Director with Social Impact.

The science of HIV/AIDS treatment, including guidelines and protocols, is well established. Governments and donors are active in addressing the epidemic by allocating resources and setting targets. Yet there is still much to gain from pausing to learn from the data and from service providers, and adapting to realities on the ground. There are qualitative ways to address real-time findings in data and adjust implementation.

USAID calls this approach Collaborating, Learning and Adapting (CLA). CLA facilitates cooperation among key players. It generates learning based on real-time data and collective solutions. With CLA, new ways to implement HIV/AIDS programming can be adapted, tested, and analyzed, and potentially scaled up rapidly.

In Uganda, the Strategic Information and Technical Support (SITES) activity is supporting PEPFAR donors in analyzing data and making decisions based on the changing face of the HIV/AIDS epidemic. Social Impact leads the activity's CLA team, which facilitates learning events with PEPFAR implementing partners.

HIV/AIDS in Uganda

PEPFAR has an ambitious goal for addressing the HIV/AIDS pandemic in priority countries. Referred to as the 90-90-90 objectives, the aim is for 90 percent of people living with HIV to know their status, 90 percent of those who know their status to be on treatment, and 90 percent of those on treatment to have suppressed viral loads.
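
In concrete terms, the three targets compound: for every 1,000 people living with HIV, the goal is that 900 know their status, 810 (90 percent of 900) are on antiretroviral treatment, and 729 (90 percent of 810) are virally suppressed. Hitting all three targets therefore means viral suppression for roughly 73 percent of everyone living with HIV.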

Recent data from Uganda showed a lag in progress toward these targets. There was a large disparity between the number of patients newly testing positive and the number of those new patients on antiretroviral therapy (ART).

Learning from the data

To address this lag, we designed a problem-solving learning event in response to findings on the PEPFAR TX_NEW indicator. TX_NEW tracks the number of adults and children newly enrolled on ART. The characteristics of new clients are expected to be recorded at the time they initiate life-long ART. But the data showed this wasn’t happening as expected.

SI structured the event using tools designed to initiate discussion with stakeholders, particularly RHITES EC, a large multi-service implementing partner working in 11 districts of East Central (EC) Uganda.

We engaged stakeholders in an organizational development/adaptive management exercise to undertake a process of “Head, Heart, Hands.” The group used their heads to examine real-time data. They took the information to heart to create meaning and propose potential solutions. They created action plans to have in their hands to apply solutions and adapt processes.

Understanding root causes

We presented data showing the current challenges in the continuum of response to the HIV/AIDS epidemic. We led a root cause analysis exercise to help participants examine what was keeping new patients from starting on ART. They identified: poor counseling skills and inadequate psychosocial support for newly tested patients; poor incentives for health workers; outreach activities not targeting the right at-risk populations; inadequate pre- and post-test counseling; and health workers not taking the lead in testing.

Identifying solutions

Discussion groups prioritized five leading drivers and proposed potential solutions. Participants identified short-term processes that can easily be adapted in addition to longer-term motivational benefits for staff. For example, including staff in health facility planning will result in an improved vision for service delivery and a collective sense of urgency to address problems. Additional training for village health teams and expert patients in counseling can help bridge the gap between testing and enrollment in ART. Other simple but important improvements led participants to realize that there are solutions within their locus of control.

Planning for the future

Finally, we led an action-planning exercise. Participants recorded proposed solutions to the five leading drivers and assigned responsibility for addressing them among specific Implementing Partners, District Health staff, and decision-makers.

In addition to action-planning, participation in the learning event provided tools for IPs and District Health staff to take back to their respective organizations and service-delivery sites for future problem-solving based on data.

Using CLA is helping make a difference in the fight against HIV/AIDS in Uganda.

Can a Competition Create Space for Learning? Three Design Factors to Consider

Apr 20, 2018 by Frenki Kozeli

Development practitioners are often innovating, piloting, and problem-solving — but sometimes these initiatives have a hard time getting disseminated past the project annual report.  At Chemonics, the Economic Growth and Trade Practice and the Education and Youth Practice joined forces to kick off 2018 with the launch of our Market Systems and Youth Enterprise Development Innovation Contest, an endeavor designed to spark knowledge-sharing between our projects and give our staff an easy opportunity to learn from one another.

We asked our global workforce to share the models and methodologies they use for market systems development and youth enterprise development. The incentive? The opportunity to share their work with their peers and industry leaders through remote learning events, publications, and in-person attendance at leading development conferences. We used an off-the-shelf online contest platform that could be easily accessed in real time by our projects around the world, and we opened the submissions to evaluation by expert panels and peer voting. Over the six-week contest period we received entries from Europe and Eurasia, Latin America, the Middle East, Asia, Eastern Africa, and Southern Africa. The reach of the contest and the enthusiasm of our staff were invigorating to experience.

With the contest itself behind us, we are taking a moment to reflect on what made this initiative an effective learning event and what we, as designers, could share with colleagues looking to launch their own. Here are three factors to consider:

1. Leave Your Preconceptions at the Door

When harvesting knowledge from the field, try to leave your preconceptions behind and open the initiative up to as many participants as possible. We assumed our competitiveness projects would account for most of the submissions — an assumption that turned out to be incorrect. Opening the contest to our entire global workforce broadened the diversity of the projects represented. We heard from energy projects working in youth workforce development, peace and stability projects convening young entrepreneurs, and enabling environment projects taking a market systems approach to women’s economic empowerment.

2. Build In a Moment for Reflection

We wanted this contest to be an opportunity for our staff to take a “pause and reflect” moment, so we integrated learning into the design of the contest. We asked our projects to tell us about what was unexpected, what went wrong, how they adapted, and what their path to scale and sustainability would be moving forward. The result was that we weren’t hearing polished success stories but accounts of process, methodology, and adaptation, a true reflection of project implementation in the dynamic environments where we work.

3. Don’t Stop the Momentum of Sharing

Our global workforce responded strongly to the opportunity to share their experiences with their colleagues around the world. To build on this momentum, we’ve sponsored learning events for our winning teams in the field to discuss the models featured in the contest and their adaptability to different contexts. We’re organizing webinars throughout the year so our winning teams can share and discuss their models with our global workforce, and, more importantly, so we can promote project-to-project learning and collaboration. And finally, we’ll be bringing representatives from our winning projects to Washington, D.C. to attend the SEEP Network Annual Conference and the Global Youth Economic Opportunities Summit, leading industry events for market systems and youth enterprise development, to enhance our learning and collaboration with the development industry at large.

By now, you might be asking yourself who these mysterious winners are. Stay tuned in the next few weeks as we share the winning market systems and youth enterprise development models from Uganda, Moldova, Pakistan, and Ukraine, featuring a range of creative solutions — from motorbikes to river barges and robotics to school buses.

This blog has been cross-posted from MarketLinks. Frenki Kozeli is a manager on Chemonics’ Economic Growth and Trade Practice.


Qualitative Visualization: Chart choosing and the design process

Mar 29, 2018 by Jennifer Lyons

In order for data to be used for learning and adapting, the data itself needs to be easily accessible. Evaluators and researchers have been hungry for resources on how to effectively present qualitative data, so last year Evergreen Data launched a qualitative reporting series. We also recently released an updated qualitative chart chooser. In this post, I’ll explain how to use this tool and share examples of what it can do.

Chart Chooser 3.0

We built this tool to be relevant for all levels of qualitative data use. Whether you only collect qualitative data as an open-ended question attached to your quantitative survey, or you are doing full-blown qualitative research, this handout will hopefully provide you with some new visualization ideas. Along the top of the table, you have the option to quantify your qualitative data in the visual. In some cases, quantification can break down a bunch of qualitative findings into a simple yet effective visual like a heat map. On the other hand, when you quantify the data, you risk losing context and the personal nature of qualitative data.
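
To make the quantification route concrete, here is a minimal sketch, not from the Evergreen Data post, of how coded qualitative findings could be quantified into a heat map using Python and matplotlib. The themes, stakeholder groups, and counts are all invented for illustration:

    # A sketch of quantifying coded qualitative data into a heat map.
    # All data below are hypothetical, for illustration only.
    import matplotlib.pyplot as plt
    import numpy as np

    themes = ["Housing", "Employment", "Health", "Family support"]
    groups = ["Clients", "Case workers", "Funders"]
    # counts[i][j] = how often theme i was coded in interviews with group j
    counts = np.array([
        [12, 9, 4],
        [7, 11, 6],
        [10, 8, 3],
        [5, 6, 2],
    ])

    fig, ax = plt.subplots()
    ax.imshow(counts, cmap="Blues")

    # Label rows with themes and columns with stakeholder groups.
    ax.set_xticks(range(len(groups)))
    ax.set_xticklabels(groups)
    ax.set_yticks(range(len(themes)))
    ax.set_yticklabels(themes)

    # Print each count inside its cell so readers don't rely on color alone.
    for i in range(len(themes)):
        for j in range(len(groups)):
            ax.text(j, i, counts[i, j], ha="center", va="center")

    ax.set_title("How often each theme came up, by stakeholder group")
    fig.tight_layout()
    fig.savefig("theme_heatmap.png", dpi=150)

The trade-off named above still applies: the heat map makes patterns easy to scan, but the quotes behind each count are lost, so pair it with illustrative excerpts where the personal context matters.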

Next, the chart chooser is broken down by what you want to include in your visual: just a highlighted word or phrase, or a higher level of analysis. Along the left-hand side of the chooser, you can see another breakdown based on the nature of your data, such as whether it represents flow, comparison, hierarchy, etc. Last, the charts and visuals (along with cute little illustrations) are suggested as options. You know your audience, your data, and your story, so use this chart chooser to pick the best visual for your context.

The design process:

Let’s put this chart chooser to use! Let’s say that you are working with the homelessness service community in your area. Using a mixed methods approach, you have collected data on the homelessness system including the causes and the continuum of services available in your community. You are writing a 10-page report on the findings, but you want to summarize the causes and continuum of services available in your area. You are not looking to quantify the data because you want to give specific program examples. After taking time to look at the chart chooser, you decide that there is a flow to the nature of your data, so using a flow diagram will be the best fit.

Producing quality data visuals is not just about choosing the right chart; you need to layer the right chart with quality design technique. Let’s look at how a flow diagram would look without putting much effort into crafting a design that tells a story. This (right) took me about 5 minutes using PowerPoint smart art.

The problem is this visual doesn’t tell a compelling story about the journey of homelessness and the services offered at each point in the continuum. Let’s reframe this visual to better showcase the journey.

This (right) is getting better! I can start to see the story develop. It still needs some love: intentional use of color, an effective title, and a more personalized touch.

Images of people and quotes are taken from a CBS News report (https://www.cbsnews.com/pictures/before-and-after-from-homeless-to-hopeful/) on the 100,000 Homes Campaign.

This (above) is starting to look like something to be proud of! It is a piece that can be shared separate from the 10-page report and it summarizes your community’s journey of homelessness. This was all made in PowerPoint using textboxes, lines, photos, and square shapes.

I can imagine the references to the different programs having embedded bookmark links to the sections of the report where each program is described in more detail. If this were posted online, the program names could link to websites providing more information and resources on each program. Pushing the idea of intentional color even further, the colors used in this diagram should be threaded throughout the entire report. This is one of my favorite techniques! It helps chunk up a long, mixed methods report into bite-sized pieces that the brain can better interpret.

To keep up-to-date on more qualitative reporting ideas, follow our qualitative blog thread.  For any questions or comments, you can reach me at jennifer@stephanieevergreen.com.
