Devastating vs Democratizing Data: How a global indicator can engage communities and be locally led
We are moving ever further into the realm of possibility afforded by advanced data analytics and experimental methodologies, such that answers to questions posed by wicked problems* are tantalizingly close. This is revolutionizing the development sector and helping us realize our ethical obligation to use tools and strategies that are grounded in something other than intuition or highly personal experience. However, those impacted are not always included in the system that develops and utilizes data and evidence, which can contribute to a new form of inequality. [* A ‘wicked’ problem is one so complex that it has no single solution and can never be fully resolved, as a resolution may lead to or reveal a new problem in another area.]
With advances in data access and analytics, we can answer increasingly complex questions. But that process, left unregulated, creates a power imbalance: Who gets to determine what questions we ask? Who has the resources to answer them? Who benefits from those answers? Who gets to define what counts as a rigorous answer? Who has the skills to use and understand the evidence, and who does not? Those we serve rarely get to pick the questions, determine what is relevant, or access the advanced training necessary to make sense of it all.
Without an intentional approach, the system that prioritizes, creates, and uses data can contribute to inequality. Advances in data and evidence can be a source of discrimination, alienation, and marginalization, even though recent progress affords the opposite hope – that advances can be empowering and transformational. In short, data can be devastating as well as democratizing.
At Pact, we are taking stock of how we use data, how we define what is sufficient evidence, and who has power in determining all of this. Like our peers, we have for some time embraced participatory and empowering approaches in our evaluation practice. As we push ourselves to be ever more evidence-based, our practice is still evolving to put those we serve, to the extent possible, front and center in decisions about data and evidence.
What does this mean? It means taking a locally led approach to data and evidence, whenever possible. Our focus on supporting fully engaged communities means empowering those we serve in decisions around what to prioritize based on evidence they have access to, and how to measure success using criteria they define and methods they can use.
Take our capacity development approach as an example. Our approach puts partner organizations front and center in decisions around the design, implementation, and analysis of, and learning from, a capacity development intervention. This approach is designed to engage partners in a self-assessment process, out of which they prioritize their own needs and set their own criteria for success. In this process, we support them to use data to understand their progress and learn from it. The evaluative process is itself part of the intervention, as is engaging with data in a manner that supports partners to prioritize what data are meaningful to them.
The richness of this multilayered intervention – using an evaluative approach in a capacity development intervention aimed at improving organizational performance in service of locally led development outcomes – is masked when one reduces improved performance to a single data point. And this is the flipside of the benefit afforded by ready access to large data sets and advanced analytics: We have become accustomed to seeing large numbers. The incentive is for large numbers – often simple reach indicators measuring breadth, not depth – when we should be holding ourselves to a higher standard of greater engagement of those we serve in the prioritization, interpretation and use of outcomes data.
This is a challenge we are seeing in our own Global Indicators, now in their 11th year. Pact’s 14 Global Indicators are enterprise metrics that capture the organization’s work globally. As I have noted in a previous post, there is an ongoing tussle between the need for data at an enterprise level and the need to ensure data are relevant to those most impacted by that data. At face value they are simple reach indicators, as they provide a quantitative measure of our scope and scale. But behind those indicators is often the story of an entire data and evidentiary lifecycle, not to mention the locally led commitment detailed above.
This year we are revising our Global Indicators. We are expanding them to better reflect those impacted by our work in the system that develops and utilizes data, even measuring the extent of community engagement itself in our projects. This is how Pact is making our strategy of being “data-driven” and “evidence-based” also locally led: by ensuring the communities we serve are fully engaged.
I will leave you with a few final tips that we are embracing:
- Put decisions around what is important to measure and how to do so in the hands of the communities you serve, such as utilizing Outcome Mapping or other co-creation approaches to design monitoring, evaluation and learning (MEL) frameworks.
- Think of evaluation as an intervention in and of itself by building evaluative capacity, so that communities have the resources they need to prioritize and assess for themselves the development projects that aim to serve them, such as by using participatory action research or other empowerment evaluation approaches.
- Be intentional about actionable learning processes in which communities are supported through utilization-focused learning agendas, and similarly, engage beneficiaries in adaptive management systems beyond simple feedback loops.
How have you worked to make data and evidence locally led? Please share your ideas!