CLA Through Data Management Systems: A Tale of Two Projects
At FHI 360, we have been thinking about how to make our collaborating, learning and adapting (CLA) efforts intentional, systematic, and resourced. Many workshops have hinged on the question: Is our CLA as institutionalized as it could be? We’ve seen that when we repeatedly ask, “why not,” the topic of enabling conditions invariably surfaces as a contributing factor. Setting up the right precursors to CLA, then, should be central to any institutionalization strategy. Recently, we have been reflecting on the role that data systems play in this progression.
The USAID-funded Ma3an and LENS programs are two FHI 360 projects that garnered visibility for their approach to working with data. Both shared similar trajectories in the growth and management of their data platforms, and both realized first-hand that the process of building these systems, if well executed, could have knock-on benefits for CLA.
First, both projects gravitated toward data systems and tools that field teams could modify directly, allowing these systems to become a site of regular adaptive management. Project staff were able to add fields to data collection forms, create complex data validation rules on the fly, and de-duplicate records daily. This reduced dependence on home office staff or external contractors for small changes, allowing requests for outside help to focus on more specialized needs. This ability to drive day-to-day changes to data systems from the field is what made the projects nimble and adaptive: the systems were not just fit-for-purpose, but also fit-for-capacity.
Second, both projects hired staff with data management skills to run these platforms, and these staff in turn spurred internal collaboration. Though home office staff and outside vendors helped design pieces of the information ecosystem, it was these local staff who marked a real turning point in adoption, because they championed data tools with colleagues and regularly customized the tools to meet their needs. For example, USAID-Ma3an staff built their own record de-duplication module within their data entry systems, while USAID-LENS staff created a battery of unit tests that ran data validation audits daily. The proximity of those managing the system to those using it made internal collaboration frictionless and frequent.
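Neither project's code is published here, but as a rough illustration of the kind of daily validation audit USAID-LENS ran, a sketch in Python might combine duplicate, range, and completeness checks over monitoring records (the field names and rules below are hypothetical, not the project's actual schema):

```python
from collections import Counter

def audit(rows):
    """Run simple validation checks over monitoring records.

    `rows` is a list of dicts, one per record. Returns a list of
    human-readable issues; an empty list means the data passed.
    """
    issues = []

    # Duplicate check: the same beneficiary ID should appear only once.
    counts = Counter(r["beneficiary_id"] for r in rows)
    for bid, n in counts.items():
        if n > 1:
            issues.append(f"duplicate beneficiary_id {bid} ({n} records)")

    for r in rows:
        # Range check: flag implausible ages.
        if not 0 <= int(r["age"]) <= 120:
            issues.append(f"implausible age {r['age']} for {r['beneficiary_id']}")
        # Completeness check: district must be filled in.
        if not r["district"].strip():
            issues.append(f"missing district for {r['beneficiary_id']}")

    return issues
```

Run on a schedule (or as a unit-test suite in a CI job), checks like these surface data-entry problems within a day of collection rather than at reporting time.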
Third, project data systems needed to adapt continuously to evolving requirements in order to address new learning questions emerging from implementation. USAID-LENS’ reporting requirements grew substantially over time in response to increased data collection, and its systems needed to scale accordingly. Initial quarterly reports included only a handful of data points from Excel files. That changed when the Jordan mission rolled out highly granular data-reporting platforms like DevResults and made geospatial reporting a requirement. At the same time, the leadership team increasingly saw the value of bespoke internal research, and more and more datasets were being generated through the project’s learning agenda. As a result, USAID-LENS pivoted to hosting all of its monitoring data on its knowledge management system. By its final year, USAID-LENS was tracking more than half a million records in its data platform and reporting thousands of data points per quarter on individuals, businesses, financial transactions, and other indicators. Spreadsheet software would not have scaled to handle this volume of data, which would have made subsequent collaboration and learning from the data impossible.
In contrast, USAID-Ma3an right-sized its data systems after a data quality assessment found that its systems exceeded the capacity of field staff and partners. By simplifying elements in the data pipeline and automating others, USAID-Ma3an tailored its system to focus on the essential needs of its users. This iterative fine-tuning allowed the program to respond to new learning about how stakeholders were and were not using the platform.
In short, these successes in enabling CLA can be traced to…
1. … decisions to hire field staff who would build in-house data systems and tools. This structure naturally stimulated internal collaboration and institutional memory: because these embedded staff both built and used the platforms and tools, human-centered design was both organic and systematic. These individuals worked hand-in-hand with other field staff (e.g., M&E and technical teams). And because software development happened close to the field, the know-how around each solution stayed firmly entrenched in the field office and its user base.
2. … a preference for easy-to-use point-and-click solutions that matched local capacity and allowed teams to manage their data adaptively: Data management solutions were often tweaked with minimal use of code. This let users adapt data platforms to evolving needs, while relying on systems that could scale. This ultimately increased ownership over the tools and platforms.
3. … adaptation of data collection tools to capture microdata, which enabled continuous learning and improvement and M&E for learning: In contrast to data that exists only at some level of aggregation, data that sits at the level of the unit of measure is much richer, because it is easier to reorganize and re-analyze. This allowed M&E teams to use the data for pattern discovery and short-loop learning cycles. Those insights led to course corrections and early detection of problems. But collecting microdata also comes at a cost: the price paid is greater time spent digitizing and ingesting information, and a heightened responsibility to manage personal data.
4. … supportive leadership teams that adequately resourced data systems. This not only translated into budget line items for direct expenses, but also into decisions to hire local staff to manage those platforms. Wherever possible, cost was reduced by relying on tools that were either free or freely available to FHI 360 programs.
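To make the microdata point in item 3 concrete, here is a small illustrative Python sketch (the fields and records are invented, not project data). Because each row sits at the unit of measure, the same records can be re-aggregated along any dimension after collection, answering questions that a pre-aggregated total could not:

```python
from collections import defaultdict

# Record-level (micro) data: one row per training participant.
microdata = [
    {"participant": "P1", "sex": "F", "governorate": "Irbid", "completed": True},
    {"participant": "P2", "sex": "M", "governorate": "Irbid", "completed": False},
    {"participant": "P3", "sex": "F", "governorate": "Amman", "completed": True},
    {"participant": "P4", "sex": "F", "governorate": "Amman", "completed": True},
]

def completion_rate_by(records, key):
    """Re-aggregate microdata along any dimension chosen after the fact."""
    groups = defaultdict(list)
    for r in records:
        groups[r[key]].append(r["completed"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

# The same records support disaggregations nobody anticipated at design time.
by_sex = completion_rate_by(microdata, "sex")
by_governorate = completion_rate_by(microdata, "governorate")
```

Had only a single completion total been stored, neither breakdown would be recoverable; this flexibility is what makes microdata worth the extra digitization effort and data-protection burden.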
These lessons are not new. In fact, many of these ideas reaffirm digital development principles. But what is perhaps novel is the reflection surrounding how this data management approach enabled CLA. Our experience in implementing USAID-Ma3an and USAID-LENS underscores that a human-centered philosophy to information systems enables several CLA elements, including adaptive management, continuous learning and improvement, knowledge management, internal collaboration, and institutional memory.