
Data analytics, evaluation and learning at JSI: One USAID contractor’s approach to strengthening the use of evidence in global health

Apr 10, 2015
Anne LaFond

JSI promotes continuous learning through strategic use of information for health programming. JSI’s Center for Health Information, Monitoring, and Evaluation (CHIME) is a resource and coordination hub for staff and projects across more than 60 countries where JSI works. The Center provides technical support, facilitates networking and learning opportunities, and promotes innovation related to health research, monitoring and evaluation, and health information systems (HIS) within JSI and to clients. USAID’s Learning Lab asked Anne LaFond, CHIME’s director, to tell us about the Center and JSI’s work in data analytics, evaluation, and learning.

What is the overall approach that JSI takes toward M&E and use of data in the health sector, and how might it differ from how other similar units function?

JSI works best at the nexus of research and practice, applying creative evaluation and research designs and rigorous analytics to understand and address operational challenges in public health programs. We are essentially an implementing organization that designs and strengthens health system solutions. Solid evidence is an essential tool for guiding this process.

As one of several centers at JSI, we support the design and implementation of innovative and rigorous monitoring and evaluation activities and share learning across the organization and externally. Our global technical leadership projects, such as the USAID | DELIVER PROJECT, Strengthening Partnerships, Results and Innovations in Nutrition Globally (SPRING), and the Maternal and Child Survival Program (MCSP), all have their own monitoring, evaluation, and strategic information teams made up of staff from across the various partners; our bilateral programs have both headquarters-based and in-country M&E advisors.

What specific activities are you working on now? 

JSI is conducting data quality assessments for the Global Fund and health system strengthening evaluations for the Global Alliance for Vaccines and Immunization (GAVI), and, through the MEASURE Evaluation project, is focusing on HIS strengthening, GIS, and the Routine Health Information System Network (RHINO), among other things. As the Global Research Partner on the Innovations for Maternal, Newborn, and Child Health Initiative, we have developed evaluation strategies for four pilot projects in Kenya, Sierra Leone, and Ghana, and are introducing process documentation to test the pathways proposed in each project’s theory of change.

The process documentation provides real-time feedback to program managers and will complement the end-of-project impact evaluation to provide a full picture of the effectiveness of these innovative programming strategies or models. In addition, we are increasingly employing in-depth, case-based research methods, a comprehensive approach that is well suited to understanding complex programs where contextual, health system, program, and community factors come together. Within JSI, the Center brings together JSI research and M&E staff through a listserv, newsletter, M&E networking group, brown bags, and international meetings. These are investments JSI consciously makes to improve collaborative learning and adaptation of program strategies across settings.

What is the value added of a more comprehensive M&E approach? For the company? For project teams? For beneficiaries?

It is important to collect and use evidence in a way that best supports achieving your goal: capacity development, coverage or quality improvement, equity in access, community engagement, or scale-up strategies. The focus of M&E is shifting from an almost exclusive emphasis on accountability to a more balanced investment in using data at different stages for different but related purposes: strengthening the design and execution of health interventions, as well as testing and comparing the effectiveness of different strategies.

For JSI and its partners, we see the need to temper bean counting and to generate streamlined, focused data points for real-time learning. These actions can help program managers and health workers hone and scale implementation strategies. Combining rigorous evaluation with savvy and nimble use of data informs adaptation without losing the evidence that emerges from solid evaluation designs.