How Climate Change Program Is Encouraging Adoption of Research Findings

Oct 22, 2014
COMMUNITY CONTRIBUTION

Evidence-based results provide a firm platform on which to build new programs and initiatives. But what if no one is listening?

The adoption of study results requires more than solid science; it also requires people in the areas concerned to embrace the findings and make them their own. It’s a two-part process.

Climate Change Hazard and Vulnerability Mapping in Lupanga Village, Malawi. ARCC researchers found that they could increase acceptance of study findings and enhance their relevance (salience) by engaging local people in the assessment process.

USAID’s African and Latin American Resilience to Climate Change (ARCC) program designed the Malawi Climate Change Vulnerability Assessment to gauge the impact of climate change on agriculture, fisheries, water, natural resources, and livelihoods in Malawi, and to analyze the extent to which government, communities, and households in that country were equipped to adapt to these changes. The program then used the assessment’s results to inform USAID's food security programming and climate change investment decisions.

The assessment was designed with three characteristics that contributed to the adoption of its findings among people affected by climate change and institutions with the mandate to act on it: credibility, salience, and legitimacy. These characteristics are described in the science-policy literature[1] as critical for translating science findings into policy and planning. In the context of the Malawi assessment, they are defined as follows:

  • Credibility refers to the perceived quality and adequacy of the evidence and findings presented in the assessment. To be fully credible, the evidence and findings must also be authoritative, believable, and trusted. 

    The Malawi assessment established credibility by compiling an evidence base for decision making using the best available data and information, by applying recognized analysis procedures, and by clearly communicating the limitations of the analysis. A recognized climate research organization[2] conducted the climate portion of the analysis, which lent authority to the results and further increased their credibility.

  • Salience is defined as the perceived relevance and timeliness of the information provided and integration of contextual factors.

    The ARCC program achieved salience by fully embedding the assessment results in the local context. During the course of the assessment, the team carried out in-depth, participatory rural appraisals in nine representative villages across eight districts, supplemented by approximately 50 key informant interviews. These interactions greatly enhanced the team’s understanding of the local context, broadening acceptance of the assessment results well beyond USAID. Once the assessment was completed, the team structured its findings to address the most critical needs directly and released the results in a timely manner aligned with investment cycles.

  • Legitimacy is the degree to which assessment results are recognized and accepted as an accurate reflection of reality. But perceptions of "reality" are colored by individual or group values, beliefs, and perspectives; they may also be colored by the perceived transparency of the assessment process.

    To establish legitimacy, the team engaged stakeholders at critical points throughout the assessment—during its design and implementation, and when the team began assessing recommendations for adaptation options. The process was inclusive—it provided a voice to many actors—and it was transparent. The team also shared the findings with farmers and farmer associations, who validated the historical climate trend analyses with their own real-world experiences of adapting to climate change impacts that were already occurring. The results of the assessment informed a participatory options analysis that engaged decision makers and encouraged them to explore approaches to strengthen adaptive capacity and manage risk across communities and institutions. 

While many programs conduct research and generate new knowledge, translating the results of those efforts into forms that decision makers actually use is often overlooked. For ARCC, generating and presenting assessment results deemed credible, salient, and legitimate by decision makers was essential to improving their understanding of the potential impacts of climate change and enabling them to act effectively to address those impacts. Making the entire process inclusive and keeping lines of communication open increase the chances that stakeholders and beneficiaries will embrace change.


[1] For example, Carly N. Cook, Michael B. Mascia, Mark W. Schwartz, Hugh P. Possingham and Richard A. Fuller (2013) Achieving Conservation Science that Bridges the Knowledge–Action Boundary. Conservation Biology, 27(4): 669–678.

[2] The Climate System Analysis Group at the University of Cape Town.

CLA in Action articles are intended to paint a more detailed picture of what collaborating, learning, and adapting (CLA) looks like in practice. Unlike other disciplines, CLA is not a technical "fix"; it looks different in different contexts. This series will showcase examples of intentional collaboration, systematic learning, and resourced adaptation, some of which you may find applicable to your own work. The case studies, blogs, and resources represented in this series document the real-world experiences of development practitioners experimenting with these approaches, in order to share what's possible.
