Community Contribution

Easy As 1-2-3: Three Collaboration, Learning, and Adaptation Tools for Your Next Conference

Jun 21, 2019
Rachel Yavinsky, Reshma Naik, and Carolyn Rodehau

How many conferences have you been to in your career so far? No matter the number, you know conferences can feel overwhelming—with hundreds or even thousands of attendees and a plethora of sessions, posters, side events, and meetings. Figuring out where to start, who to connect with, or what to do with the information you gather can feel like impossible tasks added to an ever-growing to-do list.

To overcome some of these typical conference challenges, the USAID-funded Breakthrough RESEARCH project developed and adapted three knowledge management (KM) tools that you may also find useful:

  1. conference capture form
  2. target audience survey
  3. after-action review

Our team piloted these tools at the 2018 International Social and Behavior Change Communication (SBCC) Summit in Nusa Dua, Indonesia. 


Let’s look at each tool, how it worked, and what we learned from the pilot.  



1. Conference Capture Form


Graphic that lists key findings from 60+ conference capture form responses


Why did we create this? To maximize information capture across the entire project team and help inform the project’s strategy and future activities.


What is it? We used a Google form, accessible on laptops, tablets, and smartphones via an internet link. The form was designed to collect information from official sessions and informal meetings—or even individual “a-ha” moments. Staff and partners entered real-time information as they attended sessions or engaged in personal reflection. Because people typically already take notes in a conference setting, the burden felt minimal.


The conference capture form included eight optional sections: (1) Session name, (2) Key insights to remember, (3) Evidence gaps, (4) Good ideas for future research questions or studies, (5) The “buzz”: frequently mentioned issues and concepts, (6) Resources to check out, (7) People connections, and (8) Photo uploads.


What did we learn from the 60+ entries we received from ten staff members? Here are a few examples:

  • Key insights: “Evidence gaps seem to have outpaced evidence generation in SBC,” implying there is a clear need to map out and fill those gaps.
  • Ideas for future direction: The project can play an important role in helping to:

    ◦ Establish clear and consistent terminology.
    ◦ Define and develop a set of indicators to measure outcomes across programs.
    ◦ Create guidance to improve routine data collection, reporting, and documentation.
    ◦ Develop innovative research methods that can serve as a gold standard.

  • The “buzz”: Trending topics such as human-centered design, prototyping, adaptation, scale-up, sustainability, and complexity helped reinforce the project’s focus areas.
  • Useful resources: Several resources relevant to the project’s technical work were identified. Some examples include:

    ◦ A Gender Scales Compendium developed under C-Change.
    ◦ SBCC Implementation Kits from the Health Communication Capacity Collaborative.
    ◦ Provider Behavior Change Toolkit developed by PSI.


More details on the conference capture form’s implementation and results can be found here.


2. Target Audience Survey


Graphic that lists key findings from the 117 target audience survey responses


Why did we create this? To understand the communication preferences of key target audiences, in order to strengthen the project’s knowledge-sharing approaches and contribute more broadly to the field of knowledge translation and research utilization.


What is it? With input from our sister project, Breakthrough ACTION, we developed a short, digital survey in Google forms to implement at the Summit. Breakthrough RESEARCH staff and consortium partners used their mobile phones, iPads, or laptops to collect survey responses from Summit attendees. This data collection used convenience sampling and primarily occurred in social spaces where groups of conference participants congregated, such as near organization booths, at lunch tables, and in seating areas in the entryway. The survey was also included in the SBCC Summit’s Daily Digest on the last day. 


In our three-minute survey we asked six questions: (1) What type of organization do you work with? (2) What is your region of residence? (3) How do you typically first get to know about new research, best practices, or programmatic tools? (4) When learning about new research or programmatic resources, what format would best help you understand the key information and its implications for your work? (5) Which communities of practice or resource repositories do you regularly visit or participate in for SBC information or discussion? and (6) Would you be interested in being part of a small “user test group” to provide input on the design and format of Breakthrough ACTION + RESEARCH products?


What did we learn from the 117 responses?

  • There was a notable preference for shorter, more visually appealing formats, for example, one- to two-page fact sheets or data visualizations and graphs rather than long reports.
  • The top three ways that people first learn about new research or resources all involve some level of interactivity and provide an opportunity for discussion, for example, conversations with colleagues, conference presentations, and working groups or communities of practice.
  • People are getting information from virtual sources that we did not even include. About one-third of respondents said that the communities of practice they most frequented were “other” or not on our list. Write-ins included internal intranets and professional social networking sites such as LinkedIn. 
  • Continuing to implement this survey throughout the life of the project will increase the sample size and diversity of participants, allowing us to conduct meaningful analyses and develop target audience profiles that can be used more broadly across health and development projects.


More details on the target audience survey’s implementation and results can be found here.



3. After-Action Review


Graphic that lists key findings from the after-action review report


Why did we use this KM approach? To take stock of what worked well and what could be improved to inform Breakthrough RESEARCH’s future participation in global and regional convenings.


What is it? Following the conference, we asked Breakthrough RESEARCH staff and consortium partners who attended to answer four questions via email:

  1. What would you say worked especially well (that we should be sure to do again)?
  2. What didn’t work (and really shouldn’t be done again)?
  3. What improvements could be made to things that didn’t work well?
  4. What completely new ideas or suggestions do you have for the next time we participate in a big conference like this?


What did we learn from the ten project team members who attended the conference?

  • Respondents valued the “one-stop shop” conference information packet we had prepared that included: General Summit information; schedules for the booth, project events, and team presentations; contact information for staff attending; and social media engagement guidance.
  • In the future, it would be helpful to create a “quick reference guide” for those working at the booth, which could include an elevator pitch, talking points, FAQ document, and a cheat sheet of activities and countries where we work.
  • Given the unique structure of having a sister project (Breakthrough ACTION), jointly staffing the booth provided a good opportunity for information sharing. Going forward, implementing all three tools with our ACTION colleagues will allow us to reach a wider range of audiences and expand our learnings.


Your Next Steps

Use Breakthrough RESEARCH’s collaboration, learning, and adaptation (CLA) tools to make gathering and using information from your next conference as easy as one-two-three: (1) conference capture form, (2) target audience survey, (3) after-action review.




At Breakthrough RESEARCH, we conduct social and behavior change (SBC) research and evaluation and promote evidence-based solutions to improve health and development programs around the world.