Case Studies in Learning: TRAMS, USAID/E3 Bureau

Author(s):
Jindra Cekan, Jim Bellis
Date Published:
December 17, 2012
Contribution:
Community Contribution

Background

The CKM (Communications & Knowledge Management) Team of the E3 Bureau designed and built the E3 Travel And Mission Support (ETRAMS) Internet application for the EGAT Bureau some seven years ago. Over the past four years, ETRAMS has been adopted by four other Washington bureaus: GH, DCHA/DRG, PPL, and BFS. Most recently, the Asia/ME Bureau has requested one of its own as well.

Because each bureau required modifications to the application to better fit its own way of doing business, each application is slightly different from the next. As a result, we have prefixed the name of the respective bureau to the core name of the application: hence GTRAMS (GH), DTRAMS (DCHA/DRG), PTRAMS (PPL), and FSTRAMS (BFS). In general, the application is referred to as a “TRAMS”. The E3/CKM Team estimates that approximately 90 to 95 percent of all trips by E3 staff are logged through ETRAMS. These figures are probably now being matched by the GH and DCHA/DRG bureaus. The PPL and BFS bureaus are just coming online with their TRAMS, so it will be a while before they reach the same level of compliance.

There is much potential learning to be gleaned from the TRAMS, given that they are such a rich resource of travel and Mission support data.

Overview of the tool

Even though there are variations between the TRAMS, there are more similarities than differences. 

All of the TRAMS collect and track the following kinds of data:

  1. Identification of the traveler, office, and bureau involved in the Mission support
  2. The destination of the Mission support, as well as the duration of the support
  3. Whether the travel is a Mission request or is initiated by the bureau, e.g., for a conference or training
  4. The Agency initiatives that might be addressed by the travel
  5. The main purpose of the travel, ranging from activity design to monitoring and evaluation to presenting at a conference
  6. Technical areas involved (this is where the TRAMS differ the most, because of the different technical disciplines supported by the various bureaus)
  7. On-site support vs. virtual support (the latter is support offered by Washington from Washington, with no travel involved)
  8. Statement of Work for the activity involved
  9. Upon completion of the travel, a trip report indicating accomplishments as well as points of contact
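As an illustration only, the common fields above could be represented as a single trip record. The actual TRAMS schema is not published, so every field name below is a hypothetical sketch rather than the real data model:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class TripRecord:
    """Hypothetical sketch of one TRAMS travel/support entry (all field names assumed)."""
    traveler: str
    office: str
    bureau: str
    destination_country: str            # destination of the Mission support
    start_date: date
    end_date: date
    mission_requested: bool             # Mission request vs. bureau-initiated travel
    agency_initiatives: list[str] = field(default_factory=list)
    main_purpose: str = ""              # e.g., activity design, M&E, conference
    technical_areas: list[str] = field(default_factory=list)
    virtual_support: bool = False       # support from Washington, no travel involved
    statement_of_work: str = ""
    trip_report: Optional[str] = None   # filed upon completion of the travel

    @property
    def tdy_days(self) -> int:
        """Length of the support, counted inclusively."""
        return (self.end_date - self.start_date).days + 1
```

A record like this makes the roll-up reports discussed below straightforward, since TDY length, destination, purpose, and technical area are all explicit fields.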

Learning Opportunity – Analysis

While rich data on travel and Mission support have been collected through the TRAMS applications, there has been a dearth of data analysis. This has primarily been a matter of the time and money needed to support such analysis, not a lack of interest. Each of the bureaus that use a TRAMS extracts quarterly and annual reports on travel: a roll-up of which countries have been supported and to what extent, in terms of TDY days. The data are also used to see which staffers have been traveling, how often, and where, and the primary purposes of travel are evaluated by office. What is missing, however, is a deep, broad, and systematic analysis of the data: data mining in the truest sense.
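The quarterly roll-up described above, total TDY days per supported country, could in principle be produced with a simple aggregation. The sketch below uses only invented sample data; the record layout is an assumption, not the TRAMS export format:

```python
from collections import defaultdict

# Hypothetical extract: (destination country, traveler, TDY days) per trip
trips = [
    ("Kenya", "A. Analyst", 10),
    ("Kenya", "B. Advisor", 5),
    ("Nepal", "A. Analyst", 7),
]

def tdy_days_by_country(trips):
    """Roll up total TDY days per destination country, as in the quarterly reports."""
    totals = defaultdict(int)
    for country, _traveler, days in trips:
        totals[country] += days
    return dict(totals)

print(tdy_days_by_country(trips))  # {'Kenya': 15, 'Nepal': 7}
```

The deeper, systematic analysis the text calls for would go well beyond such roll-ups, but the same extracted records would be its starting point.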

Potentially:

  • Further drilling down into the data could be beneficial. Analysis could examine the content of TDYs to the visited countries, whether they constituted (or could become) a learning plan for technical assistance to a Mission across a set of topics, and whether there could be sectoral synergies if the TRAMS informed each other about complementary trainings that different sectors were conducting in similar countries.
  • CLA (Collaborating, Learning, Adapting, currently done alongside the Missions’ CDCS processes) could be carried out across bureaus in Washington and with Missions, focusing on how they are adaptively learning from their TDYs to these countries.
  • While USAID/W is vast, TRAMS currently offers no clear process for tracking recurrent TDYs on particular issues, which would reveal, for example, that guidance is needed on topic x in countries y and z. A first step would be sitting down with volunteer bureaus/offices to test a ‘deep dive’ of the data for what can be learned to strengthen technical programming quality and learning.
  • Gaps could be identified as well, e.g., by analyzing trip reports to create a suggested, standardized set of lessons to be gleaned from past and future travel, for the benefit of USAID/W and Missions.
  • The TRAMS’ purpose as a logistics-planning tool would be maintained while the application is also strengthened into a learning tool.

What is needed:

  • Organization of a data analysis steering committee to oversee and lead the project, followed by three to four two-hour data analysis planning sessions to structure the data mining.
  • One analyst for five to ten days a month for six months, responsible for presenting the analysis monthly to the team for iterative learning. Sharing the data and opening it to discussion will enable the tool to be tweaked and the data to be used by senior management, team leads, and Missions (as appropriate) in their learning and planning.
  • LER would organize one to two meetings to share interim and final learning with SILK and other learning supporters, including Mission staff.
  • Cooperation with the LER Office and its technical support in publishing a summary report.
