Community Contribution

Drowning in Data? A rubric may help answer your questions

Jan 09, 2024
Heather Britt

If you are designing or managing an evaluation or learning agenda, a rubric may help you get the answers you are looking for. A rubric is a guide for assessing or scoring performance by defining quality and value. Teachers use rubrics to grade papers and exams. Chances are that most of us have not thought of a rubric since the last time we turned in an essay assignment, but there are some good reasons to reconsider this handy assessment tool. 

Rubrics are particularly helpful for dealing with large amounts of data. The data may be qualitative, quantitative, or a mix, and it may come from one source or many. Dealing with data diversity is the rubric's special strength! 

Another reason that rubrics are handy is that they can provide clear, evidence-based answers to complicated questions. Often, evaluations and studies report multiple findings that decision makers must reflect on further to reach an actionable answer. In contrast, rubrics can produce a single answer to a question. And because rubrics describe how evidence has been used to make a judgement, the evaluation users can be confident in that answer.

How do they work? A rubric specifies one or more criteria that define the quality being judged. For each criterion, performance levels describe what performance looks like in practice. The performance levels are ordered from lesser to greater (or vice versa) according to the degree that they manifest the criterion, and they may be assigned either a quantitative label (1, 2, 3) or a qualitative label (such as poor, average, excellent). When using multiple criteria, a rubric can also specify how to combine scores to reach a single performance assessment. Rubrics come in many shapes and sizes—they are a very flexible tool.
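To make the structure concrete, here is a minimal sketch of a rubric in code. The criteria names, the three-level scale, and the simple-averaging rule are all illustrative assumptions, not taken from any particular evaluation; a real rubric might weight criteria differently or require a minimum level on each.

```python
# A hypothetical rubric: each criterion has an ordered list of
# performance levels, from lesser to greater.
RUBRIC = {
    "evidence_quality": ["poor", "average", "excellent"],
    "stakeholder_engagement": ["poor", "average", "excellent"],
}

def score(criterion, level):
    """Convert a qualitative level to its ordinal score (1 = lowest)."""
    return RUBRIC[criterion].index(level) + 1

def overall(assessments):
    """Combine per-criterion scores into a single judgement.

    Simple averaging is just one possible aggregation rule;
    the rubric itself should specify how scores are combined.
    """
    scores = [score(c, level) for c, level in assessments.items()]
    return sum(scores) / len(scores)

judgement = overall({
    "evidence_quality": "excellent",       # scores 3
    "stakeholder_engagement": "average",   # scores 2
})
print(judgement)  # 2.5 on a 1-3 scale
```

The point of the sketch is that once criteria, levels, and an aggregation rule are written down, the path from evidence to a single answer is explicit and repeatable.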

In our context, ideally, the people who will use the evaluation findings take part in creating the rubric. When stakeholders collaborate to develop a rubric, they build a shared understanding of the evaluation’s purpose, the quality and performance being assessed, and how the findings will be used. This goes a long way to ensure that decision makers use the findings. 

When in the process do you develop a rubric? When developed prior to data collection, rubrics facilitate agreement on the type and amount of data that users will find credible. This can reduce the collection of unnecessary data, and that is something we can all support! When developed at the analysis stage, rubrics can streamline analysis by specifying what to look for when examining the evidence.

After data collection, evidence is reviewed to reach a judgement of performance. A judgement is reached by answering: Which performance level best matches the evidence? If the rubric contains multiple criteria, each piece of data may first be sorted according to the criterion it supports. If the rubric was developed before data collection and informed data collection planning, this step is much easier. When an evaluation involves large amounts of data from multiple sources, the data must be compiled, synthesized, and analyzed before assessing which level best represents the evaluand's performance.

At the end of the evaluation or learning agenda inquiry, the report authors can confidently say, “In our judgement, this evaluand performed at this specific level on these specific criteria and here is the evidence to support our judgement.” 

The short paper, Rubrics for Evaluation and Learning, describes how to choose, develop and use the rubric that is right for your evaluation or learning agenda. 

About the author
Heather Britt, Independent Evaluator

Heather Britt is an independent evaluator specializing in uncertain, emergent, contested, and dynamic programming. She authored USAID's 2013 Complexity-Aware Monitoring Discussion Note and co-authored briefs on Outcome Harvesting and Causal Link Monitoring. As AEA SETIG co-chair, she led a collaborative process to define systems-informed evaluation principles. She chairs AEA's International Working Group.