The quality of your Rubric directly affects the quality of the output the AI produces, so it's important to get it right.
Rubric Context
The context field is crucial for ensuring the AI provides accurate and relevant feedback. By clearly defining the context, you help the AI understand the specific goals of the assessment, improving both relevancy checks and the overall quality of results.
Example 1:
"This submission is an introductory written assignment focused on workplace health and safety. Learners are required to identify at least three common workplace hazards and explain basic steps to reduce risks. The response should use simple, clear language appropriate for someone new to the topic, with a focus on understanding rather than technical depth."
Example 2:
"This task evaluates the learner’s ability to develop a marketing strategy for a small business. The response should include a clear explanation of customer segmentation, a proposed budget allocation, and at least two promotional tactics suitable for a limited budget. Practical examples, such as a specific product or service, should be used to support the strategy, and the reasoning behind choices must be outlined."
Example 3:
"This is a critical analysis assignment exploring the ethical implications of artificial intelligence in education. Learners are expected to evaluate both the benefits and challenges of AI adoption, integrating relevant theoretical frameworks, such as utilitarianism or deontology. The response should reference at least three recent studies or articles to support arguments, with conclusions reflecting a balanced and well-reasoned perspective."
Rubric Level Descriptors
Rubric descriptors define the expectations and criteria for each level of performance within an assessment. These descriptors provide clear, detailed benchmarks that guide the AI in evaluating learner submissions, ensuring consistent and accurate marking.
When creating rubric descriptors:
- Be Specific: Clearly outline what is expected for each performance level. For example, instead of “Good understanding,” use “Demonstrates a clear understanding of key concepts with examples applied accurately to real-world scenarios.”
- Focus on Measurable Criteria: Include observable actions or outcomes, such as “provides at least three examples” or “explains concepts with minimal errors.”
- Ensure Progression: Differentiate between performance levels by gradually increasing complexity, depth of understanding, or quality of execution.
By crafting well-thought-out rubric descriptors, you enable the AI to deliver precise feedback that supports learner growth and aligns with the intended learning outcomes.
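For illustration, here is a minimal sketch of how descriptors for a single criterion might be drafted across the default Pass/Merit/Distinction performance levels. The criterion name and wording are hypothetical, borrowing from the workplace health and safety example above; they simply show the kind of specific, measurable, progressively demanding language described here.

```python
# Hypothetical descriptors for one criterion across the default performance levels.
# Note how each level adds measurable expectations rather than vague qualifiers.
hazard_identification_descriptors = {
    "Pass": (
        "Identifies at least three common workplace hazards and states one "
        "basic risk-reduction step for each, using clear, simple language."
    ),
    "Merit": (
        "Identifies at least three hazards, explains why each poses a risk, "
        "and describes practical reduction steps with minimal errors."
    ),
    "Distinction": (
        "Identifies three or more hazards, explains risks and reduction steps "
        "accurately, and applies them to a realistic workplace scenario."
    ),
}

# Quick review of the progression between levels.
for level, descriptor in hazard_identification_descriptors.items():
    print(f"{level}: {descriptor}")
```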
Rubric Data Summary
The table below describes the fields that make up a Rubric and defines the restrictions and rules for each one.
| Category | Field name | Description | Field Length | Allowed content | Other constraints |
| --- | --- | --- | --- | --- | --- |
| Rubric meta-data | Rubric Title | A name used to describe a particular rubric. This is not enforced to be unique. | 5 minimum & 50 maximum | Letters, numbers, and spaces | Mandatory to create a Rubric. |
| Rubric meta-data | Module Code | A sequence of alphanumeric characters allocated to identify Rubrics. | 1 minimum & 10 maximum | Letters, numbers, hyphens, and underscores | Mandatory to create a Rubric. |
| Rubric meta-data | Rubric Context | A field in the rubric builder where a user can provide further contextual information, such as the activity description. Clear and accurate information here helps the AI provide targeted feedback. | 50 minimum & 500 maximum | Letters, numbers, punctuation, and spaces | Mandatory to create a Rubric. |
| Rubric meta-data | Status | Indicates where a Rubric is in its lifecycle, which in turn determines the functionality available for it. | N/A | Draft, Published, and Archived | Prepopulated as Draft when a Rubric is created. |
| Rubric Table | Performance Levels | Descriptors used as column headers in each rubric. The defaults, Pass/Merit/Distinction, are pre-set in each new rubric. Each has a description per criterion. | 1 minimum | Letters, numbers, and spaces | A Rubric must have at least 2 performance levels and no more than 8. |
| Rubric Table | Criterion | A specific standard against which a learner’s submission is judged and ultimately marked. | 1 minimum & 500 maximum | Letters and spaces only | A Rubric must have at least 1 criterion and no more than 10. The first criterion is created as a placeholder. |
| Rubric Table | Descriptors | The descriptors tell the model what needs to be done for that criterion to achieve that level. | 1 minimum | Letters, numbers, and spaces | There must be a descriptor wherever a Criterion and a Performance Level intersect. |
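To make these constraints concrete, below is a minimal validation sketch in Python. It is not the product's actual data model or API; the class and function names are hypothetical, and the rules are taken directly from the table above (field lengths, allowed characters, and the limits on performance levels, criteria, and descriptors).

```python
import re
from dataclasses import dataclass, field

# Hypothetical representation of a Rubric, mirroring the fields in the table above.
@dataclass
class Rubric:
    title: str                      # 5-50 chars; letters, numbers, and spaces
    module_code: str                # 1-10 chars; letters, numbers, hyphens, underscores
    context: str                    # 50-500 chars
    status: str = "Draft"           # Draft, Published, or Archived
    performance_levels: list = field(default_factory=lambda: ["Pass", "Merit", "Distinction"])
    criteria: list = field(default_factory=list)     # criterion names
    descriptors: dict = field(default_factory=dict)  # (criterion, level) -> descriptor text

def validate(rubric: Rubric) -> list:
    """Return a list of constraint violations based on the table above."""
    errors = []
    if not (5 <= len(rubric.title) <= 50) or not re.fullmatch(r"[A-Za-z0-9 ]+", rubric.title):
        errors.append("Title must be 5-50 characters of letters, numbers, and spaces.")
    if not (1 <= len(rubric.module_code) <= 10) or not re.fullmatch(r"[A-Za-z0-9_-]+", rubric.module_code):
        errors.append("Module Code must be 1-10 characters of letters, numbers, hyphens, or underscores.")
    if not (50 <= len(rubric.context) <= 500):
        errors.append("Rubric Context must be 50-500 characters.")
    if rubric.status not in {"Draft", "Published", "Archived"}:
        errors.append("Status must be Draft, Published, or Archived.")
    if not (2 <= len(rubric.performance_levels) <= 8):
        errors.append("A Rubric needs between 2 and 8 performance levels.")
    if not (1 <= len(rubric.criteria) <= 10):
        errors.append("A Rubric needs between 1 and 10 criteria.")
    # Every Criterion / Performance Level intersection must have a descriptor.
    for criterion in rubric.criteria:
        for level in rubric.performance_levels:
            if not rubric.descriptors.get((criterion, level)):
                errors.append(f"Missing descriptor for '{criterion}' at level '{level}'.")
    return errors
```

Running `validate` on a draft rubric before publishing would surface any missing descriptors or out-of-range fields in one pass; the rubric builder enforces these rules for you, so the sketch is only meant to make the table easier to read.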