Choosing Between Automatic and Manual Scoring for Rubrics
Overview
When creating Scoring Rubrics in Learn Amp, you can choose between automatic and manual scoring. The method you choose determines how point values are applied to the quality levels within your rubric.
This article outlines the key differences between the two options, with examples to help you decide which best suits your assessment style.
Functionality Breakdown
Automatic Scoring
With automatic scoring:
You define a minimum and maximum total score for the rubric.
The system evenly distributes the points across each quality level for each topic area.
The lowest score is assigned to the first column (e.g. Poor), and the highest to the last column (e.g. Excellent).
Example:
If you set the range from 2 to 20 points:
The first level will receive 2 points
The last level will receive 20 points
All levels in between are spaced evenly
This method ensures consistent scaling across all criteria and removes the need for manual input during setup.
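The even distribution described above can be sketched as a small helper. This is an illustrative approximation, not Learn Amp's actual implementation; the function name and any rounding behaviour are assumptions.

```python
def distribute_points(min_score, max_score, num_levels):
    """Evenly space point values from min_score (first column)
    to max_score (last column) across num_levels quality levels.
    Hypothetical helper -- Learn Amp's internal logic may differ,
    e.g. in how it rounds non-integer steps."""
    if num_levels < 2:
        return [float(max_score)]
    step = (max_score - min_score) / (num_levels - 1)
    return [min_score + i * step for i in range(num_levels)]

# A 3-15 range with five quality levels:
print(distribute_points(3, 15, 5))  # → [3.0, 6.0, 9.0, 12.0, 15.0]
```

Note that a range whose span is not divisible by the number of gaps (e.g. 2 to 20 over five levels) produces fractional in-between values such as 6.5 and 11.0.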
Manual Scoring
With manual scoring:
You assign specific point values to each cell in the rubric grid.
You are not required to set a minimum or maximum total score.
Each quality level per topic can have its own custom score.
Example:
You might set:
First column: 1 point
Last column: 10 points
Or any other variation depending on what performance you want to reward
Manual scoring gives you full flexibility but requires more granular setup.
How Scores Are Calculated
Understanding how scores are calculated helps you design rubrics that accurately reflect learner performance.
Total Score Calculation
The total score for a learner's submission is calculated as follows:
For each topic area (row), the reviewer selects a quality level (column)
Each selected quality level has a point value
The total score = sum of all selected quality level points
Formula: Total Score = Σ (points per selected quality level)
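The formula above amounts to a simple sum over the rows. A minimal sketch, using made-up topic names and point values rather than data from a real rubric:

```python
# For each topic area (row), the reviewer selects one quality
# level (column); each selection carries a point value.
selected_points = {
    "Creativity": 6,   # e.g. the reviewer chose a column worth 6
    "Content": 3,
    "Formatting": 1,
}  # illustrative values only

# Total score = sum of all selected quality level points
total_score = sum(selected_points.values())
print(total_score)  # → 10
```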
Points Per Quality Level
In automatic scoring, points per quality level are determined by:
Points = min score + (column index − 1) × increment, where increment = (max score − min score) ÷ (number of levels − 1). This spaces the point values evenly from the minimum (first column) to the maximum (last column).
In manual scoring, you explicitly set each column's point value.
Maximum Possible Score
The maximum score a learner can achieve is:
Max Score = highest column points × number of topic areas (for automatic scoring, where every row shares the same column values; with manual scoring, it is the sum of each topic area's highest column value)
Example: A rubric with 4 topic areas and a maximum of 5 points per column allows a maximum total score of 20 points.
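Both cases can be checked with a line of arithmetic. The manual-scoring row maxima below are hypothetical values chosen to match Example 2 later in this article:

```python
# Automatic scoring: every row shares the same column values,
# so the maximum is highest column points × number of topic areas.
max_score_auto = 5 * 4

# Manual scoring: each row can have its own maximum, so the
# overall maximum is the sum of per-row maxima (assumed rows).
row_maxima = [10, 5, 2]
max_score_manual = sum(row_maxima)

print(max_score_auto, max_score_manual)  # → 20 17
```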
Score Clamping
The system ensures scores stay within valid ranges:
Scores below the minimum are clamped to the minimum
Scores above the maximum are clamped to the maximum
This prevents data anomalies if rubric configuration changes after scoring begins.
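The clamping rule described above is the standard min/max pattern; a sketch of the behaviour (not Learn Amp's actual code):

```python
def clamp(score, min_score, max_score):
    """Keep a score within the rubric's valid range: values below
    the minimum rise to the minimum, values above the maximum
    drop to the maximum, and in-range values pass through."""
    return max(min_score, min(score, max_score))

print(clamp(25, 3, 15))  # → 15 (above max, clamped down)
print(clamp(1, 3, 15))   # → 3  (below min, clamped up)
print(clamp(9, 3, 15))   # → 9  (in range, unchanged)
```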
When to Use Each
| Scenario | Recommended Scoring Type |
|---|---|
| You want consistent, evenly distributed scoring | Automatic |
| You need full control over how each level is weighted | Manual |
| Simpler setup for multiple topic areas | Automatic |
| Complex assessments with variable scoring importance | Manual |
| Certain criteria are more important than others | Manual |
| Quick setup with predictable score ranges | Automatic |
Tip: Use manual scoring when certain criteria are more important than others and should carry more weight.
Practical Examples
Example 1: Automatic Scoring
Scenario: You're assessing presentation skills with 3 topic areas.
Setup:
Min score: 3
Max score: 15
Quality levels: Poor, Fair, Good, Excellent, Outstanding (5 levels)
Result:
Each column gets points: 3, 6, 9, 12, 15
For 3 topic areas, the possible total score range is 9–45 points
Points are evenly distributed
Example 2: Manual Scoring
Scenario: You're assessing a project where creativity matters more than formatting.
Setup:
Topic 1 (Creativity): columns worth 1, 3, 6, 10 points
Topic 2 (Content): columns worth 1, 2, 3, 5 points
Topic 3 (Formatting): columns worth 0, 1, 1, 2 points
Result:
Total possible score: 10 + 5 + 2 = 17 points
Creativity contributes up to 59% of the total score
Formatting contributes only up to 12%
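The weighting percentages above follow directly from the per-topic maxima; a quick check:

```python
# Per-topic maximum points from Example 2 (manual scoring).
topic_maxima = {"Creativity": 10, "Content": 5, "Formatting": 2}
total = sum(topic_maxima.values())  # 17 points possible overall

for topic, pts in topic_maxima.items():
    print(f"{topic}: up to {pts / total:.0%} of the total")
# Creativity ≈ 59%, Content ≈ 29%, Formatting ≈ 12%
```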
FAQs
Q: Can I switch between automatic and manual after setup?
Not directly. You'll need to recreate the rubric or adjust scoring manually depending on your new preference.
Q: Do both types support pass/fail calculation?
Yes, both methods allow you to define thresholds for pass/fail status on the linked exercise.
Q: Is there a recommended default?
If you're unsure, start with automatic for simplicity. You can always refine later with manual scoring.
Q: How do I know what scores learners are getting?
Use the View Report feature on your Scoring Rubric to see average scores and score distribution.
Q: Can I see the calculated points before publishing?
Yes, use the Preview feature when creating your rubric to see how points will be distributed.
Troubleshooting
| Issue | Solution |
|---|---|
| Scores not appearing correctly | Check that a scoring type is set and that point values are defined for all levels. |
| Uneven point distribution | This is expected with manual scoring; double-check the cell entries. |
| Can't adjust point boxes | You may be using automatic scoring; switch to manual for customisation. |
| Total score seems wrong | Check that all topic areas have been scored and verify the calculation formula above. |
| Pass/fail not triggering | Ensure the exercise has a pass score set and uses the "Pass Score" completion type. |
Related Articles
Creating a Scoring Rubric
Linking a Scoring Rubric to an Exercise
View Scoring Rubric Reports
Last Reviewed: 26/11/2025