Marking and analysis
Save time and effort with our tools
For your candidates, the hard work is over, but the process doesn’t stop there. Our marking tools make the next stage of the assessment cycle quick and efficient and reduce the chance of administrative errors. The data feeds directly into our reporting and analytics system, giving you fast access to the information you need to finalise your grades.
Marking and reporting
Advanced human-marking workflow
Support for multiple markers, moderation rules, anonymous marking, powerful annotations and more.
Marking by item or candidate
Assign your markers by item or by candidate to divide the workload.
Multiple assessors and moderation
Share the workload, or quality-assure the marking by adding moderation.
Annotate directly into the candidate submission, and share those annotations with your co-assessors.
Track your marking progress at a glance in one convenient overview.
Artificial Intelligence for automatic marking of long-form answers and essays
Results and Analytics
You can analyse how items, tests and learners are performing, and track trends across departments, subjects and locations. Cirrus helps you make sense of your data by providing a series of rich, intuitive reports that keep you fully in the picture and make your data work for you.
See the achievement levels of your candidates in different topic areas.
Psychometric data analysis
Automatically gather psychometric performance data on your items and assessments, including p-value, Rit, Cronbach’s alpha and more. You can use this data to quality-assure your content.
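The statistics named here are standard classical-test-theory quantities, so their meaning can be illustrated independently of Cirrus. The sketch below (an assumption-free illustration, not Cirrus’s implementation) computes the item p-value (mean proportion of the maximum score achieved), Rit (item–total correlation) and Cronbach’s alpha from a candidates-by-items score matrix:

```python
import numpy as np

def item_statistics(scores):
    """Classical test-theory statistics from a score matrix.

    Rows are candidates, columns are items; values are points awarded
    (0/1 for dichotomous items).
    """
    scores = np.asarray(scores, dtype=float)
    n_candidates, n_items = scores.shape

    # p-value: average proportion of the item's maximum score achieved.
    p_values = scores.mean(axis=0) / scores.max(axis=0)

    # Rit: Pearson correlation of each item score with the total score.
    totals = scores.sum(axis=1)
    rit = np.array([np.corrcoef(scores[:, i], totals)[0, 1]
                    for i in range(n_items)])

    # Cronbach's alpha: internal-consistency reliability estimate.
    item_vars = scores.var(axis=0, ddof=1)
    total_var = totals.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

    return p_values, rit, alpha
```

A low p-value flags a hard item, a low or negative Rit flags an item that may not measure the same construct as the rest of the test, and a low alpha suggests the assessment as a whole is not internally consistent.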
Sending results data to external systems is a breeze using our readily available APIs.
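As a rough sketch of what such an integration looks like, the snippet below builds a JSON POST request for one candidate’s result using only the Python standard library. The `/results` path, the bearer-token scheme and the payload fields are hypothetical placeholders for illustration, not Cirrus’s actual API; consult the vendor’s API reference for the real routes and schemas.

```python
import json
import urllib.request

def build_result_request(base_url, token, result):
    """Build a POST request carrying one candidate's result as JSON.

    The /results endpoint and the payload shape are hypothetical
    illustrations, not the actual Cirrus API.
    """
    return urllib.request.Request(
        f"{base_url}/results",  # hypothetical endpoint
        data=json.dumps(result).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Sending it is then a one-liner:
# with urllib.request.urlopen(build_result_request(url, token, result)) as r:
#     print(r.status)
```

Keeping request construction separate from sending makes the integration easy to test without a live endpoint.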
Rescoring and result review sessions
Candidate appeals? No worries. With rescoring, you can change questions, remove them, or award full marks to candidates even after results have been released.
Case study: Technical University of Eindhoven (TU/e)
To go where no man has gone before
TU/e’s goal was to choose an e-assessment platform that supports question types for the exact sciences, such as mathematics, physics and computer science, as well as more “regular” question types, including open questions. TU/e also needed to be able to gather learning analytics from the assessment platform.
In the true spirit of partnership, Cirrus has worked with TU/e to meet all of these needs.