How To Deliver Randomised, Balanced Tests With Exam Blueprints & LOFT 3.0

The potential of LOFT 3.0 cannot be overstated. It can end our reliance on costly, inflexible test centres and let us deliver exams anytime, anywhere, without worrying about content theft or cheating. 

In our last blog, we went over what LOFT 3.0 is, and when it is combined with online proctoring, we believe we can hear the death knell for cheating. With all of the buzz about students using AI to cheat, we imagine this is music to your ears.

So what goes into creating the exam blueprints that will churn out randomised exams? The first step is to create an exam blueprint that defines how many questions of each topic, learning objective, taxonomy, difficulty, etc. must go into each exam. The second step is to select Linear On the Fly Testing (LOFT) as the exam delivery option: this means the system will randomly select questions that match those criteria. Let’s have a look at each step in more detail: 

How does LOFT 3.0 work?

1. Create an exam blueprint for Linear On the Fly Testing

What is an exam blueprint? 

An exam blueprint or test blueprint is a detailed outline of the content, format and structure of an exam. It defines which topics will be covered, the number of questions per topic, the type of questions used, and so on. In the context of e-assessment, the exam blueprint acts like a filter. For example, with the Cirrus blueprint function you can let the system automatically pull questions for an exam, depending on the criteria you select. 

A prerequisite for using exam blueprints is that each of your questions is tagged according to the criteria you wish to use. Which criteria you use is up to you, but the more comprehensively your questions are tagged, the more precise your exam blueprint will be. These are some of the criteria you can use to determine how questions are selected to create randomised exams:

Learning objectives 

Learning objectives describe what a test taker should be able to do after studying the content covered in an exam. They describe the specific knowledge and skills that the exam will assess and serve as a guide for test takers to understand what they need to know and be able to do to pass the exam.


Topics

Topics are the broad areas of content that will be covered in an exam; they are a way to classify items in your blueprint. 


Learning taxonomies

Learning taxonomies are frameworks used to categorise and describe the various ways in which people learn and acquire new knowledge and skills. The best-known taxonomies include Bloom’s Taxonomy and Bloom’s Revised Taxonomy, which outline six levels of cognitive skills, ranging from simple recall to complex problem-solving. The original hierarchy focuses on:

  • Knowledge – Remembering or recalling information.
  • Comprehension – The ability to obtain meaning from information.
  • Application – The ability to use information.
  • Analysis – The ability to break information into parts to understand it better.
  • Synthesis – The ability to put materials together to create something new.
  • Evaluation – The ability to check, judge, and critique materials.

And the revised:

  • Remember – Using memory to recall facts and definitions.
  • Understand – Constructing meaning from information.
  • Apply – Using procedures to carry out a task.
  • Analyse – Breaking materials into parts to determine structures and relationships.
  • Evaluate – Making judgements based on checking against given criteria.
  • Create – Putting materials together to form a unique product.

By using taxonomies when creating items, e-assessment platforms can ensure that the items are developmentally appropriate for the candidates. For example, items intended for easier exam questions or lower-level learners would focus on the lower levels of the taxonomy (Remember, Understand). In contrast, items intended for more difficult portions of the exam would focus on the higher levels (Analyse, Evaluate, Create). 

For instance: “choose 5 questions related to Remember, 4 questions related to Understand, 10 questions related to Analyse, 2 questions related to Evaluate”, and so on, until you end up with a fully formed exam with the desired degrees of difficulty. You could create an easier exam for lower-level candidates with only Remember and Understand items, a high-level exam with all Evaluate questions, or a mix of all levels.
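As a rough illustration, here is how blueprint-driven selection of this kind might look in code. The item bank, field names and counts below are invented for the example, not taken from Cirrus:

```python
import random

# Minimal sketch of blueprint-driven selection: every item in the bank is
# tagged with a taxonomy level, and the blueprint says how many items to
# draw per level. Data and field names are illustrative only.
item_bank = [
    {"id": i, "taxonomy": level}
    for i, level in enumerate(
        ["Remember"] * 10 + ["Understand"] * 8 + ["Analyse"] * 12 + ["Evaluate"] * 4
    )
]

blueprint = {"Remember": 5, "Understand": 4, "Analyse": 10, "Evaluate": 2}

def generate_exam(bank, blueprint, seed=None):
    """Randomly draw the requested number of items for each taxonomy level."""
    rng = random.Random(seed)
    exam = []
    for level, count in blueprint.items():
        pool = [item for item in bank if item["taxonomy"] == level]
        if len(pool) < count:
            raise ValueError(f"Not enough '{level}' items in the bank")
        exam.extend(rng.sample(pool, count))
    return exam

print(len(generate_exam(item_bank, blueprint, seed=42)))  # 21
```

Each run with a different seed produces a different exam, but every exam satisfies the same blueprint totals.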

Previous performance metrics (facility indexes)

LOFT 3.0 tracks metrics on previously asked questions: it keeps an index of how many candidates answered each question correctly or incorrectly in the past (called a facility index) and uses this to “score” questions. The idea behind facility indexes is that the easier a question is to answer, the higher its facility index will be. This means that questions with high facility indexes are likely to be easier for students, while questions with low facility indexes may be more challenging.

When creating exam questions, it can be useful to use facility indexes as a guide. For example, suppose you are creating a test for a beginner-level course and you expect candidates to have a basic knowledge of the material. In that case, you may want to use questions with high facility indexes to ensure that the randomised exam is aligned with those learning objectives . On the other hand, if you are creating a test for an advanced candidate group, you may want to use questions with low facility indexes to challenge the students and assess their understanding of the material.

The measure that expresses how easy or difficult a question is is called the p-value. The p-value indicates whether the question was hard (many candidates answered it incorrectly) or easy (many candidates answered it correctly). A very high p-value might indicate that the question was too easy, but it could also mean that all candidates understood what was expected of them or were taught the material well.

Calculation of the p-value (the same formula applies to all question types):

P = average score of all candidates / maximum score for that question

It’s important to note that the Facility Index is only one of the many factors for item selection. It may not always be true that a question with a high facility index is easy and a question with a low facility index is hard. Other factors such as cognitive level, relevance to the course, and discrimination power should also be considered.

Custom fields

Creating LOFT exams can be a great way to assess student learning and track progress over time, and one powerful tool that can help make this process more efficient is the use of custom fields. 

Custom fields allow you to add additional information to each test item, such as the topic or skill being assessed, the difficulty level, or the type of question. This information can then be used to automatically sort and organise the items, making it easy to create and administer exams on the fly and allowing for greater flexibility and customisation in how assessments are created.

For example, if an exam is primarily administered through computer-based assessment but some candidates need to take the exam in written form, a custom field can be used to indicate which questions are suitable for the paper-based exam: a question that requires viewing a video cannot be included in the printed exam. This is mainly useful for those administering both online and paper-based exams. Going even further, custom fields can be used to tag questions by difficulty, essentially creating facility indexes manually. 
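A custom field used this way is simply one more filter during item selection. For instance, with a hypothetical paper_compatible flag (an invented name, not an actual Cirrus field):

```python
# Sketch of a custom field acting as a selection filter: a hypothetical
# "paper_compatible" flag keeps video questions out of the printed exam.
items = [
    {"id": 1, "type": "multiple_choice", "paper_compatible": True},
    {"id": 2, "type": "video", "paper_compatible": False},
    {"id": 3, "type": "essay", "paper_compatible": True},
]

paper_pool = [item for item in items if item["paper_compatible"]]
print([item["id"] for item in paper_pool])  # [1, 3]
```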

Overall, custom fields are a powerful tool for creating and administering linear-on-the-fly testing exams. They allow you to add additional information to each test item, making it easy to sort and organise the items, and ultimately make the test creation process more efficient and effective.

Friend and enemy questions

“Friend” and “enemy” questions are terms used in the context of exam design and item analysis. Enemy questions are questions that should never be included in an exam together. For example, questions that have the answer for other questions in them:

  1. “Fleur buys an article for $1000 and pays 21% tax. She sells the same article for $1300 and charges 21% tax. Find the Value Added Tax paid by Fleur.”
  2. What does VAT stand for?

These would be tagged as enemy questions, as question 1 automatically answers question 2, therefore one or the other can be asked, but not both.

Friend questions are ones that should always be asked together. For instance, multi-part questions such as filling out a balance sheet on an exam, where one balance sheet is shown and there are four questions pertaining to it. All of the questions must be grouped and asked together, as on their own they wouldn’t make any sense.
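In selection logic, these constraints might be enforced along the following lines. This is a simplified sketch with an invented data model: friend groups are drawn as one indivisible unit, and an item is skipped if any of its enemies is already on the exam.

```python
import random

# Simplified sketch of friend/enemy constraint handling during random
# selection. The data model is illustrative, not Cirrus's implementation.
item_bank = {
    "Q1": {"enemies": {"Q2"}, "friends": set()},   # contains the answer to Q2
    "Q2": {"enemies": {"Q1"}, "friends": set()},
    "Q3": {"enemies": set(), "friends": {"Q4"}},   # balance sheet, part 1
    "Q4": {"enemies": set(), "friends": {"Q3"}},   # balance sheet, part 2
    "Q5": {"enemies": set(), "friends": set()},
}

def select_items(bank, count, seed=None):
    rng = random.Random(seed)
    chosen, exam = set(), []
    candidates = list(bank)
    rng.shuffle(candidates)
    for qid in candidates:
        if len(exam) >= count:
            break
        group = {qid} | bank[qid]["friends"]      # friends travel together
        if group & chosen:                        # already picked via a friend
            continue
        enemies = set().union(*(bank[q]["enemies"] for q in group))
        if enemies & chosen:                      # an enemy is already on the exam
            continue
        chosen |= group
        # Note: a friend group may overshoot `count`; a real implementation
        # would plan for group sizes when filling the blueprint.
        exam.extend(sorted(group))
    return exam

exam = select_items(item_bank, count=3, seed=0)
# Q1 and Q2 never appear together; Q3 and Q4 only ever appear as a pair.
```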

Friend and enemy questions are just one more layer of organisation when it comes to item classification and one more step towards perfecting your exam blueprint.

2. Select LOFT exam delivery

Once the exam blueprint is finished, it is a simple matter of selecting Linear On the Fly Test delivery in the exam delivery settings (for those e-assessment platforms that support it, like Cirrus). The result: Randomly generated exams each time, based on the blueprint criteria you selected in the previous step. 

How do I get started?

Decide if LOFT is for you

It is important to identify the goals and objectives of your exam process with all relevant stakeholders, and to understand the options available to you. Is content theft a major issue for you? Do you need to give test-takers the option to sit exams anytime, anywhere? Do you have the resources to develop a large item bank? If you answered yes to these questions, LOFT will be a great option for you. If not, you may be better off sticking to fixed-form testing for now. Developing your item bank for LOFT requires a significant upfront investment, though it will pay for itself in the long run with increased test security, flexibility and accessibility. 

Develop your item bank

The larger your item bank, the greater the variation between generated exams and the lower the possibility of content theft and cheating. Future developments with AI could spell the end of manually creating item banks in e-assessment platforms, but for now, the first and most important step in starting the blueprint is having a large item repository. And, equally important, you must ensure that each item is correctly tagged with the criteria you wish to use for the exam blueprint (learning objective, taxonomy, etc.). 

Develop your exam blueprints

1. Have a clear-cut goal for the learning objectives

Creating a LOFT blueprint starts with defining the learning objective for each exam. For example: second-year finance students must pass an exam testing ‘Intermediate Finance Knowledge’ in order to move on.

2. Determine your topics

Candidates sitting an intermediate finance knowledge exam will be expected to know several topics. For example, within the ‘Intermediate Finance Knowledge’ blueprint, topics such as ‘Financial Reporting’, ‘Professional Ethics’, ‘Economics’, ‘Portfolio Management’, and ‘Financial Analysis’ will need to be included to make it a well-rounded exam.

3. Utilise taxonomies

The six taxonomy levels will then need to be applied to ensure that various cognitive abilities are being tested. For example, for the ‘Economics’ topic of the ‘Intermediate Finance Knowledge’ exam, the blueprint might require 5 ‘Remember’ questions, 3 ‘Understand’ questions, and 2 ‘Analyse’ questions. Repeat this for every topic until the exam blueprint is complete. You can also use other dropdowns to determine what kind of questions should be included, such as multiple-choice questions, essays, matching, etc., as well as using custom fields to tag items however you see fit.
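Putting the steps together, a two-dimensional blueprint (topic by taxonomy level) could be sketched as follows. The item bank here is randomly generated placeholder data; a real bank would hold actual tagged questions:

```python
import random

# Sketch of a two-dimensional blueprint (topic x taxonomy level), as in
# the 'Economics' example above. All data is invented for illustration.
rng = random.Random(0)
topics = ["Financial Reporting", "Professional Ethics", "Economics"]
levels = ["Remember", "Understand", "Analyse"]
item_bank = [
    {"id": i, "topic": rng.choice(topics), "taxonomy": rng.choice(levels)}
    for i in range(300)
]

blueprint = {
    "Economics": {"Remember": 5, "Understand": 3, "Analyse": 2},
    "Financial Reporting": {"Remember": 4, "Understand": 4, "Analyse": 2},
}

def build_exam(bank, blueprint, seed=None):
    """Draw the requested number of items per (topic, taxonomy) cell."""
    exam_rng = random.Random(seed)
    exam = []
    for topic, level_counts in blueprint.items():
        for level, count in level_counts.items():
            pool = [q for q in bank
                    if q["topic"] == topic and q["taxonomy"] == level]
            exam.extend(exam_rng.sample(pool, count))
    return exam

print(len(build_exam(item_bank, blueprint, seed=7)))  # 20
```

Every generated exam is different, but each one contains exactly the per-topic, per-level mix the blueprint demands.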

Linear on-the-fly testing has the potential to revolutionise the way exams are conducted. As technology continues to advance, it’s likely that we will see even more innovation in the field of linear on-the-fly testing, including the use of artificial intelligence and machine learning to further optimise the test-taking experience. Organisations that embrace this technology will be well-positioned to take advantage of its benefits and stay ahead of the curve in the field of assessment. 

Stay tuned for our webinar on the topic to learn the ins and outs of LOFT and how it can benefit your organisation. Follow us on LinkedIn or sign up to our newsletter below. 

Cristina Gilbert
Copywriter and digital content enthusiast, Cristina is motivated by the fast-paced world of e-assessment and the opportunities online exams give students to thrive.