Designing for effective online assessment

This guide will help you design effective online assessments in Blackboard by offering practical guidance on writing objective test questions and feedback.

In addition to the information provided below, you should also refer to the Good Practice Guide for Blackboard Question Types, which outlines the effective and fair use of the different question types available in Blackboard.

Question banks

Question banks allow you to create assessments in which the questions are presented in a different order for each student, or in which the test questions are drawn at random from one or more banks so that no two students take an identical test. Large question banks also enable students to test themselves repeatedly on the same topic, with variety in the questions delivered (Bull et al., 2001).

However, randomly generated tests are more difficult to construct than a fixed test because they need a larger bank of questions, but they have the advantage of being secure, flexible and scalable (Thelwall, 1999). Haladyna (2004) recommends that each question bank should contain 250% of the questions needed for any one test.

Questions should be grouped into banks according to difficulty or the type of skill being tested (e.g. recall, comprehension, analysis or application). Assessments can then be created that draw a set number of questions from each bank, ensuring that specific skills are examined (Bull et al., 2001). When creating a bank of questions, be aware that all questions in that bank must carry equal credit.
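
Blackboard's question sets and random blocks handle this selection for you; as a rough sketch of the underlying idea only, the short Python example below (the bank names and numbers are invented for illustration) draws a set number of questions from each bank and shuffles the result, so each student receives a different but comparable test:

    import random

    # Illustrative banks: each groups questions that test the same skill
    # and carry equal credit, as recommended above.
    banks = {
        "recall": [f"recall_q{i}" for i in range(1, 26)],
        "application": [f"application_q{i}" for i in range(1, 26)],
    }

    # How many questions to draw from each bank for one student's test.
    draw = {"recall": 5, "application": 5}

    def build_test(banks, draw, rng=random):
        """Sample the requested number of questions from each bank,
        then shuffle so the order also differs per student."""
        test = []
        for bank_name, n in draw.items():
            test.extend(rng.sample(banks[bank_name], n))
        rng.shuffle(test)
        return test

    print(build_test(banks, draw))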

Effective question design

The terms used in the following guidelines refer to the four discrete elements of a multiple-choice question (MCQ):

  • Stem – the text of the question
  • Options – the choices provided after the stem
  • Key – the correct answer in the list of options
  • Distracters – the incorrect answers in the list of options

Follow these guidelines (after Pritchett, 1999) to ensure that your questions are valid and do not contain unintended clues to the correct answer:

1. Construct each question to test an important learning outcome. Avoid testing for trivial details. 

2. The stem of the item should ask only one question. The task set out in the question should be so clear that it could be answered without looking at the answer options. 

3. Use simple and clear expression. 

4. Avoid repeating the wording of the question in the answer options as this increases reading time.

5. Where the single-correct-answer form of multiple-choice question is used, make sure there is only one unquestionably correct answer. Check that none of the distracters could be claimed as a possible correct answer.

6. Fill-in-the-Blank answers are evaluated by matching letter and spelling patterns. When using such questions, limit answers to one or two words to avoid mismatches caused by extra spaces or by answer terms given out of order, and list accepted answers that allow for common spelling errors, abbreviations and partial answers (a sketch of this kind of lenient matching follows these guidelines). Advise students if answers to Fill-in-the-Blank questions must be spelled correctly or are case sensitive.

7. When using the Calculated Numeric or Calculated Formula question types for calculation or estimation of a numerical response, specify a range of acceptable responses to allow for rounding errors in calculations and for differences in decimal places not specified in the question. Advise students if answers to these questions must be presented to a set number of decimal places or must include units of measurement.

8. Phrase the stem of the question positively and avoid negatives. A positively phrased question tends to measure more important learning outcomes than a negatively phrased one. Where negative expression is unavoidable, highlight it with capitals or bold text.

9. Make sure that all answer options are grammatically consistent with the stem of the question. Avoid the use of ‘a’ or ‘an’ at the end of the stem so that students are not prompted towards (or away from) options beginning with a vowel.

10. All answer options should be homogeneous in content. Inconsistencies can give a clue as to which is the correct answer.

11. Avoid other unintended clues in the wording of the question and answer options, such as: 

  • Avoid stating the correct answer in stereotyped (textbook) phraseology, as the student may select it simply because it looks familiar.
  • Avoid stating the correct answer in more detail than the other options, as the extra detail may provide a clue.
  • Avoid the use of modifiers (e.g. may, usually, sometimes) in the key, as such qualified statements are often true and so invite selection.
  • Avoid the use of absolutes (e.g. always, never, all) in the distracters, as such terms are often associated with false statements and enable students to reject them.
  • Avoid the use of the ‘all of the above’ option. It often enables the student to answer correctly from partial information: if the student detects that two of the answers are correct, they can deduce that the answer is ‘all of the above’; if they detect one incorrect option, they can reject it. Also, because answer choices to Multiple Choice questions in Blackboard are presented to students in random order, it may not appear as the last choice as expected, potentially confusing students.
  • Similarly, avoid the use of the ‘none of the above’ option. The student may recognise a wrong answer without any guarantee that they know what is correct, and again it will appear in a random position in the list of answer choices.
  • Avoid including two all-inclusive statements, as this allows the student to reject the other answer options since one of the two must be correct.
  • Avoid including two answer options with the same meaning, as the student can reject both since they cannot both be correct.

12. Make all the answer choices approximately the same length. If one response is longer than the others, this may give the clue that it is the correct answer. 

13. Make sure that each item is independent of others in the test. Items should not provide clues to the correct answer of other items, nor should correct answers in one item depend on correctly answering a previous one.

14. The layout of the item should be clear. If the question is an incomplete statement, the answer choices should begin with lowercase letters and end with a full stop.
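
Blackboard applies its own answer-matching rules internally; as a minimal illustration of the checks described in guidelines 6 and 7 (the function names and values below are invented for the example), this Python sketch shows lenient matching of a short text answer and acceptance of a numeric answer within a stated tolerance:

    def normalise(text):
        """Collapse extra whitespace and ignore case before comparing."""
        return " ".join(text.split()).lower()

    def check_text_answer(submitted, accepted):
        """Guideline 6: match short text answers leniently against an
        explicit list of accepted variants (abbreviations, common
        misspellings, partial answers)."""
        return normalise(submitted) in {normalise(a) for a in accepted}

    def check_numeric_answer(submitted, correct, tolerance):
        """Guideline 7: accept a numeric answer within a range, allowing
        for rounding and unspecified decimal places."""
        return abs(submitted - correct) <= tolerance

    # Stray spaces and capitals still match a listed variant.
    assert check_text_answer("  Photosynthesis ", ["photosynthesis"])
    # A response rounded to two decimal places falls within the range.
    assert check_numeric_answer(3.14, 3.14159, 0.01)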

The element of chance

With multiple-choice questions there is always the possibility of arriving at the correct answer by chance: a one-in-two chance with a two-choice (true/false) question, and a one-in-four chance with a four-choice question. Although adding options reduces the odds of a correct guess, it is not generally advisable to have more than six answer choices (Pritchett, 1999), for two reasons (a worked example of the effect of guessing follows this list):

  • When faced with the difficult task of finding many distracter items, there is always the temptation to fall back on answer items that are poorly phrased, clearly incorrect or that even provide clues to the correct answer.
  • Increasing the set of answer choices may confuse the student, as it may be difficult to differentiate the possible answers, and it increases the required reading and decision time across the assessment.
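
To put numbers on the element of chance, the short Python sketch below (a worked example added for illustration, not part of the original guidance) uses the binomial distribution to compute the probability of reaching a pass mark by guessing alone:

    from math import comb

    def p_pass_by_guessing(n_questions, n_options, pass_mark):
        """Probability of getting at least pass_mark questions right by
        pure guessing, with n_options choices per question."""
        p = 1 / n_options
        return sum(
            comb(n_questions, k) * p**k * (1 - p) ** (n_questions - k)
            for k in range(pass_mark, n_questions + 1)
        )

    # Ten true/false questions: guessing reaches 5/10 most of the time.
    print(f"{p_pass_by_guessing(10, 2, 5):.3f}")  # ~0.623
    # With four options per question, the same pass mark is far rarer.
    print(f"{p_pass_by_guessing(10, 4, 5):.3f}")  # ~0.078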


Feedback

At the end of the test, feedback can be delivered for each question. Feedback can simply tell the student whether they are right or wrong, but it is much more effective to explain why a response is incorrect and to provide the correct answer (Charman, 1999):

  • Students like to get immediate feedback as it keeps the activity and result closely connected. 
  • It encourages students and keeps them focussed on their work, giving them a clear indication of their progress and next targets.

The level of feedback provided to the student should be consistent with, and justified by, pedagogical decisions and the aims of the assessment. When used for low-stakes, formative self-test assessments, feedback should be designed to promote learning and learner development. The level of feedback chosen may be one or more of:

  • A simple score.
  • The score for each submitted answer.
  • Revealing the correct answers. 
  • Feedback on correct/incorrect responses with hints for further study and reference to learning material or information resources.

Recommendations on writing feedback (Charman, 1999):

  • Make it clear whether the response is correct or incorrect. 
  • Be constructive. 
  • Explain why an answer is not correct. Give the correct answer and explain how to derive it. 
  • Give feedback on correct responses too. Some answers may have been arrived at by the wrong reasoning or by chance, and feedback can confirm the reasoning behind the correct answer.
  • Keep it simple. 
  • Give pointers to further learning opportunities and information.


Don't forget: try it out

Before operational use, you should verify that all assessments operate as intended. As well as checking that the correct options have been selected or entered, check the spelling and that text, graphics and multimedia elements display correctly. It is useful to enrol a colleague as a student on your Blackboard site to do this for you as part of the peer review of your site.


References

Bull, J. et al. (2001) Blueprint for Computer-Assisted Assessment. CAA Centre.

Charman, D. (1999) Issues and impacts of using computer-based assessments (CBAs) for formative assessment. In: Brown, S., Bull, J. and Race, P. (eds) Computer-Assisted Assessment in Higher Education. Kogan Page, 85-93.

Haladyna, T. M. (2004) Developing and Validating Multiple-Choice Test Items. Lawrence Erlbaum Associates.

Pritchett, N. (1999) Effective question design. In: Brown, S., Bull, J. and Race, P. (eds) Computer-Assisted Assessment in Higher Education. Kogan Page, 29-37.

Thelwall, M. (1999) Open-access randomly generated tests: assessment to drive learning. In: Brown, S., Bull, J. and Race, P. (eds) Computer-Assisted Assessment in Higher Education. Kogan Page, 63-70.


Get Support: 

The Digital Learning Team can support you with using digital tools for teaching and learning.