Assessments can be an extremely valuable learning tool, but could there be any problems with how they are implemented?
Select all the correct options
- Many assessments only exist as a box-ticking exercise
- Sometimes assessments are written as an afterthought
- Sometimes assessments are poorly written and it’s obvious that the long options are correct, like this one for example
- Distracters are often poor, making the assessment too easy
- There are no problems with how assessments are applied (see option 4)
Too often assessments are added to courses for, essentially, compliance purposes only – to ‘prove’ that employees have completed the course and passed a test and that the organization has met its obligations. Or they are added as an afterthought, because everyone knows that an e-learning course should have an assessment.
It’s a shame because when assessment strategies are given the care and attention they deserve, they add real value. Assessments can (and should) be an integral part of a learning strategy – giving learners the opportunity to demonstrate that they have understood the key learning points and can apply that knowledge in real-world situations, to show that they really are competent.
These days, we can use assessments very flexibly, so it’s worth giving the issue some serious thought. For example, could the assessment be used as a pre-test to help learners identify areas of focus (and so potentially cut down seat time)? Should randomization and question banking be used (and, if so, could the assessment be reused periodically as a refresher)? Could the assessment be created as a simulation to reflect real workplace practice? Might it even be best delivered via mobile?
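To make the question-banking idea concrete, here is a minimal sketch of drawing a randomized paper from a bank of questions. The `Question` structure and `draw_assessment` function are hypothetical names for illustration only – most authoring tools and LMSs provide this behaviour out of the box.

```python
import random
from dataclasses import dataclass


@dataclass
class Question:
    topic: str
    text: str
    options: list[str]
    correct_index: int = 0  # position of the correct option


def draw_assessment(bank: list[Question], per_topic: int,
                    seed: int | None = None) -> list[Question]:
    """Draw a fixed number of questions per topic, in a random order.

    Re-running with a different seed produces a different paper, which is
    what makes periodic re-use as a refresher practical.
    """
    rng = random.Random(seed)
    paper: list[Question] = []
    for topic in sorted({q.topic for q in bank}):
        pool = [q for q in bank if q.topic == topic]
        paper.extend(rng.sample(pool, min(per_topic, len(pool))))
    rng.shuffle(paper)
    return paper
```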
There are more questions and options than I’ve listed here, but as a starting point, here are some ideas to consider when building assessment into a learning strategy:
Defining your assessment strategy early
The time to be thinking about the assessment is right at the start of a project when defining the course objectives. Learning objectives and assessment strategy are more intertwined than many people realize.
As Robert F. Mager stated in Preparing Instructional Objectives (1984):
“[An] important reason for stating objectives sharply has to do with finding out whether the objective has, in fact, been accomplished. Tests … are supposed to tell instructors AND students whether they have been successful in achieving the course objectives. But unless objectives are stated clearly and are fixed in the minds of both parties, tests are at best misleading; at worst, they are irrelevant, unfair, or uninformative.”
More recently, Cathy Moore has added a great deal of value to this field with her work on action mapping. This is a useful technique that helps make learning more focused on achieving a business goal or meeting an organizational need. The next step is identifying what people need to do to reach that goal, and designing activities that help people practice the relevant behaviours. If you want to find out more about action mapping, it’s definitely worth checking out her blueprint tool.
Applying your objectives to your assessment
Using action mapping, or similar goal and outcome focused techniques, you can reach a clear view of the behaviours that you want to support learners with and the underpinning knowledge that they might need.
Testing that behaviour, thinking and decision-making early on in the assessment is what gives a true indicator of learner performance.
As an added benefit, having a clear understanding of what you’re testing against makes writing questions a whole lot easier. If you’re having big problems writing questions for an assessment, it should set alarm bells ringing – it could be an indicator either that the learning objectives aren’t defined clearly enough, or that the content of the learning isn’t meeting those objectives.
In fact, it’s often wise to start by creating the assessment (which you might also use as a form of diagnostic to test learners’ understanding) before doing anything else. Knowing what you will test learners on will make planning the underpinning course much more straightforward.
Those are some thoughts on the big picture; now for a little on the practicalities of writing assessments. Here are a few tips for writing good questions:
- The best questions give learners the opportunity to demonstrate that they can apply the knowledge they’ve gained in real-life situations, so they are often scenario-based. You may need to work closely with a subject matter expert to create realistic contexts.
- Ask yourself – does it matter if a learner gets the question wrong? For example, a question in a course about protecting information might be: ‘What year did the Data Protection Act come into effect?’ Does it really matter if learners know this? No, it doesn’t. It matters that they understand how the Data Protection Act relates to their job and what steps they need to take to ensure they are compliant, so that is what an appropriate question should test.
And finally, some tips on writing good answers:
- In many assessments learners can score very highly by just choosing the longest option every time (because the reality of the situation is often nuanced, meaning extra clauses get inserted to make the answer accurate). Check through your options to make sure the longest option isn’t always the correct one (the sketch after this list shows one way to automate that check).
- It’s all too easy to accidentally make the second option correct all the time. Once you’ve finished scripting, have a quick check and make sure the correct options are mixed up. Or, even better, have your code mix them up for you (see the sketch after this list).
- Avoid questions where all the options are correct. Likewise, avoid an ‘all of the above’ option.
- Make sure the wrong answers are plausible, otherwise it’s just an exercise in elimination.
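As promised above, here is a minimal sketch of the two checks that are easy to automate, reusing the hypothetical `Question` structure from the earlier sketch: shuffling answer options so the correct one doesn’t always sit in the same slot, and flagging questions where the longest option happens to be the right one.

```python
import random


def shuffle_options(question, rng=random):
    """Return the options in a new random order, plus the correct option's new index."""
    order = list(range(len(question.options)))
    rng.shuffle(order)
    shuffled = [question.options[i] for i in order]
    new_correct = order.index(question.correct_index)
    return shuffled, new_correct


def longest_is_correct(question) -> bool:
    """Flag questions a test-wise learner could game by always picking the longest option."""
    longest = max(range(len(question.options)),
                  key=lambda i: len(question.options[i]))
    return longest == question.correct_index


# Example review pass over a question bank:
# for q in bank:
#     if longest_is_correct(q):
#         print(f"Review the wording of: {q.text!r}")
```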