It is widely accepted that a well-constructed programme of assessment can help to structure and drive learning. Digital technology provides an opportunity to innovate and introduce novel mechanisms for assessing our students - however, we can also consider moving traditional formats online. Akimov & Malin reinforce this approach, and raise important points about how the online environment has created new demands on the construct and content of assessments:
Not only should assessment adhere to the principles of validity, reliability and fairness, but it should also involve multiple measurements, and include both formative and summative tasks. Furthermore, when designing online assessments, instructors must also take into consideration issues of accessibility and legality, identity security and academic integrity. These requirements are particularly relevant for institutions that seek accreditation from professional and educational bodies.
When designing digital assessments we would encourage all educators to revisit the Graduating European Dentist recommendations for contemporary teaching, learning and assessment, AMEE's Consensus Framework for Good Assessment and AMEE's Guide to Online e-assessment.
Differences between online and traditional assessments are discussed by Rovai - and issues of validity, reliability and dishonesty are discussed further by Gikandi, Morrow & Davis.
Fuller also discusses the various opportunities to enhance assessment using digital technologies.
Formative assessments are generally carried out as a method of providing longitudinal or ongoing feedback to the learner. With such a variety of online resources, there are many ways in which you can engage and enthuse students.
Gikandi, Morrow & Davis provide a comprehensive review of the use of online formative assessment, and draw attention to some important points:
- A mixed methods approach is often required to establish a degree of validity and reliability with online formative assessment
- Assessment activities should be authentic to the topic being studied
- Feedback from asynchronous learning events must be delivered quickly - and online systems should be designed to facilitate this. Click here to visit a PDF from Surrey University (UK) about opportunities for feedback with digital learning.
- Simple and well-designed assessment rubrics should be made available to the learners
- The assessment value of e-mail messages, chat room conversations, and discussion board postings should not be underestimated
- Flexible assessment tasks can support learner autonomy and motivate learners, for instance, by providing a variety of choices or open-ended tasks
- Adequate learner support is critical, and the teacher should be responsive to the diversity and needs of individual learners in asynchronous learning environments
Systems such as Blackboard can be very useful for keeping online formative assessments in one place, with the ability to track user engagement and performance. Guidance for setting up tests and surveys in Blackboard is available here. Similarly, online data collection systems such as Microsoft Forms are a quick and easy way to engage students in self-assessment activities. The latter can be embedded within both synchronous and asynchronous learning activities. Find out more about using Microsoft Forms here.
There are a number of examples within the DigEdDent library of online 'clinical' examinations. Whilst operative tasks cannot be performed online, a number of soft clinical skills (such as history taking, problem solving, decision making and communication) can be assessed reliably. In this way, the OSCE (Objective Structured Clinical Examination) can be converted to an online format. See this online toolkit for virtual OSCE examinations.
The OSLER (Objective Structured Long Examination Record) provides an opportunity to check knowledge, attitudes and clinical reasoning more comprehensively. Whilst both formats are increasingly used, a degree of software infrastructure is required in order to 'channel' students through the examination stations - often using 'breakout rooms' in Microsoft Teams or Zoom.
There are important considerations for online assessments that use patient-sensitive and time-sensitive information, and teachers should be mindful of both issues surrounding patient consent, and confidentiality of cases, before releasing material online.
Increasingly, teachers are using video-enhanced observation (VEO) to observe and record student performance within a clinical environment. Behaviours can be checked off in real time against a pre-defined marking matrix, and results are instantly available for review and analysis. See here for an example of this type of software.
VEO doesn't need to be confined to student performance - it has also been used successfully to record interactions between students and clinical teachers as part of a peer review of teaching programme. In fact, this approach can be used any time you would like to map physical behaviour to an objective marking scheme.
There is little reason why traditional written examinations cannot be delivered effectively online - and in fact the digital nature of the submissions makes analysing, distributing, marking, moderating and storing the student data more manageable. However, the type of question and type of software for delivering the assessment should be chosen carefully. The two main concerns with online written assessment relate to the physical location of the student:
- Do they have access to information that will help them to obtain an advantage when answering the question?
- Is the person sitting the assessment really who they say they are?
A number of software solutions exist that address these two concerns - both by 'locking down' a student's computer and by tracking their behaviour during the examination. This monitoring is called proctoring - and it can be costly for your institution. Some examination packages have this feature built in, or available as an add-on. An example of proctoring using webcams is available here.
Instead, you may consider altering the format of your written online assessment to reduce the impact of these concerns. One such approach is the 'open book' examination. If done correctly, this can work very well for assessing higher cognitive skills (instead of knowledge recall).
When students are surveyed, they almost always prefer open book assessments as opposed to closed book assessments, despite the acknowledged understanding that open book examinations generally require the exhibition of higher order thinking skills - Brightwell et al
Importantly, moving a written assessment online does not immediately make it 'open-book'. As such, online written examinations should be written and standard set very carefully. It is not sufficient to simply alter the grade boundaries to account for the fact that students have access to a wider library of information.
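To make "standard set very carefully" concrete, the sketch below illustrates one widely used standard-setting procedure, the Angoff method, in which each judge estimates the probability that a borderline (minimally competent) candidate would answer each item correctly, and the pass mark is taken as the mean of the judges' totals. This method, the judge names and the figures are illustrative assumptions, not taken from the sources cited above; real standard setting requires trained judges and a full item bank.

```python
# Illustrative Angoff-style standard setting (all data invented).
# Each judge estimates, per item, the probability that a borderline
# student would answer that item correctly.
judge_estimates = {
    "Judge A": [0.6, 0.8, 0.5, 0.7],
    "Judge B": [0.5, 0.7, 0.6, 0.8],
    "Judge C": [0.7, 0.9, 0.5, 0.6],
}

def angoff_cut_score(estimates: dict) -> float:
    """Pass mark = mean, across judges, of each judge's total
    expected score for a borderline candidate."""
    per_judge_totals = [sum(items) for items in estimates.values()]
    return sum(per_judge_totals) / len(per_judge_totals)

n_items = len(next(iter(judge_estimates.values())))
cut = angoff_cut_score(judge_estimates)
print(f"Pass mark: {cut:.2f} out of {n_items} items "
      f"({100 * cut / n_items:.0f}%)")
```

The point of a procedure like this is that the pass mark is derived from judgements about the items themselves - not by retrospectively shifting grade boundaries to compensate for open-book conditions.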
Consider putting together written examinations that require students to:
- Apply knowledge and write explanations
- Analyse and order information
- Synthesise ideas or plans
- Evaluate, appraise or reflect
This is often easiest when based around case-vignettes or scenarios. In this regard, the MSA/SSA (multiple/structured short answer) or longer essay exam type is the preferred choice. See Bloom's rose for ideas about assessed tasks, and this paper by Chris Moore about ensuring authenticity when designing an open book examination.
Open book online examinations may be best-suited to testing higher order cognition - however there is still a need to test knowledge recall and simpler modes of knowledge application - certainly in the earlier stages of the students' learning journey. In this regard, SBA (single best answer), EMI (extended matching item) or MCQ (multiple choice question) exam types might be considered the most appropriate.
It is also possible to make use of online oral assessment to efficiently cover a lot of knowledge-based material - and there are examples of this within the DigEdDent library. It is important to ensure that the marking rubric is objective, and that the standard is set carefully - however there is also an opportunity to include other soft skills that are required within the healthcare environment:
- Communication skills
- Problem solving
- Explanatory skills
Implementing oral examination in an assessment strategy has led to enhanced communication skills, knowledge and confidence among students, has increased their motivation to learn and understand the subject matter, and has decreased their likelihood of cheating - Akimov & Malin
When planning online oral assessments, it is also worth considering the levels of anxiety and stress that inexperienced students may experience, alongside the need for a robust online connection during the assessment. These factors are discussed further in the paper by Akimov & Malin.