Centre for Online and Distance Education

Supporting Student Success: Designing Assessment

Written by Daksha Patel and Anita Skinner

This year’s Supporting Student Success workshop, organised by CODE, took as its theme Fulfilling the Potential of Online and Distance Education. The second of four parallel sessions looked at two very different approaches to online assessment. There were two presentations in this session: the first by CODE fellows Professor Stylianos Hatzipanagos and Professor Alan Tait, and the second by Leonard Houx of Bayes Business School, University of London.

Designing Plagiarism out of Assessment

Professor Hatzipanagos and Professor Tait considered the debate on assessment following the changes driven by Covid-19. They presented findings from the CODE evaluation of the emergency move to online timed assessment at the University of London in 2020 and 2021, focusing on the positive changes in successive years and identifying where we could go next, recognising that there was diminished interest in returning to pre-pandemic assessment methods.

CODE surveyed and interviewed students and academics (examiners and programme directors) in 2020 and again in 2021. Learning from the 2020 experience, additional instruction and training for students was developed and made available before the 2021 online timed assessments, and students and staff were more comfortable with the format. The digital shock had lessened, partly because of better information and training for students from the University of London and from course directors, and partly because of developing digital skills. Both survey cohorts recognised the advantages of flexible assessment: it was better for mental health, offering a less stressful environment; it was more accessible for many; and it required no additional travel.

Describing the measures taken to address concerns about academic integrity (honesty, trust, fairness, respect, responsibility and courage) and misconduct, they discussed what worked and what didn’t, and how we can design assessments to promote academic integrity. A strong theme was the positive effect this new form of assessment has on student wellbeing and pedagogical development.

The forced switch to online assessment had to happen rapidly in 2020; as a result, assessments in the first year had some technical problems. There were concerns that plagiarism and collusion would rise, and that the value of awards could drop because of the loss of student verification and invigilation.

In 2020, it quickly became obvious that technical solutions (for proctoring or moderating) were of limited value. Software can detect text similarity and was widely used, revealing a worrying rise in the use of essay mills and of Chegg (a private company providing answers), but online proctoring, for example, met with considerable resistance from students.

There is an impetus to innovate in assessment practice. Initially, assessments were rewritten to test understanding and interpretation rather than recall. That development has continued, with an accelerated reimagining of assessment beyond the initial switch to online exams. The result has been a great deal of redesign for on-campus programmes, because it had to be done, and less for distance education, because of staff resource constraints and a growing need for professional development in online solutions.

Academics also appreciated an increased ability to support students in their learning, but called for help and training themselves in designing assessment that maintains academic integrity and avoids plagiarism and misconduct. The survey found some subject-specific variation in misconduct. Guidance in the form of inductions, short courses, videos and plagiarism-awareness materials was created to familiarise students with academic integrity in online assessment.

The pandemic has accelerated the rethinking of assessment and triggered innovation. Online exams for all have also created a desire for closer alignment between campus-based and distance education assessment. But we can go further, making assessments more authentic, inclusive, valid and reliable: all aims that have been talked about in the sector for years.

It’s clear that academics would welcome training in designing such assessments. Look out for CPD on online assessment and on the practical management of academic integrity.

Test Drive: How automated questions can upgrade your module

In the second presentation, Dr Leonard Houx of Bayes Business School told us about his work using asynchronous quiz questions and how they can aid student learning.

Dr Houx began with a review of the research on testing and learning, identifying ten ways in which testing enhances learning (Roediger, Putnam and Smith, 2011), and leading on to what for many is the holy grail: testing that engages students in a low-cost, easily accessible way and supports them in further, more complex activities. However, writing good, effective quizzes is not easy. A variety of question types are needed, often with relevant examples. Questions need to be written in clearly understood language, aligned with taught content, and designed to support later, more complex activities once facts are established. They must have credible distractors with true parallelism, and there must be clear feedback. Writing test questions is hard work and a subtle craft.

The payback for this time and effort is student engagement, which has been shown to aid later retention of information, along with active learning and learning that is meaningful and dynamic. Testing also facilitates the retrieval of knowledge and improves metacognitive monitoring, helping both student and teacher to identify gaps in knowledge. All this encourages students to study; they are more engaged because they can easily see the rationale for participating.
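
To make those design criteria a little more concrete, here is a minimal, hypothetical sketch in Python (not drawn from the presentation) of how a single automated multiple-choice question might be represented, with parallel, credible distractors and per-option feedback. The class names and the question content are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class Option:
    text: str       # answer text; distractors should be credible and parallel in form
    correct: bool
    feedback: str   # targeted feedback explaining why this option is right or wrong

@dataclass
class QuizQuestion:
    stem: str       # written in clear language, aligned with taught content
    options: list[Option]

    def mark(self, choice: int) -> tuple[bool, str]:
        """Return whether the chosen option is correct, plus its feedback."""
        picked = self.options[choice]
        return picked.correct, picked.feedback

# An illustrative recall-level question that could precede a more complex activity.
question = QuizQuestion(
    stem="Which practice most directly supports academic integrity in online exams?",
    options=[
        Option("Designing questions that test interpretation rather than recall", True,
               "Correct: interpretation questions are harder to answer by copying."),
        Option("Relying solely on text-similarity software", False,
               "Not quite: similarity detection has limited value on its own."),
        Option("Extending the exam time window indefinitely", False,
               "Not quite: a longer window alone does not address collusion."),
    ],
)

is_correct, feedback = question.mark(0)
print(is_correct, feedback)
```

In practice, such questions would live in a virtual learning environment’s quiz engine rather than in standalone code, but the same ingredients apply: a clear stem, parallel distractors and feedback tied to each option.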

The presentation triggered much interest, and the discussion afterwards went on to consider how quizzes could be taken to the next level by encouraging students to discuss why the wrong answers are wrong, or whether short-answer questions with model answers could be used to encourage reflection and self-feedback.
    
Rounding up, Professor Stylianos Hatzipanagos commented that the presentations in this parallel session addressed two contested assessment methods and showed how both can improve our students’ experience and learning. Exams are often not authentic; with online, time-restricted assessments we have the opportunity and the incentive to build more authenticity into programmes. And with automated questions, as described in the second presentation, we can put in place activities that promote engagement with asynchronous teaching and help students to learn more effectively.