Embracing the potential of digital exams

Horizons in STEM Higher Education has become a summer fixture for many science academics. In keeping with many 2020 events, this year’s conference switched to an online format. The first plenary lecture was Digital Exams: Transforming Assessment by Professor Mariann Rand-Weaver from Brunel University.


This was a really inspiring session, demonstrating the opportunities presented by actively moving examinations from a traditional paper-based format to digital platforms.

Brunel has been moving to digital exams over the past few years (i.e. not just in response to the recent pandemic), partly motivated by asking why exams are almost the only occasion on which we still ask students to handwrite anything. Additionally, the provision of a computer is already a typical “reasonable adjustment” on accessibility grounds, so why not level the playing field by allowing this for all?

There is a range of benefits. For the markers, typed scripts are easier to read. The answers are often better structured, as students can move text around and insert points where they are most relevant. There are also administrative benefits – e.g. less paper, less manual handling, more streamlined processes.

7 tips if your exam has turned into an online assessment

Faced with the sudden closure of campuses, many students who were expecting to sit traditional summer exams in a sports hall or similar venue are now facing online assignments with revised regulations. The following post offers tips for students facing such a scenario. Note that it was written in the first instance for Bioscience students at the University of Leicester, so some of the specific details may need to be adapted to your personal context.

  1. 24 hours does not mean 24 hours. A 24-hour period has been set for each assessment, but you should not expect to work solidly on it for all of that time. The broad time window is to allow for connectivity, accessibility and/or time-zone issues; remember that the intention is to stay close to the original exam format.
  2. Revise before the assessment day. Although it is an “open book” format, aim to do the relevant revision in the days before the assessment. Ideally you want to know roughly what you intend to say without looking anything up, and just consult sources to confirm the details.
  3. Stick to the word limit. Since you don’t have the usual 2-3 hour time constraints, a word limit has been applied. The word count has been calculated by looking at past papers to see how much previous students managed to say in exam essays. The limits have been set at, or even slightly above, the length of good answers – you really don’t need to be writing more than this.
  4. Quality > Quantity. As always, the quality of content remains more important than writing “enough” words – i.e. have you answered the actual question asked, in sufficient depth and in a well-constructed essay (e.g. with an introduction and a conclusion)?
  5. Save your answers, and back them up. Remember to save your answers regularly during the day – you don’t want to lose all your work. Save every time you lean back in your chair, go to get a drink or go to the loo. If you don’t automatically have saving to the cloud activated, then make back-ups periodically as well. Check whether the system has been set to allow upload of single or multiple versions. If an assignment has been set up to accept multiple submissions before the deadline, you can upload a version of your answer once you’ve got sufficient content to make it worthwhile, but remember to replace it with the final version before the deadline (see point 7).
  6. Use your own words. Remember that answers are going through plagiarism detection software so don’t cut and paste from a source into your answer, not even whilst you are working on it and intend to rephrase later! If you are reading from one or more sources, close them, write what you want to write, then check you got it correct.
  7. Don’t miss the deadline. Aim to submit well in advance of the stated deadline. On this occasion late submission = 0% not a sliding scale of penalties. (And if you are in a different time zone make sure you have the correct time in mind). Know in advance what the procedures are if you experience connectivity issues.

 

Adjusting “exams” as they move online

Universities across the world are having to adjust to the fact that rooms full of students sitting exams is not an appropriate assessment format for May and June this year. As a consequence, teaching teams are needing to think laterally about how to interrogate students about the learning they have gained from their modules.


Online meetings have started to look like episodes of the old “Celebrity Squares” gameshow

I am sure many places are way ahead of us on this one, but a few reflections on a recent teaching and learning committee meeting (held via Zoom) may be of benefit to those who are just getting going in their thinking about this.

 

  1. Duration of tests and the length of time they are accessible. The amount of time that a test is “live” and the time for which an individual student can respond are not necessarily the same thing. A defined two-hour period is not workable for a number of reasons – including potential time-zone differences and connectivity problems. My institution has mandated a “24-hour window” for assessments, and our understanding at the chalkface (as it were) is that this means they are live throughout that time.
  2. Factual recall questions aren’t going to work. There has, of course, been a long-standing debate about the educational merits of an over-reliance on questions that reward regurgitation of factoids rather than probing higher learning skills. However, a move to remote (i.e. unsupervised) assessment of students who have ready access to Google* makes this format of question entirely redundant (*other search engines are available).
  3. Students need examples of any new style of questions. A decision that MCQs are not going to be appropriate is only half of the story. Introduction of radically different types of questions is going to require not only production of the actual paper but also additional specimen questions for students who will not be able to draw on past papers for guidance.
  4. Clear and timely instruction. Students are going to need clear guidance before the day of an exam, reiterated in the “instructions” section of the assessment itself. There will be all sorts of practicalities about submission as well as the questions themselves about which students will need advice.
  5. Essay questions will need word limits. For all manner of reasons the standard “three essays in two hours” format is not going to work. More time is inevitably going to mean more words. We all hope that essays represent carefully constructed and reasoned arguments in response to a specific question. Sadly the reality can sometimes be “brain dumps”, in which any material matching identifiable keywords in the title is served up for academics to sift through. A longer time will just allow for more of this unstructured stream of consciousness.
    Even taking a less jaundiced view, a good student is going to be tempted to offer far more material in support of their answer than they would realistically have managed in the typical exam scenario. If we cannot restrict the available time, then another option is to impose a word limit. Having looked at past answers, a suggestion of 1200 words for a 40-minute question (i.e. 30 words per minute) has been floated. The message of “quality over quantity” needs to be emphasised – more is not necessarily better. Of course, there may be a minority of students who would have written a longer essay than this, but even they will benefit from tailoring their material as a response to the specifics of the question.
  6. Plagiarism detection, and other “course essay” regulations, are back in play. The kind of measures being considered as “reasonable adjustments” in this unprecedented scenario are much more akin to coursework essays. We aspire to have novel synthesis presented in exam essays, but in the past we would not have penalised faithful regurgitation of material from lectures and other sources. Now, however, there is the very real danger of copy and paste plagiarism from lecture notes, from books and articles, or indeed of collusion between students. The requirement to use plagiarism detection tools is therefore going to be essential. Similarly, students will be able to drop in images taken from sources. Whereas in a constrained exam format we might not have worried about their origins, appropriate citation will need to be factored into marking criteria.
  7. Practicalities about the format of the paper and submission requirements also need to be clear. It is not just the content of the questions that needs addressing, but also aspects of the delivery of the paper and the safeguards students need to put in place regarding submission. For example, it is likely that a paper will be distributed as a Word document, which is actually more accessible than many other potential formats. We know, however, that some elements of layout can be altered during the submission of Word documents (e.g. positioning of images), and so we would probably recommend saving as a PDF before submission (much as we would usually for coursework).
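The word-limit arithmetic in point 5 can be sketched in a few lines. This is only a back-of-the-envelope illustration: the 30 words-per-minute rate is the figure floated above, while the function name and the rounding-to-a-tidy-figure step are my own illustrative choices rather than anything mandated.

```python
# Sketch of the word-limit arithmetic discussed in point 5.
# Assumes the ~30 words-per-minute rate suggested by past answers.

def suggested_word_limit(exam_minutes, words_per_minute=30, round_to=50):
    """Convert a question's original exam duration into a word limit."""
    raw = exam_minutes * words_per_minute
    # Round to a tidy figure so the limit is easy to communicate.
    return round(raw / round_to) * round_to

print(suggested_word_limit(40))  # 40-minute essay -> 1200 words
```

The same rate then scales naturally to questions of other lengths (a 20-minute question would come out at 600 words), which keeps limits consistent across a paper.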

This is not an exhaustive list, and you may instantly spot the flaws in the observations made – if so then do please let me know. I am very conscious that this is prepared in the context of a science program and that other disciplines may see things differently. But I hope these notes will be helpful for at least one or two of you.

 

 

When technology models poor practice

One of the difficulties in teaching first year students is conveying the importance of appropriate handling of data, both in terms of data display and significant figures. I’ve commented previously on this site about times when technology can produce utterly inappropriate graphic representation of results (see A bonus lesson in my data handling tutorial).

At the end of the first semester we conduct an online exam using the Blackboard quiz tool. The assessment is out of 200, marked automatically and scaled to a percentage. When the students submit their answers at the end of the test, they get instant reporting of their result. The screenshot on the right shows a section from the gradebook where the results are recorded in exactly the detail each student sees, i.e. to up to 5 decimal places! It is unfortunate that this inappropriate “accuracy” gets displayed to the students.
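The rounding issue is easy to reproduce. A minimal sketch, assuming a made-up raw score with partial credit (Blackboard’s internal arithmetic is not public, so this only mimics the displayed output), contrasting the five-decimal display with a more defensible one:

```python
# Mimic the gradebook display described above. The raw score is
# invented for illustration; partial credit yields a recurring
# fraction once a mark out of 200 is scaled to a percentage.

raw_score = 136 + 2 / 3          # hypothetical mark out of 200
percentage = raw_score / 200 * 100

print(f"{percentage:.5f}")       # what students see: 68.33333
print(f"{percentage:.0f}")       # a more appropriate display: 68
```

Rounding to a whole percentage (or one decimal place at most) would be far closer to the precision the assessment can actually claim, and would model the data-handling practice we want students to adopt.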

“Please send a photo”


One recent email exchange related to someone else’s order for running shoes, sent to me in error

I’ve recently had cause to contact three different companies about inadequacies in their service. The reasons for doing so in each case were very different, but there was a common thread to their replies: “Please send a photo of the [relevant item]”. When the third request came in, I started to see a pattern and this set me ruminating on why they were adding this extra step to dealing with my query.

And then it struck me that this was exactly the reason – it was an extra step. It is part of a filtering process. It is easy enough for all and sundry to fire off email requests willy-nilly. As a mechanism to weed out the serious appellant from the time-waster, there needed to be an additional hurdle. [I have vague memories from school history lessons that monasteries used to operate a similar process. Potential novices were never admitted at their first attempt; they were required to return on several occasions before securing entry into the monastic life.]

I mention this here, on my education blog, because I actually operate a similar system when it comes to requests from students. If you are involved in academia, I am sure you recognise the emails, particularly as exams loom, that go something like this…

Feedback on exams: how much of an issue is it?

In keeping with other universities, Leicester is thinking through the best way to offer feedback to students on their exam performance. Several different models are under consideration, each of which involves significant logistical difficulties, and no firm decision has been made about the approach to take.

We do not (yet) offer students in bioscience the opportunity to see their marked scripts themselves. At various times in recent years I have acted as “broker”, looking through the exam answers of personal tutees in order to suggest ways that they might improve their future answers.

In truth, the findings of this distinctly unscientific study are not earth-shattering. There are three or four key explanations as to why students fail to achieve the mark they desire:

1. Not answering the question asked – lowest marks, and those that come as the biggest surprise to the authors, are for answers in which they have significantly missed the thrust of the question. There is not much that can be done to counsel against this, except the old mantra “read the question carefully”.

2. Not offering enough detail – The most common error in exam answers is a failure to include sufficient detail to be worthy of the mark the author anticipated. This issue can be addressed in practical ways, e.g. by making the expectations overt and by exposing students to real essays written by previous students (as we do in this exercise).

3. Unstructured “brain dump” – This can be a manifestation of error #1, possibly leading to #2 (in the sense that a lot of time and effort has been spent on material which is not strictly relevant). Essays of this type are frequently characterised by flitting back and forth between different issues as the author thinks of them. For the examiner it comes across as either a lack of true understanding about the topic in hand, or as a lazy student essentially saying “I recognise this word in the question, here’s a random collection of things I recall from your lectures, you work out whether any of it is relevant”.

Encouraging students to take a few moments to sit down and plan out their answer before dashing off into the main body of their response will help to reduce this mistake.

4. Illegible handwriting – I speak as someone whose handwriting is not fantastic on a good day and gets worse under time constraints. Nevertheless, regardless of weaknesses in the marker, the old adage “we can’t award credit for it if we can’t read it” remains true.

Scripts that are completely undecipherable are rare, but they do exist. In the past we were able to offer some glimmer of hope in the form of the opportunity to call in the student and, at their expense, to employ someone to write a readable version as the original author translated their hieroglyphics. This back-up position has now been removed, apparently because a student at a different institution significantly embellished their original version whilst it was being transcribed [note: does anyone have chapter and verse to show that this is not an urban myth? I’d like to know the details].

Given that other mechanisms for capturing an exam answer exist (e.g. use of a laptop) it ought not to be the case that someone gets to the point where they discover after the event that their legibility was inadequate. It is therefore important both that we have the chance to see some hand-written work (preferably early in a student’s time at university) and that a culture is engendered in which it is acceptable to comment on poor handwriting, even if the hand offering the rebuke is not itself ideal.
