In keeping with other universities, Leicester is thinking through the best way to offer feedback to students on their exam performance. Several different models are under consideration, each of which involves significant logistical difficulties, and no firm decision has been made about the approach to take.
We do not (yet) offer bioscience students the opportunity to see their own marked scripts. At various times in recent years I have instead acted as “broker”, looking through the exam answers of personal tutees in order to suggest ways they might improve their future answers.
In truth, the findings of this distinctly unscientific study are not earth-shattering. There are four key explanations as to why students fail to achieve the mark they desire:
1. Not answering the question asked – The lowest marks, and those that come as the biggest surprise to their authors, are for answers that significantly miss the thrust of the question. There is not much that can be done to counsel against this, except to repeat the old mantra “read the question carefully”.
2. Not offering enough detail – The most common error in exam answers is a failure to include sufficient detail to be worthy of the mark the author anticipated. This issue can be addressed in practical ways, e.g. by making the expectations overt and by exposing students to real essays written by previous students (as we do in this exercise).
3. Unstructured “brain dump” – This can be a manifestation of error #1, and may also lead to #2 (in the sense that a lot of time and effort has been spent on material which is not strictly relevant). Essays of this type are frequently characterised by flitting back and forth between different issues as the author thinks of them. To the examiner it comes across either as a lack of true understanding of the topic in hand, or as a lazy student essentially saying “I recognise this word in the question, here’s a random collection of things I recall from your lectures, you work out whether any of it is relevant”.
Encouraging students to take a few moments to plan out their answer before launching into the main body of their response will help to reduce this mistake.
4. Illegible handwriting – I speak as someone whose handwriting is not fantastic on a good day and gets worse under time constraints. Nevertheless, whatever the marker’s own weaknesses, the old adage “we can’t award credit for it if we can’t read it” remains true.
Scripts that are completely undecipherable are rare, but they do exist. In the past we were able to offer some glimmer of hope: the opportunity to call in the student and, at their expense, to employ someone to write a readable version while the original author translated their hieroglyphs. This back-up position has now been removed, apparently because a student at a different institution significantly embellished their original version whilst it was being transcribed [note: does anyone have chapter and verse to show that this is not an urban myth? I’d like to know the details].
Given that other mechanisms for capturing an exam answer exist (e.g. use of a laptop), no student should discover only after the event that their legibility was inadequate. It is therefore important both that we have the chance to see some hand-written work (preferably early in a student’s time at university) and that a culture is engendered in which it is acceptable to comment on poor handwriting, even if the hand offering the rebuke is not itself ideal.
December 19, 2013
Categories: assessment, education, exams, feedback. Tags: essay writing, legibility, time management, writing skills. Author: Chris Willmott