Adjusting “exams” as they move online

Universities across the world are having to adjust to the fact that a room full of students sitting an exam is not an appropriate assessment format for May and June this year. As a consequence, teaching teams need to think laterally about how to assess the learning students have gained from their modules.

Online meetings have started to look like episodes of the old “Celebrity Squares” gameshow

I am sure many places are way ahead of us on this one, but a few reflections on a recent teaching and learning committee meeting (held via Zoom) may be of benefit to those who are just getting going in their thinking about this.

 

  1. Duration of tests and the length of time they are accessible. The amount of time that a test is “live” and the time for which an individual student can respond are not necessarily the same thing. A defined two-hour period is not workable for a number of reasons – including potential timezone differences and connectivity problems. My institution has mandated a “24-hour window” for assessments, and our understanding at the chalkface (as it was) is that this means the questions are live throughout that time.
  2. Factual recall questions aren’t going to work. There has, of course, been a long-standing debate about the educational merits of an over-reliance on questions that reward regurgitation of factoids rather than probing higher-order learning skills. However, a move to remote (i.e. unsupervised) assessment of students who have ready access to Google* makes this format of question entirely redundant (*other search engines are available).
  3. Students need examples of any new style of question. A decision that MCQs are not going to be appropriate is only half of the story. Introducing radically different types of questions is going to require not only production of the actual paper but also additional specimen questions, since students will not be able to draw on past papers for guidance.
  4. Clear and timely instructions. Students are going to need clear guidance before the day of an exam, reiterated in the “instructions” section of the assessment itself. They will need advice on all sorts of practicalities about submission, as well as on the questions themselves.
  5. Essay questions will need word limits. For all manner of reasons the standard “three essays in two hours” format is not going to work. More time is inevitably going to mean more words. We all hope that essays represent carefully constructed and reasoned arguments in response to a specific question. Sadly the reality can sometimes be “brain dumps”, in which any material matching identifiable keywords in the title is served up for academics to sift through. A longer time will just allow for more of this unstructured stream of consciousness.
    Even taking a less jaundiced view, a good student is going to be tempted to offer far more material in support of their answer than they would realistically have managed in the typical exam scenario. If we cannot restrict the available time, then another option is to impose a word limit. Having looked at past answers, a suggestion of 1200 words for a 40-minute question (i.e. 30 words per minute) has been floated; a short sketch of that arithmetic follows after this list. The “quality over quantity” message needs to be emphasised – more is not necessarily better. Of course there may be a minority of students who would have written a longer essay than this, but even they will benefit from tailoring their material as a response to the specifics of the question.
  6. Plagiarism detection, and other “course essay” regulations, are back in play. The kind of measures being considered as “reasonable adjustments” in this unprecedented scenario are much more akin to coursework essays. We aspire to have novel synthesis presented in exam essays, but in the past we would not have penalised faithful regurgitation of material from lectures and other sources. Now, however, there is the very real danger of copy-and-paste plagiarism from lecture notes, from books and articles, or indeed of collusion between students. Use of plagiarism detection tools is therefore going to be essential. Similarly, students will be able to drop in images taken from other sources. Whereas in a constrained exam format we might not have worried about their origins, appropriate citation will need to be factored into the marking criteria.
  7. Practicalities about the format of the paper and the submission requirements also need to be clear. It is not just the content of the questions that needs addressing, but also the delivery of the paper and the safeguards students need to put in place regarding submission. For example, it is likely that a paper will be distributed as a Word document, which is actually more accessible than many other potential formats. We know, however, that some elements of layout can be altered during the submission of Word documents (e.g. positioning of images), and so we would probably recommend saving as a PDF before submission (much as we would usually for coursework).
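
Returning to the word limit floated in point 5, here is a minimal illustrative sketch of that arithmetic (a hypothetical snippet of my own; the 30 words-per-minute rate is simply our rough rule of thumb from looking at past scripts, not any kind of policy), showing how a cap might be scaled to questions of different nominal lengths:

    # Rough rule of thumb from looking at past scripts: ~30 words per nominal exam minute.
    WORDS_PER_MINUTE = 30

    def suggested_word_limit(question_minutes: int, words_per_minute: int = WORDS_PER_MINUTE) -> int:
        """Return an indicative word cap for an essay question of the given nominal length."""
        return question_minutes * words_per_minute

    if __name__ == "__main__":
        for minutes in (20, 40, 60):
            print(f"{minutes}-minute question: about {suggested_word_limit(minutes)} words")

On these assumptions a 40-minute question comes out at about 1200 words, in line with the figure above; the cap is intended as a guide for students rather than something to be policed to the word.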

This is not an exhaustive list, and you may instantly spot flaws in the observations made – if so, then do please let me know. I am very conscious that this has been prepared in the context of a science programme and that other disciplines may see things differently. But I hope these notes will be helpful for at least one or two of you.

 

 

Time to call on BoB for help?

The unprecedented COVID-19 pandemic has forced a radical rethink regarding so much of “normal life”. The cessation of face-to-face teaching at universities is just one of the areas in which people are having to investigate workarounds to usual practice. One of the interesting observations over this past week has been the sudden adoption of technologies which have been around for a while, but have never quite found the level of engagement that they deserved.

If you are a UK academic or student, I would like to add another example into the mix – the online TV and radio repository BoB (https://learningonscreen.ac.uk/ondemand), sometimes known as “Box of Broadcasts”.

BoB has been around in a variety of formats for about a decade. It currently has over 2.2 million copyright-cleared broadcast programmes available to stream, and over 120 universities and colleges subscribe – if you are at a UK university you probably have access already but never knew it!

I am a massive enthusiast for BoB, and for the related Television and Radio Index for Learning and Teaching (TRILT). For full disclosure, I am a Trustee of Learning on Screen, the organisation that runs both resources – but that role stems from my enthusiasm for their potential, not vice versa!

I have written several articles and run workshops about ways BoB can be used in teaching. Rather than repeat myself now, here are a couple of pertinent links:

  1. An article Boxing clever in Times Higher Education from 2014 (link)
  2. A seminar Do you know Bob? Adventures with technology-based resources for teaching (and beyond) which I ran in April 2019 (link)
  3. Another talk As seen on TV: Using broadcast media in university teaching from December 2018 (link) – there are inevitable overlaps between this and (2), but they were tailored for different audiences, so it is probably worth looking past the initial similarities – the “back ends” of the two talks are rather more distinct.
  4. My work in this area was also written up as a case study (link)

In the recent past, Learning on Screen have started to develop subject-specific playlists. A few of us have also tried to co-ordinate disciplinary blogs, which allow for more description of the content and discussion of potential applications. As an example, see our Biology on the Box site.

These are tricky and uncertain times, but I’m hoping that when we get a chance to look back after the storm, the appropriate rise of online tools for pedagogy will be one of the plus points to emerge from the tragedy.

 

Some tips for developing online educational repositories

As part of my work enthusing about the use of broadcast media in teaching, I am in the process of writing a guide to the use of Learning on Screen’s Box of Broadcasts resource. However, my reflections on this project, coupled with the development of other blog-based resources such as Careers After Biological Science, set me thinking about some more generic recommendations for anyone thinking of setting up an online collection of educational resources. These crystallised quite naturally into a series of questions to ask oneself about the purpose, scope and authorship of the materials.

On the advice of a couple of colleagues, I submitted this to the Association for Learning Technology blog. I was delighted when they accepted it, since members of that community are likely to be developing similar resources. My self-check questions can be found via this link.


The NSS and Enhancement (Review)

Coverage of the findings from the recent new-style National Student Survey drew my attention to the Making it count report for the Higher Education Academy, coordinated by Alex Buckley (I’m afraid I’ve lost details of who pointed me towards the report, so cannot offer credit where credit is due).

Making it count is not new: it was published by the HEA in 2012, and therefore predates both the new NSS and the introduction of the TEF. Nevertheless I found it a fascinating and worthwhile read – hence this reflective summary.

As most readers of this blog will know, the UK National Student Survey was introduced in 2005 and draws inspiration from the Australian Course Experience Questionnaire (CEQ), which had been in use since the early 1990s. From inception until 2016 there was a standard set of 23 questions in the NSS (see this link for the complete list). The questions were all positively phrased, and students in their final year were invited to respond using a standard five-point scale from “definitely agree” through “mostly agree”, “neither agree nor disagree” and “mostly disagree” to “definitely disagree” (“not applicable” was also an option). Following extensive consultation, the questions were changed for the first time in 2017. A total of 27 questions were included, with some original questions retained, some rephrased and some brand new ones added (see this link for the 2017 questions).

Reflecting on lecture capture: the good, the bad and the lonely

We have been using lecture capture for about two years. I have to say at the outset that I am a big fan. Having said that, however, there are aspects of lecture capture that I find problematic. Here I offer some quick and dirty reflections on my experience of lecture capture so far. This is not a scientific study, and I certainly haven’t gone away and done an extensive literature search, so I may well be rediscovering old truths.

The Good. There are many attractive features of lecture capture. These include:

  • Availability for review and revision. This is, of course, the main raison d’être of lecture capture, but it is important not to overlook the value this provides – students can go back over the sections that were unclear the first time.
  • Similarly, the recordings can be used by those with legitimate cause to be absent (e.g. due to illness, an away sports fixture, etc.).
  • Recordings can be useful for the lecturer themselves. We know that the first time you prepare a set of lectures you are likely to have recently read around the subject and be naturally “on top” of your material. The second year can be a different challenge – the slides are in the can, but you may not recall some of the wider points you had made to embellish the on-screen text and images. Listening back to recordings of your own lecture from the previous year can help to fill in the blanks.
  • The recordings can also be useful when we have to provide a substitute due to lecturer illness. A few years back, before we had our official lecture capture system, I had to take a semester off due to ill health. Fortunately I had audio recordings which could be provided to my “stunt double” along with the slides. Officially captured lectures can now fulfil this role.
  • In times of absolute need the recording can be officially made available in lieu of the live session. We had to use this route when a colleague was ill during the last week of a semester – there was no time to warm up a replacement and rescheduling was not feasible, so we actually showed a recording of the previous year’s equivalent lecture. I “hosted” the session and was really encouraged by the large proportion of the class who turned up in a 5pm slot, knowing that a recording was going to be aired (and that it was already available to them via the VLE).
  • Recordings can be built into reflection to help improve one’s own teaching or as part of an informal peer review process.
  • The tools for lecture capture can be used to pre-record material as a contribution to a “flipped teaching” model.
  • Lecture capture software (certainly the Panopto tool we use at Leicester) includes remarkably powerful built-in stats on usage by students. These can shed light on the aspects of a lecture that students felt needed clearer explanation.
  • You can change the speed of the recording. This might mean slowing it down slightly for better note-taking, or speeding it up (one of my students confessed that they like to listen to lectures by a colleague at a quickened pace because the lecturer naturally delivers their material at an unduly leisurely pace).


Putting the moving image to work in biochemistry education

The December edition of the Biochemical Society magazine The Biochemist has historically taken a slightly less serious look at some aspect of the subject. This year the focus is Biochemistry on Screen. Articles include discussion of Star Trek, Jurassic World, Contagion, Spiderman and others. I contributed a piece about the different ways that the moving image (especially TV) can be used in Biochemistry education. A copy can be accessed via this link.

Update: the article is now also mirrored on the website of ERA, the Educational Recording Agency (accessed via this link).

The December 2015 edition of The Biochemist focuses on screen representations of Biochemistry


But is it any good? An information literacy tutorial

At the Higher Education Academy STEM conference in April 2012, I presented a poster offering an outline of a blended-learning tutorial we have produced to help undergraduates develop their ability to evaluate the academic merit of different resources they might find on the internet. The tutorial involves the students working individually to critique eight specially chosen online sources, presented as the results of a search on the topic of “mitochondria”. This is followed up by a group tutorial in which the quality and relevance of the materials are discussed more fully.

To see a pdf version of the poster, click on this image
