Marking (in)consistency – the elephant in the assessment room?

In September 2006 Banksy (briefly) included a painted "Elephant in the Room" in his LA show

In a thought-provoking article, available online ahead of publication in the February 2012 edition of Assessment and Evaluation in Higher Education, Teresa McConlogue looks into the pedagogical benefits of peer assessment. Her paper But is it fair? Developing students’ understanding of grading complex written work through peer assessment focuses on work conducted with engineering students at Queen Mary University of London.

Two distinct cohorts of students were required to peer assess a piece of coursework, generating a summative mark: a laboratory report (n=56, worth 10% of the module mark) and a literature review (n=26, worth 25%). Each piece of work was assessed by four or five peers, who were required to provide both a mark and comments on the work. The authors were then awarded the mean mark.

Thus far there is nothing exceptional about this process – peer assessment is an established practice in Higher Education (see, for example, Paul Orsmond’s excellent guide on Self- and Peer-Assessment). The controversial element of McConlogue’s activity is that the authors of the peer-assessed work were provided with all of the comments made by their contemporaries AND a full record of the range of marks awarded. This “warts and all” approach exposed the students to the mechanics of marking – showing them both the reasoning that went into a mark (some of which seemed poorly aligned with the mark awarded, or based on ‘trivialities’) and the fact that an individual “rogue” mark may have significantly influenced the mean. In some cases the individual marks awarded apparently spanned several grade boundaries.
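The effect of a single rogue mark on the mean is easy to illustrate with a quick calculation. The marks below are entirely invented for the sake of the example, not taken from McConlogue's data:

```python
# Hypothetical illustration: how one "rogue" peer mark can drag the mean
# across a grade boundary. All figures here are invented.
marks_without_rogue = [62, 65, 64, 66]         # four peers broadly agree: a 2:1
marks_with_rogue = marks_without_rogue + [38]  # a fifth, much harsher, marker

mean_without = sum(marks_without_rogue) / len(marks_without_rogue)
mean_with = sum(marks_with_rogue) / len(marks_with_rogue)

print(mean_without)  # 64.25 – a solid 2:1
print(mean_with)     # 59.0 – dropped below the 60% boundary into 2:2 territory
```

One discordant marker is enough to move the awarded grade, which is precisely what seeing the full spread of marks made visible to the students.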


Involving alumni in careers education

The December 2011 edition of Bioscience Education included an account I wrote concerning our Careers After Biological Science (CABS) programme at the University of Leicester. The CABS series of careers talks was started in 2007. Since 2009 it has been supported and enhanced by the Bioscience careers blog which includes copies of the slides used in the presentations, as well as a variety of videos and/or audio recordings.

As the Abstract of the paper states:

Graduate employability is an important concern for contemporary universities. Alongside the development of employability skills, it is also crucial that students of bioscience, a ‘non-vocational’ subject, have awareness of the breadth of potential careers that can follow from their initial degree.

Over the past five years we have developed the Careers After Biological Science (CABS) programme. Former students are invited back to describe their current role and offer practical advice to undergraduates who may be considering moving into a similar discipline. The speakers’ career profiles and associated resources are then collated onto an open-access website for the benefit of the wider community.

This project is characterised by two principal innovations; the pivotal role of alumni in the delivery of careers education, and the integrated use of multiple social media (web2.0) technologies in both the organisation of careers events and development of an open access repository of careers profiles and associated resources.

The full article, “Here’s one we prepared earlier”: involving former students in careers advice, can be read here.

An instrument to evaluate Assessment for Learning

A&EinHE now has an impact factor

Assessment for Learning (AfL) has been a key notion in recent curriculum developments in both secondary and tertiary education (see this link for previous left-handed biochemist posts on AfL).

The December 2011 edition of Assessment and Evaluation in Higher Education featured a paper, Does assessment for learning make a difference? The development of a questionnaire to explore the student response, by Liz McDowell and colleagues from the recently closed AfL CETL at Northumbria. Quoting AfL guru Paul Black, the authors point out that the definition of Assessment for Learning has become overly flexible, “a free brand name to attach to any practice,” before clarifying that for them AfL must encompass six dimensions:

  • Formal feedback – e.g. from tutor comments or self-assessment
  • Informal feedback – e.g. from peer interaction or dialogue with staff
  • Practice – opportunity to try out skills and rehearse understanding
  • Authenticity – assessment tasks must have real-life relevance
  • Autonomy – activities must help students develop independence
  • Summative/Formative balance – involves an appropriate mix of both tasks that are “for marks” and those that are not

The bulk of the paper describes the development and testing of a questionnaire used for evaluation of students’ experience of a module. The questionnaire, which can be downloaded from the AfL CETL website, could be used to provide evidence to justify curriculum change and/or to support the case for quality enhancement. Each of the questions maps to at least one of the six key dimensions.

In analysing the use of this research instrument to evaluate modules at their own institution, the authors highlighted three principal factors distinguishing AfL and non-AfL courses: staff support and module design; engagement with subject matter; and the role played by peer support. Overall they suggest that the student experience was more positive in modules where AfL approaches were employed.

Institutional repositories, social media and academic publication: a simple experiment

Over at Science of the Invisible, my colleague Alan Cann has been reflecting on the contemporary landscape within academic publication. Specifically, he’s been thinking aloud about the role played by institutional repositories alongside (or, more radically, instead of) more formal journal publication (for example, see Wit’s End, which links in turn to Melissa Terras’ post What happens when you tweet an open access paper).

Institutional repositories are playing an increasingly important role in academic publishing

Prompted by Alan and Melissa’s enthusiasm for using social media to promote awareness of published work, in mid-November I started to use Twitter to advertise the existence of some of the papers I have deposited in the Leicester Research Archive (LRA). Some of my tweets were retweeted by others in the community, especially Alan, who also shared some of these within his Google+ circles.

Partway through this process it occurred to me that I had stumbled into a little experiment. So in the end I selectively tweeted about 8 of the 27 documents I currently have in the LRA. Admittedly these were probably the 8 papers that I felt were of most interest to the broader community on Twitter, but this did not mean they had previously received the most hits in the archive. In fact, if you rank the 25 works that had been in the Leicester repository throughout the 6 months (May to October 2011) from most to least popular, then these 8 were ranked: 4th, 5th, 10th, 12th, 13th, 18th, 23rd and 24th= (2 documents were not added to the archive until November).
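The ranking step described above can be sketched in a few lines. The document names and download counts here are hypothetical placeholders, not real LRA figures:

```python
# Hypothetical sketch: rank repository documents by six months of download
# counts (figures invented), then look up where the tweeted papers fall.
hits = {"paper_A": 310, "paper_B": 190, "paper_C": 150,
        "paper_D": 90, "paper_E": 40}
tweeted = ["paper_B", "paper_D"]  # the subset advertised on Twitter

# Sort document names from most to least popular.
ranking = sorted(hits, key=hits.get, reverse=True)

for doc in tweeted:
    print(doc, "ranked", ranking.index(doc) + 1)
```

Ties (the "24th=" above) would need a slightly more careful ranking function, but the principle is the same: the tweeted papers were not simply the archive's existing top performers.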