On 8th April 2008, the University of Leicester played host to a conference organised by the Centre for Bioscience of the Higher Education Academy (Editorial note: apologies it took so long to get this post up – it was an excellent day conference so I hope you’ll find the material still relevant. More notes can be seen at the official Centre for Bioscience summary of the event).
Cooking the books?
First up was Fiona Duggan from the JISC Academic Integrity Service. Fiona started by highlighting recent discussion in the media about Delia Smith’s book How to Cheat at Cooking – is it really “cooking” to use frozen mash? Computer games have a built-in capacity to “cheat”. Are these symptomatic of a change in the acceptability of cheating in society?
Fiona then posed the question “How are students to know what is acceptable practice?”. She cited a post to Deliaonline where a beginner was brave enough to say they didn’t understand what “browning” mince meant. Similarly, students may not know all the ‘obvious’ things we assume that they do. They may have a different take on the ‘rules’; for example, some students argue that referring to online essay banks was simply a way to see model answers, since staff had failed to provide exemplar material outlining what was expected of them. Tackling this may involve some fairly remedial work (from a lecturer’s perspective), and several academic colleagues may feel under-equipped for this sort of thing. Involving librarians and/or information technologists may be useful.
What do instructors and institutions need to do to tackle plagiarism? Several universities have adopted an Enforcement model, punishing wrong-doing, but this does not work if students don’t know what it is that they’ve done wrong. There are also worries about inconsistent practice by institutions across the HE sector. Enforcement alone only treats the symptoms, not the cause.
A second strategy builds plagiarism prevention around an Ethics model – based particularly on the honour-code model in the USA. This is a long-term project; it’s about changing the culture through the institutions – note by comparison how many years it took to change the drink-drive culture in the UK. No empirical evidence on honour codes is available yet from the UK, although work is under way on this approach, particularly at Leicester and Liverpool.
A third strategy could be described as Engineering, that is re-designing our assessments to make them harder to complete by cheating. Don McCabe of Rutgers University has been particularly active in this approach.
Fiona went on to discuss the phenomenon of ‘contract cheaters’, i.e. people who pay for their assignments to be completed by someone else. She highlighted a study on computing students who post their assignments online, with other students bidding to do the work. The most worrying thing was that nearly 50% of students who were doing this did so on more than one occasion. Jude Carroll chipped in at this point to say that it wasn’t simply a last-minute panic – students were posting assignment requests on sites such as RentaCoder shortly after they were set.
We have to look at the way we design our assessments to reduce the possibility of cheating. This frequently prompts howls of disapproval from colleagues – “It’s such a lot of work” – and yes, it will involve more work, but there are lots of ideas out there that you can pinch! People are developing all sorts of engaging and interesting activities, such as games to teach appropriate use of sources and citation practice.
Conclusions from Fiona’s session: students need to be discouraged from using “frozen mash and tinned mince” approaches to their studies. At the same time, lecturers need to be discouraged from using a Blue Peter “here’s one I used earlier” approach to setting assessments. We need to emphasise that we are involved in the process more than the product. We should aim to help students move from being ‘don’t-know-hows’ to being confident users of information. She finished with a quote from the Times Higher Education Supplement (as it then was) from March 2007: “learning practical chemistry shouldn’t simply be about ‘taxing’ the student to follow a long and complex recipe. It should be about making the student think about the techniques they are using and why their chosen technique is superior for that compound”.
Fiona’s slides can be seen here.
Electronic detection of plagiarism
Following on from Fiona, Jo Badge, web resources development officer at the University of Leicester, talked about use of the TurnitinUK software and SafeAssign, a second package designed for integration with the Blackboard VLE system. Turnitin was originally introduced into the UK in 2002, having been developed previously in the USA. The system operates by looking for matching text, in a manner analogous to genome-matching software. String-searches, of course, do not in themselves prove or disprove plagiarism. Submitted work is held on a national database, in perpetuity. Turnitin can be integrated with Blackboard, and responsibility for submission can be devolved back to the student. Turnitin has been in use at the University of Leicester since October 2004. Within biosciences it is routinely used for screening of essays in Years 2 and 3, and latterly for some Year 1 assignments. For final year projects, students are asked to remove their references and images prior to electronic submission; the former to reduce cross-matching, the latter to reduce file size. Students can use the quickview feature to check they’ve submitted the right file, and they receive an e-mail to say that submission has been successful. This acts as a receipt and offers reassurance to students concerned that their precious work may have disappeared into the ether, never to be seen again.
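The text-matching idea described above can be pictured with a toy sketch: treat each document as a set of overlapping word n-grams and report what fraction of the submission’s n-grams also occur in a source. To be clear, this is purely illustrative – the tokenisation, window size and scoring here are my own assumptions, not Turnitin’s actual algorithm.

```python
def ngrams(text, n=5):
    """Return the set of overlapping n-word windows in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_percentage(submission, source, n=5):
    """Percentage of the submission's n-grams that also appear in the source.

    Note: a high score flags matching text for human inspection; it does not,
    in itself, prove plagiarism (the match might be quotes, references or
    unavoidable technical phrasing).
    """
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return 100.0 * len(sub & ngrams(source, n)) / len(sub)
```

This also illustrates why removing references before submission reduces spurious matches: bibliographies are near-identical strings across many essays, so they inflate the overlap without indicating copying.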
SafeAssign is a buyout of an earlier product, MyDropBox. It comes “free” as part of a very expensive Blackboard service, and is likely to become more popular with the roll-out of Blackboard version 8. There are a number of unsatisfactory features about the package at the moment; e.g. students don’t get a confirmation e-mail, and it uses Microsoft Windows Live Search, which is not a brilliant tool. In addition, it updates its searches over time, incorporating materials added more recently into the comparison. Turnitin reports, in contrast, are fixed – if you determine (via the software) that the submitted work is a 20% match to another essay, let’s say, that number won’t have changed next time you come back to Turnitin, but it might if you are using SafeAssign, and this may be confusing. Side-by-side view, one of the attractive features of Turnitin, is limited in SafeAssign, and there is no option to remove quotes/references. The description of the comparison as a “% probability” is rather confusing.
Jo went on to elaborate on the context at Leicester. The School has approximately 100 taught postgraduates per year and 600 undergraduates. In November 2004 a retrospective pilot study was carried out using samples of work which had coincidentally been submitted electronically in the previous year. This was followed by an initial trial involving 14 modules. These initial studies threw up examples of cut-and-paste copying, patchwork writing (i.e. sewing together of multiple sources, largely unaltered), close paraphrasing and some collusion. There didn’t seem to be much inter-year copying, although interestingly she has now seen some examples of students reusing their own work in different contexts in different years. Jo reported the following numbers of cases of plagiarism detected against the number of items tested in that year: 2003–04, 11 of 97; 2004–05, 34 of 513; 2005–06, 21 of 1430; 2006–07, 63 of (unknown). Scanning of first year work has generated larger numbers, but it is better that the issue arises at that stage, since Year 1 does not ‘count’ towards the degree, and it allows students to learn about acceptable and unacceptable practice before it has a detrimental effect on their degree performance. A sliding scale of penalties has been introduced, based on the severity of the case (how much has been copied) and whether it is a first offence or not.
Jo finished with a number of ‘live’ issues concerning the use of plagiarism detection software – should you allow students to self-check their own work with the software prior to formal submission? Leicester currently does not allow this; other places do. A low percentage match score does not guarantee an essay is original – copying may have come from a password-protected source not available to the comparison software, or the essay may have been commissioned by the student. There also needs to be a certain flexibility about what constitutes plagiarism – you cannot apply a strict “above X% is plagiarism, but less is ok” rule, since a given percentage, e.g. 10%, may represent a single paragraph lifted verbatim or lots of short phrases of technical language which it would be hard to rephrase. The key thing is that a consistent approach is taken.
Jo’s slides can be seen here.
Teaching students what plagiarism is to prevent it
Next up, Maureen Dawson and Joyce Overfield (Manchester Metropolitan University) talked about a project started about five years ago to investigate what students and staff understood about plagiarism. This, in turn, had led to the production of guidelines for their Department. I won’t say too much more about it here as this project has been written up in the Centre for Bioscience journal Bioscience Education. In essence, they presented students with MCQs and case scenarios, developed from genuine examples, and asked them which examples they considered to constitute plagiarism. They stressed the need to regularly reinforce the rules via an online version of the activity (password protected).
Maureen and Joyce’s slides are available here.
Improving scientific literacy to prevent plagiarism
Dorothy Aidulis began by setting the scene for work she has been involved in at the University of Glasgow. She believes that we not only need to clarify what plagiarism is, but also to help students with their understanding of how science works. She has therefore developed scientific writing workshops as a means to improve scientific literacy.
Dorothy showed a typical definition of science, drawing our attention to the active processes: “understanding… making observations… collecting data… explaining them…” That being the case, why does our teaching of science so often overemphasise regurgitation of a “body of knowledge” at the expense of elaborating the “process of study”?
In Dorothy’s workshop, she starts by issuing students with post-it notes and gives them one minute to write their own definition of plagiarism. After a talk about University regulations and the reasons for those regulations, they move on to a set of activities. In the first, students are given example texts and highlighter pens and essentially act as a “manual Turnitin”, picking out sections that they believe are guilty of plagiarism. They then move on to another activity, on summarising texts in their own words, before thinking about appropriate referencing practice. Afterwards, she collects feedback using a “something you did not know – something you knew already – something you’ve changed your mind about” model.
Dorothy’s slides are available here.
Using course and task design to deter students from plagiarism
Jude Carroll, Deputy Director of the Assessment Standards Knowledge Exchange (ASKe) CETL at Oxford Brookes University, led the day’s interactive session. In her workshop, Jude was keen to get us thinking about how we can reduce the amount of plagiarism that occurs by altering the format of the assignments we set. She emphasised that it really needs a set of fresh eyes, not the module convenor, to make a course plagiarism-proof. We need to move from “find it” tasks to more “make it” assignments – Google exists, it’s not going away, so setting relatively trivial knowledge-unearthing activities is an invitation for students to copy sources. One example she gave: replace “write an essay on smoking and public health” with “Find three ‘stop smoking’ websites. Create criteria to judge which will best improve public health, rank them and justify the ranking”.
Jude asked us to consider why students commit plagiarism. It is not just wilful desire to cheat; some (possibly most) is inadvertent. Misunderstanding (not knowing the rules), misuse (knowing the rules but not how to fulfil them) and misconduct (knowing the rules and knowing how to fulfil them, but not bothering) can each be the underlying reason for plagiarism. Jude encouraged us to think about our role as educators in “academic apprenticeship” and how we can both ‘design out’ bad practice and ‘design in’ appropriate skills. This includes building in opportunities for early diagnosis of, and feedback on, following/failing to follow academic rules and conventions. She stressed the need for excellent written support and guidelines which students can refer back to after a session (and not just online). If we want students to read our support materials we need to make sure they look good – this conveys that we are serious about the issues, and encourages them to invest similar attention. This led into discussion, more generally, of the importance of modelling by academics; do we, for example, apply the sort of citation standards we expect from students when we are producing our module handouts and our powerpoint slides? (in my notes from the event I have a comment here about ‘sheep-dipping’, but have no idea what it means! Perhaps it was to do with getting to the student before they get infected – can anyone who was there help me out?!?)
When it comes to designing out plagiarism, it is important to start early – habits and behaviour rapidly get ingrained, and if we feed students a consistent diet of “find and display” tasks then this is what they’ll learn to deliver. If we persist in offering that kind of regurgitation assignment we mustn’t be surprised when we end up with 10% of our classes on the offenders register! Rather than just moaning about the poor planning skills of the modern student, we need to acknowledge the weakness and find ways to tackle it; for example, design compulsory stages into the task, make it necessary to see a ‘learning log’ regularly and sign it. As mentioned above, review the assessment criteria and emphasise the process, not the product. Reward good referencing practice, don’t just penalise poor examples. Make work public and owned, introduce an element of peer review. Vary the tasks between different academic years, rotating them on a cycle of at least three years’ duration.
Jude’s slides are available here.
The event finished with a series of shorter presentations by conference delegates. Jon Scott (University of Leicester) kicked off with Policy evolution and the elusive grail of consistency.
Jon noted that there had been changes in codes of conduct and regulations over the years. There were a variety of drivers in this process: differences in student background; advances in technology, with students responding faster and leaving institutions, as gamekeepers, playing catch-up; and pragmatism. An important goal was to seek consistency in the application of penalties. Given the number of cases or possible cases of plagiarism being unearthed by plagiarism software, it would be impossible to host the traditional panel meeting for every offence. Biological Sciences at Leicester has therefore moved to a fixed-penalty ‘on the spot’ fine for first offences, akin to receiving a letter about a speeding offence. Students can appeal, but 99% take it on the chin, although this does not distinguish those who are tacitly admitting the offence from those who can’t be bothered to appeal. Both students and staff need further training, the latter to tackle residual subjectivity in the initial reporting of offences. Jon’s slides can be seen here.
The second swapshop presentation was by Viv Rolfe from De Montfort University, who shared some of her work Using Turnitin for essay drafts and final submissions. Turnitin checking was made mandatory for Year 1 from September 2008 – at short notice! – covering both draft and final submissions, delivered through fun three-hour sessions, with students allowed to view a report on a draft once. Of 93 students, 31 cases of plagiarism and one of collusion were detected. Some made no attempt to correct blatant copying; those who did make changes resorted to breaking up matched strings and word-swapping, with no genuine effort to synthesise. Not exactly improving academic practice! Viv’s slides can be seen here.
Finally, Stuart Johnson demonstrated Don’t Cheat Yourself, his series of subject-specific online tutorials to help students understand what plagiarism is and how to avoid it. The first tutorial in the series was an adaptation of the paper-based exercise we had developed and published in the Journal of Biological Education a few years back. Since then the core activity has been tailored to suit a range of different disciplines by the inclusion of subject-specific examples. Stu’s online tutorials are available here.
The Higher Education Academy Centre for Biosciences event report is available here.