What characterises “quality” in ethics education?

I recently read Ercan Avci's 2017 paper Learning from experiences to determine quality in ethics education (International Journal of Ethics Education 2:3-16). Avci, from Duquesne University, conducted a literature review looking for shared characteristics in peer-reviewed, full-text articles published during the period 2010-2015 (which the author describes as "the last five years", though it looks like six years to me) with "ethics education", "ethics teaching" or "ethics learning" in the title and "ethics" or "ethics education" in the keywords. A total of 34 papers were examined, drawn from 11 academic disciplines and 10 countries (plus 3 international studies). As one might anticipate, the USA was the most represented geographical context, and healthcare (Nursing, Medicine, etc) was the discipline with the highest number of studies. I was a little surprised to see that none of the reports were from the UK.

As the author himself points out, this is a rather eclectic mix of settings. This might be spun either as an advantage (e.g. capturing diversity) or as a limitation (when it comes to drawing universal lessons). Notwithstanding these issues, Avci makes a number of important observations, some of which resonate with my own experience (e.g. see the Notes for the Tutor section, p16 onwards, in my contribution to the 2011 book Effective Learning in the Life Sciences).


Taking a step back, there is an initial question to ask before examining the quality of any ethics programme: is ethics being taught at all? It is apparent that many courses – even in Medicine, even in the States – do not include a formal ethics component. However, a broad range of subjects now include some ethics in their teaching.


Pedagogic Journal Club 101

In preparation for a journal club I'm leading shortly, I was reflecting on some generic starter questions that could be applied to reading any paper (they apply equally when reading an article on your own).

To start with, you can apply the 5W1H approach. In the context of reading a journal article I tend to take these in a non-typical order:

  • Who? Who conducted the research?
  • Where? Was it one institution only or a multi-centre project? UK, USA or elsewhere?
  • What? What, briefly, was the main point of the work [you will look at this in finer detail later on]?
  • When? Not only when was the work published, but when was the work actually conducted? This is especially pertinent if the article is describing the impact of technical innovations.
  • Why? What are the reasons the authors give for conducting the work? These may be generic and/or driven by particular local developments.
  • How? This is the nitty-gritty and will take up the bulk of a journal club discussion.

As part of the “how” there are additional key questions to bear in mind as you work step-by-step through the paper. These are:

  • What key information are we being presented with in this section of the paper?
  • What key information are we *not* being presented with in this section of the paper?

In both pedagogic research articles and scientific papers these two questions are particularly valuable when examining information presented in figures and/or tables. Sometimes the background details necessary to follow the implications of the displayed data have to be found elsewhere in the text, and sometimes they are missing entirely (at which point you need to decide for yourself whether this is an accidental or a deliberate omission).
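
If it helps to have the prompts in one place, here is a minimal sketch (the naming and structure are my own, not part of any established tool) of the questions above captured as a reusable checklist you could print out and annotate before the session:

```python
# A reusable journal-club preparation checklist (illustrative only).
STARTER_PROMPTS = {
    "Who":   "Who conducted the research?",
    "Where": "One institution or a multi-centre project? UK, USA or elsewhere?",
    "What":  "Briefly, what was the main point of the work?",
    "When":  "When was it published, and when was the work actually conducted?",
    "Why":   "What reasons do the authors give for conducting the work?",
    "How":   "How was the work done? (This takes up the bulk of the discussion.)",
}

SECTION_PROMPTS = [
    "What key information are we being presented with in this section?",
    "What key information are we *not* being presented with in this section?",
]

def print_checklist():
    """Print the 5W1H starter prompts, then the per-section questions."""
    for heading, question in STARTER_PROMPTS.items():
        print(f"{heading}? {question}")
    print("\nFor each section, figure or table:")
    for question in SECTION_PROMPTS:
        print(f"- {question}")

if __name__ == "__main__":
    print_checklist()
```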

For a journal club specifically you also need to remember that it is intended to be a discussion, not a presentation of what you have found; you are the guide as you lead a band of intrepid explorers below the surface of the paper. If the journal club is working well, you will come away from the process with additional insights the other participants have made about aspects of the text that you had missed.

The NSS and Enhancement (Review)

Coverage of the findings from the recent, new-style National Student Survey drew my attention to the Making it count report for the Higher Education Academy, coordinated by Alex Buckley (I'm afraid I've lost details of who pointed me towards the report, so cannot offer credit where credit is due).

Making it count is not new: it was published by the HEA in 2012, and therefore predates both the new NSS and the introduction of the TEF. Nevertheless, I found it a fascinating and worthwhile read – hence this reflective summary.

As most readers of this blog will know, the UK National Student Survey was introduced in 2005 and drew inspiration from the Australian Course Experience Questionnaire (CEQ), which had been in use since the early 1990s. From inception until 2016 there was a standard set of 23 questions in the NSS (see this link for the complete list). The questions were all positively phrased, and students in their final year were invited to respond using a standard five-point scale from "definitely agree" through "mostly agree", "neither agree nor disagree" and "mostly disagree" to "definitely disagree" ("not applicable" was also an option). Following extensive consultation, the questions were changed for the first time in 2017. A total of 27 questions were included, with some original questions retained, some rephrased and some brand-new questions added (see this link for the 2017 questions).
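
Purely as an illustration (this is my own sketch, using hypothetical data, and not the official NSS methodology), here is how responses to a single positively-phrased question on that five-point scale might be tallied into an overall agreement percentage:

```python
from collections import Counter

# The five-point scale described above; "not applicable" is excluded from the tally.
SCALE = [
    "definitely agree",
    "mostly agree",
    "neither agree nor disagree",
    "mostly disagree",
    "definitely disagree",
]

def percent_agreement(responses):
    """Percentage of valid responses that 'definitely' or 'mostly' agree."""
    valid = [r for r in responses if r in SCALE]
    if not valid:
        return 0.0
    counts = Counter(valid)
    agree = counts["definitely agree"] + counts["mostly agree"]
    return 100 * agree / len(valid)

# Hypothetical responses to one question
example = (["definitely agree"] * 40 + ["mostly agree"] * 35 +
           ["neither agree nor disagree"] * 10 + ["mostly disagree"] * 8 +
           ["definitely disagree"] * 5 + ["not applicable"] * 2)
print(f"{percent_agreement(example):.0f}% agreement")  # -> 77% agreement
```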

When is the right time to stop taking antibiotics?

Press coverage has picked up on an interesting paper, The antibiotic course has had its day, published in the British Medical Journal (online 26th July 2017). The paper was of interest to me because I studied antibiotic resistance for my PhD, and this topic was also the theme of (to date) my only appearance on TV news.


As anyone who has ever been prescribed antibiotics ought to know, current guidance from the World Health Organisation and others recommends completing the course (often 7 days), even if the patient feels better sooner. The justification for this strategy has been the concern that ending treatment prematurely might allow the disease-causing bacteria to recover and continue to wreak havoc, possibly in a newly resistant manner.

In the new paper, Martin Llewelyn (Brighton and Sussex Medical School) and colleagues from a number of institutions in South-East England question the basis of this recommendation. Although the link between exposure to antibacterials and the development of resistance is well documented, these authors wondered about the origins of the advice to finish the course. They suggest that the requirement to "complete the course" probably stands on little more than the anecdotal experience of some of the antibiotic pioneers.

Capturing more than lectures with “lecture capture” technology (paper review)

The July 2017 edition of the British Journal of Educational Technology includes a pilot study, The value of capture: Taking an alternative approach to using lecture capture technologies for increased impact on student learning and engagement, investigating the potential to exploit lecture capture technologies for the production of teaching resources over and above the recording of lectures per se.


I was keen to read this paper because I am already using Panopto (the same software used in the study) to generate short "flipped classroom" videos on aspects of bioethics which, it is hoped, students will watch before participating in a face-to-face session. I have also produced some ad hoc materials (which author Gemma Witton terms "supplementary materials"), for example to clarify a specific point from my lectures about which several students had independently contacted me. Furthermore, I have written some reflections on the impact lecture capture is already having on our courses (see Reflecting on lecture capture: the good, the bad and the lonely).

Headline Bioethics

I have mentioned the Headline Bioethics project here previously, including links to a poster I presented at the Leicester Teaching and Learning event (January 2013) and again at the Higher Education Academy STEM conference (April 2013).

A paper giving more details about the task was published last week in the journal Bioscience Education. The abstract states:

An exercise is described in which second year undergraduate bioscientists write a reflective commentary on the ethical implications of a recent biological/biomedical news story of their own choosing. As well as being of more real-world relevance than writing in a traditional essay format, the commentaries also have potential utility in helping the broader community understand the issues raised by the reported innovations. By making the best examples available online, the task therefore has the additional benefit of allowing the students to be genuine producers of resources.

This is not, incidentally, to be confused with the other activity I’ve been doing with a different cohort of second year students in which they produce short films about bioethics (the paper on that subject is forthcoming).

 

Fundamental flaws in (cancer) research

A recent TED talk by Ben Goldacre drew my attention to an excellent Nature article on fundamental flaws in cancer research. The Comment Raise standards for preclinical cancer research (subscription required), by Glenn Begley and Lee Ellis, discusses some systematic weaknesses in basic biomedical research and proposes solutions that would address some of these problems.

Nature 483:531–533 (29 March 2012) doi:10.1038/483531a

As part of their work at the Amgen pharmaceutical company, the authors had tried to replicate the findings of 53 "landmark" papers reported to reveal important advances in our understanding of the molecular biology of cancer. Despite their best efforts – including contacting the scientists responsible for the original studies, obtaining resources from them and, in some cases, visiting their labs to repeat the protocols there – Begley and Ellis only managed to reproduce the published results in 6 (11%) of cases. We are not told which experiments were replicable, or perhaps more importantly which were not, since confidentiality agreements had been made with several of the original authors (a point that was made post hoc in a clarification statement).
