I’m excited to say that my book Biological Determinism, Free Will and Moral Responsibility: Insights from Genetics and Neuroscience is being published this week.
There are five chapters, in which I have attempted to pull together threads from moral philosophy, law and neuroscience to examine the growth of neurolaw. Around the world, notably in the USA and Italy, an increasing number of defendants are appealing to their genes, or to issues with the structure and function of their brains, as mitigation for their crimes. To what extent should we allow this, now or in the future?
- Free will and determinism: an overview of some of the main schools of thought regarding the “free will problem” – Libertarianism, Compatibilism and Hard Determinism.
- Existing legislation on mental disorders and criminal cases: automatism, criminal liability, diminished responsibility, “disease of the mind”, insanity, mens rea and M’Naghten.
- Biological basis of behaviour: background on behavioural genetics and the use of various brain imaging techniques to investigate the extent to which our behaviour might be “hard wired”.
- Use of genetic and neuroscientific evidence in criminal cases: a brief history of neurolaw. Summarises many of the key cases in which scientific evidence has been proffered in criminal cases as (partial) justification of the behaviour of the defendant.
- Are we ready for an expanded use of neuroscientific evidence in the courtroom?: In which I caution that the current use of genetic and brain physiology evidence is, at best, premature and uncertain.
My day was not scheduled to include a spot on News24
As I headed into work on Wednesday 2nd July, I had no idea that by the time I came home that evening I would have done two live interviews at New Broadcasting House, headquarters of the BBC.
I’ve done several radio interviews previously and have been in discussion with the makers of The Big Questions on at least three occasions about appearing on that show (one of which, tellingly, ended when the researcher declared I was “a bit too in the middle on the issue”). However, this was to be my first experience of being on television.
I was due to have an admin splurge in my office, before a scheduled trip to London in the afternoon for a trustees’ meeting. The news that morning had included an announcement by David Cameron that a new review was to be set up, looking into ways to tackle antibiotic resistance (see Antibiotic resistance: Cameron warns of medical ‘dark ages’).
I give final year undergraduate lectures on antibiotic resistance, so it is a topic about which I maintain an active interest. I was piqued by this announcement since it smacked of the Prime Minister climbing aboard the growing movement to tackle the problem (which IS serious, in case you were in any doubt), and because a call for a review inevitably means it will be even longer before actual steps are taken. The need for new antibiotics was known 20-odd years ago when I was doing a PhD on resistance to a major class of antibacterials, and since then the situation has got worse, not better.
It seems that November is shaping up as a bit of a European tour for me. Trips later in the month to Naples and Edinburgh have been on the cards for a while, but my friend and colleague Salvador Macip and I ended up popping over to Alzira, Spain on November 8th for 24 hours. This unusual behaviour was prompted by our success in winning the European Prize for the Popularization of Science.
This was the 19th year that the European Prize for the Popularization of Science has been awarded
Like many colleagues, I quite often give talks for sixth form groups about recent developments within my subject specialism. There are plenty of good reasons for doing so: sharing enthusiasm for your discipline; encouraging prospective students to go to university (ideally your University); and bringing students, and their teachers, up to date on the latest developments in the field.
However, it is the last of these points that has given me increasing concern. These worries were prompted by my experience marking past papers completed by my son during his recent round of exam revision. In science subjects in particular, the mark schemes are very prescriptive and inflexible; they don’t seem to allow for a candidate to expand upon the expected points. There is no room for crediting knowledge over and above faithful regurgitation of the core content. That would be bad enough, but my bigger concern is that introducing students to knowledge which is more up to date than the specifications might actually lead to a rich and factually correct response being penalised because it disagrees with the anticipated answer.
What content might fall into this trap? The most obvious examples would be developments in stem cell biology, especially innovations associated with induced pluripotent stem cells. Granted this work has now led to a Nobel Prize, but I expect many markers will not have kept pace with the field. Similarly, other areas of genetics may have moved faster than the “official” A level line.
I will continue to give lectures for schools; the benefits definitely outweigh the risks, but I do carry this gnawing worry. Maybe an examiner out there can put my mind at ease about this (maybe not).
Earlier today I had the privilege of attending* the annual Sluckin Memorial Lecture given by eminent Oxford neuroscientist and academic blogger Professor Dorothy Bishop. Dorothy’s theme was ‘Developmental dyslexia and other neurodevelopmental disorders: Distinct syndromes or part of normal variation?’. There was much in the talk worthy of blogging here, but since I’ve got a stack of final year dissertations to mark I will, for the moment, limit myself to reflections on one point that she raised.
Slide 19 in a presentation by Dorothy Bishop available on Slideshare (click image for link)
As with many conditions in the genomic era, there is a desire to find the underlying genetic ’cause’ for dyslexia. This search is not without justification. For example, classic comparison of monozygotic twins (“identical” twins, i.e. same genetics, notwithstanding any epigenetic influences) and dizygotic twins (“non-identical twins”, no more genetically related than any brother or sister) strongly implies that there is a genetic component to dyslexia.
There is stronger evidence than this, particularly for a correlation between dyslexia and the catchily named gene DCDC2. A 2005 paper in the Proceedings of the National Academy of Sciences, a “Premier League” academic journal, showed a link between specific mutations in this gene and reading disability. A subsequent paper by Tom Scerri and colleagues (including Dorothy) found that a particular Single Nucleotide Polymorphism (a SNP, i.e. a particular base change difference in the DCDC2 gene) was associated with 31% of dyslexics. It was also found in 23% of the control (i.e. non-dyslexic) group, but nevertheless the difference between the two is statistically significant (p = 0.005).
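If you are curious how a seemingly modest difference (31% vs 23%) can be statistically significant, here is a minimal sketch of a two-proportion z-test, the kind of comparison underlying association results like this. The group sizes (500 per group) are invented purely for illustration; the actual cohort sizes from the Scerri paper are not given here.

```python
# Sketch of a two-proportion z-test: does the SNP carrier frequency
# differ between two groups? Group sizes below are illustrative only.
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Return the z statistic and two-sided p-value for H0: p1 == p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error
    z = (p1 - p2) / se
    # Two-sided p-value from the normal distribution, via the error function
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# 31% of an assumed 500 dyslexic cases vs 23% of an assumed 500 controls
z, p = two_proportion_z(0.31, 500, 0.23, 500)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these invented group sizes the p-value comes out in the same ballpark as the quoted 0.005; the point is simply that with a few hundred participants per group, an 8-percentage-point difference is unlikely to be chance.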
I’m a big fan of both science and television and have blogged previously about their inter-relationship (e.g. Science on the telly and A new model for interaction of scientific research and TV?). I was therefore very interested to hear physicist and former pop star Prof Brian Cox delivering the 2010 Huw Wheldon lecture on the topic Science: A Challenge to TV Orthodoxy (available on BBC iPlayer until December 8th).
As the presenter of the excellent Wonders of the Solar System, Cox is ideally positioned to examine the tensions between science and television which, he notes, is “the primary medium for the dissemination of scientific knowledge to the non-specialist public” (01:25).
Brian Cox is Professor of Physics at the University of Manchester
There are, Cox notes, incompatibilities between the goals of science and television, though he is keen to emphasise that these are “occasional” and that he does not subscribe to the view that there are serious deficiencies in TV’s coverage of science. For example, a practising scientist must never have an eye on the audience, which would de facto compromise the impartiality of the process (08:10). In contrast, television programme-makers must have their viewers (and reviewers) in mind.
Prof Spiegelhalter gave a very thought provoking and informative talk at the University of Leicester
There is ample evidence that humans are frequently bamboozled by statistics, and the interpretation of risk factors is an area in which this is most apparent. For example, research from a number of countries shows that about a quarter of the population cannot correctly answer the question “Which is the highest risk factor: 1 in 100, 1 in 1000, or 1 in 10?” (Galesic and Garcia-Retamero, 2010).
This was just one insight during a fascinating lecture on Quantifying Uncertainty given by David Spiegelhalter, Winton Professor of the Public Understanding of Risk at the University of Cambridge. Spiegelhalter, a Bayesian statistician, began his talk with two quotes which he said were very useful in setting an appropriate perspective:
“probability does not exist” Bruno de Finetti
“all models are wrong, but some are useful” George Box
In other words, probability is not an intrinsic property of the outside world, but something we apply to it. Risks associated with a number of activities can be compared using a standardised unit, the micromort, invented by Stanford University statistician Ron Howard and defined as a one-in-a-million chance of dying. For example, how far can you travel by different means for a risk of one micromort? Answer: driving = 250 miles, cycling = 20, walking = 17, motorcycling = 6, hang-gliding = 8, scuba-diving = 5 and skiing = 0.5. Comparisons of this sort lay at the heart of David Nutt’s famous assertion that taking ecstasy carries the same risk as horse-riding.
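The figures above can be flipped into a like-for-like comparison (micromorts per 100 miles), which makes the relative risks easier to see at a glance. The numbers are simply those reported from the talk, not independently verified.

```python
# Miles travelled per micromort (a one-in-a-million chance of dying),
# as quoted from Spiegelhalter's talk.
miles_per_micromort = {
    "driving": 250, "cycling": 20, "walking": 17,
    "hang-gliding": 8, "motorcycling": 6, "scuba-diving": 5, "skiing": 0.5,
}

# Convert to micromorts per 100 miles: safest mode first.
for mode, miles in sorted(miles_per_micromort.items(),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{mode:>13}: {100 / miles:7.1f} micromorts per 100 miles")
```

On this scale driving works out at 0.4 micromorts per 100 miles, while the skiing figure as quoted implies 200, which shows how dramatically the same unit can spread across activities.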