Research transparency and the Emperor’s new clothes

I’m currently reading Neil Manson and Onora O’Neill’s excellent Rethinking Informed Consent in Bioethics (about which there will be much more on this site when I’ve finished it). One of the issues that arises is that you must not only obtain consent in an appropriate way (a first-order informational obligation) but also collect evidence to show that you have done so (a second-order informational obligation).

This set me ruminating about the modern obsession with generating paperwork to ‘prove’ something. This is probably most acute in my thinking because I am also monitoring my hours this week for the “Research Transparency Exercise”. The process has different names in different institutions but, as I understand it, the idea is that all academics in all UK HE institutions record their exact use of time for one week during each calendar year.

My question is – what’s the point? I know it’s a cliché, but the nature and duration of my commitments in any two weeks can differ wildly, so what meaningful data is going to arise from extracting information from any one of them and pretending it is typical? For staff with a heavy teaching commitment the difference between term-time and vacation-time monitoring is going to be particularly stark.

During term-time I would routinely work 50+ hr weeks (I’m not proud of that, it is probably not good for me or for my family, it just happens to be a fact). This week, the one where I’m counting my minutes, the University has been closed for two days for Easter and whilst not formally on holiday I’m having to fit in a certain amount of sprogwatch around my wife’s commitments. This is not a typical week.

Now, it’s only human nature that I’m more aggrieved than I might have been, as the data will appear to show a lack of commitment. In reality, of course, I ought to be equally affronted if I’d been called upon to document a 65 hr week. What is one to do? As a scientist I’ve tended to take these things seriously and to produce real data – should I instead be pre-determining the hours and distribution in my best-guess, averaged, homogenised week?

This is where the absurdity of exercises of this type becomes most apparent. Sadly this foolishness is not confined to academia; there are professions conducting surveys of this type where the hours you are reported to work must hit precisely the hours that you are contracted to work – no more, no less. Under these circumstances it is (allegedly) common practice to start at the answer required and contrive the data to fit. What on earth is the point of that?

Now then, 30 minutes spent writing a bolshy blog post – which box of the research transparency form does that go in?


  1. Yes, but how else can we draw up league tables? ;-)

    • If this data goes into league tables (as opposed to being lost in a filing cabinet never to surface again) then that surely makes the data’s lack of validity all the worse

      • League tables aren’t about validity, they’re about position.

  2. Do we ever see any of this data at an institutional level? I’d be really interested to know if Annette or anyone is using it to see what amount of time we officially declare is devoted to teaching in relation to the proportion of total university income it accounts for.

