Jump to follow-up

The row about the redundancies, or rather firings, at Queen Mary rumbles on.

I’ve already written about it twice in Is Queen Mary University of London trying to commit scientific suicide?, and in Queen Mary, University of London in The Times. Does Simon Gaskell care?. But wait, there is more to come.

The harm done to teaching at Queen Mary was outlined in a report written at the request of Simon Gaskell. He appears to have ignored it entirely. So let’s concentrate on research.

Some explanation of the bizarre behaviour of the Queen Mary management can be gleaned from Queen Mary’s Frequently Asked Questions:
Restructures and Reviews in Academic Departments 2011-12. This says

“research-related metrics” used in the School of Biological and Chemical Sciences are the result of “extensive consultation with staff and include both the Australian [Research Council] journal classification system as well as impact factor”.

Let’s skip over the fact that the “extensive consultation” was largely a sham. That’s standard procedure (and not only in universities).

Australian journal classification

The reference to the Australian journal classification is revealing. Australia has been noted in the past for being one of the worst places for abuse of publication metrics. Its 2010 Journal classification was utterly bizarre [download Excel]. It ranks 20,712 journals as being A*, A, B, or C (did the rankers really look at all of them?).

I’ll take one example from my own area: the Journal of General Physiology is probably one of the most-respected journals for electrophysiology with emphasis on mechanisms. Many of its papers are quite mathematical. In its niche, it is hugely respected for the quality of papers and for its high integrity [declaration of interest, I’m an editor, and was hugely flattered by that honour]. But the Australian 2010 list ranks the Journal of General Physiology as B, the same as Brain Research, British Journal of Religious Education and Chinese Medicine (a journal of quack medicine).

In contrast, the Nature journals all get A*, as do many review journals which don’t publish original research at all. But if you can’t get a paper into Nature, there are other A* journals that might help you keep your job, for example Television and New Media or Tourism Management, as well as some joke journals like the Journal of Alternative and Complementary Medicine and Complementary Therapies in Medicine.

No doubt you’ll find similar ludicrous anomalies in your own field.

Simon Gaskell has not answered emails from me, and he hasn’t answered those from his own staff either, but he did emerge briefly from his shell last week, in The Times and at greater length in today’s Times Higher Education. The latter, though longer, consists almost entirely of the vapid self-congratulation which all vice-chancellors feel compelled to spout. Only one short paragraph is devoted to answering his critics.

“Where academic performance has been assessed, it has been important to do so on the basis of objective criteria including metrics – any subjective assessment would be quite unacceptable. These objective criteria were based on generally recognised academic expectations . . . “

I find it almost impossible to believe that a vice-chancellor should be so out of touch as to believe that counting papers, and using impact factors to judge people, are “generally recognised”. The extent of the misunderstanding of metrics is illustrated by two statements from Matthew Evans, head of the School of Biological and Chemical Sciences. He is quoted as saying:

“Impact factor reflects the number of times an average paper is cited, [so] is a good indication of how many citations a particular paper is likely to achieve,”

Anyone who understands the difference between mode, median and mean of the highly skewed distribution of citations would not make an elementary mistake like that. It is simply statistical illiteracy.
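To see the point, here is a minimal sketch (in Python, with made-up numbers, not real citation data): citation counts are drawn from a hypothetical lognormal distribution, used only because it is strongly right-skewed, and the mean is then compared with the median and the mode. The mean, which is what an impact factor measures, comes out well above what a typical paper achieves.

```python
# A sketch only: hypothetical, strongly right-skewed "citation counts",
# illustrating why the mean (what an impact factor measures) says little
# about what a typical paper is likely to achieve. Not real citation data.
import numpy as np

rng = np.random.default_rng(0)

# 10,000 imaginary papers; the lognormal parameters are arbitrary, chosen
# only to give the heavy right skew that citation distributions show.
citations = np.round(rng.lognormal(mean=1.0, sigma=1.5, size=10_000)).astype(int)

mean_c = citations.mean()                      # the "impact factor" style average
median_c = np.median(citations)                # what the typical paper gets
mode_c = int(np.bincount(citations).argmax())  # the most common outcome

print(f"mean   = {mean_c:.1f}")
print(f"median = {median_c:.1f}")
print(f"mode   = {mode_c}")
# With a skew like this the mean is several times the median, and the most
# common outcome is lower still: the average is driven by a few highly
# cited papers, not by the typical one.
```

The same asymmetry holds for real journals: a handful of very highly cited papers dominate the mean, so the journal average tells you next to nothing about any particular paper.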

Evans also

"…described metrics as a “vital tool” in assessing academics’ contributions to research and “the only empirical way of measuring success in science”."

Is Matthew Evans not aware that there isn’t the slightest evidence that any metric predicts the future success of a scientist? It’s an evidence-free zone. Perhaps he should read about how to test social interventions.

It’s certainly evident that Gaskell’s management team can’t use Google. After getting the hint about the use of Australian rankings, it took me five minutes to discover this statement.

"On 30 May 2011, the decision not to use ranked outlets for ERA 2012 was announced".

It’s more than a year since Kim Carr, at that time the Australian minister for innovation, industry, science and research, announced the result of a review of the way the next Excellence in Research for Australia (ERA) exercise would be conducted by the Australian Research Council (ARC).

“There is clear and consistent evidence that the rankings were being deployed inappropriately within some quarters of the sector, in ways that could produce harmful outcomes, and based on a poor understanding of the actual role of the rankings. One common example was the setting of targets for publication in A and A* journals by institutional research managers.

“their existence was focussing ill-informed, undesirable behaviour in the management of research – I have made the decision to remove the rankings, based on the ARC’s expert advice”.

It seems that Professor Gaskell is unaware of this, since he is enforcing the very “ill-informed, undesirable behaviour in the management of research” that Australia has dropped.

That is what I’d call a major cock-up.

HEFCE also reviewed the role of metrics. Their pilot tests were, almost needless to say, not properly randomised, but the conclusion was much the same as in Australia. Is Gaskell not aware that REF instructions say

“No sub-panel will make any use of journal impact factors, rankings, lists or the perceived standing of publishers in assessing the quality of research outputs”

The very people Gaskell seeks to please have strongly condemned his methods.

If that is not “bringing your university into disrepute”, I don’t know what is.

The responses to Simon Gaskell

This week’s Times Higher Education published three letters in response to Gaskell’s article.

The lead letter was signed by thirteen Queen Mary academics. This gives the flavour.

“Here we point to the unintended consequences of restructuring already in evidence. These include undermining morale in the schools and departments concerned; the flight of talented colleagues to other institutions; the consignment of teaching to lecturers in casual employment or those deemed unfit for research; scandalous gender disparity; and the lopsided, counterproductive allocation of resources. When staff are dismissed, replacements can come only from other institutions that have been willing to invest in people, research and scholarship. As a part of normal academic life, mobility is acceptable, even desirable, but when enforced on the scale envisaged at Queen Mary, it is random slaughter offset by poaching.”

My letter described the cock-up over the ARC Journal classification.

The third letter was from Fanis Missirlis, who was co-author of the Lancet letter, Queen Mary: nobody expects the Spanish Inquisition. As a result he was fired at one day’s notice. He is one of the academics whose “careers are destroyed by decimal points in spurious calculations”.

The Queen Mary process was summed up rather well when the University of Sydney went through a similar convulsion. An Australian academic referred to

"retrenchment exercises driven by “crass, bureaucratic, quantifiable simulacra of genuine research”.

Follow-up

August 23 2012.

I see it is now official. One of Queen Mary’s great stars, Lisa Jardine, is leaving (and, as it happens, coming to UCL). Learn more about her in an interview with Laurie Taylor.

She discovered lost papers by the great Renaissance scientist Robert Hooke: watch the video.

20 December 2012

It seems that the predicted bad effects are coming true. Times Higher Education carries a letter from a biology undergraduate, Matthew James Erickson, which is highly critical of the effect of Gaskell’s policies on teaching at Queen Mary.

“From my perspective as a first-year undergraduate, the aggressive restructuring has had a profoundly negative effect on my opinion of my university.”

It is rather wonderful when a first-year undergraduate dares to speak truth to power. Well done, Matthew James Erickson.
