
Statistical snake oil, again

Or: where is Darrell Huff when you need him? The Chronicle of Higher Education, drawing on the services of Academic Analytics LLC, presents lists of departments and institutions ranked by “productivity”. Here are the 2007 and 2006 rankings for philosophy:
[Table: philosophy departments ranked by productivity index, 2007 and 2006]
The numbers indicate by how many standard deviations a program exceeds the mean. Source: Chronicle of Higher Education. Left column: 2007. Right column: 2006.
The measure is based on statistics concerning publications, grants, awards and honors, and so forth. These are normalized and weighted to yield the composite scores you see above. It should be clear that although the scores are significant, the rankings aren’t. They’re too volatile. Only three departments manage to remain in the top ten from 2006 to 2007.
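Academic Analytics doesn’t publish its formula, but the description suggests a weighted sum of standardized (z-score) component measures. Here is a minimal sketch in Python of how such a composite might be computed; the counts and the weights are invented for illustration and are not Academic Analytics’:

    import statistics

    def z_scores(values):
        # Standardize: how many standard deviations each value sits from the mean.
        mean = statistics.mean(values)
        sd = statistics.pstdev(values)
        return [(v - mean) / sd for v in values]

    # Invented per-faculty counts for three departments.
    raw = {
        "articles": [10, 7, 4],
        "books":    [2, 3, 1],
        "grants":   [5, 2, 6],
    }
    # Invented weights; the FSP's actual weights are proprietary.
    weights = {"articles": 0.5, "books": 0.3, "grants": 0.2}

    z = {k: z_scores(v) for k, v in raw.items()}
    composite = [sum(weights[k] * z[k][i] for k in raw) for i in range(3)]
    print(composite)  # each department's index, in standard-deviation units

Every number downstream depends on the chosen weights, and a small change in any single raw count shifts the whole composite.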
[Image: cover of Darrell Huff’s How to Lie with Statistics]
It’s true that in sports the standings from one year to the next can vary just as much. They, however, are based on the unimpeachable won-lost record. A perusal of the puffery for the “Faculty Scholarly Productivity Index™ (FSP Index)” shows that arbitrariness enters into the formulation of the Index not only in the weighting of its various components but also in the methods used to calculate those components. One book is given the weight of five articles, and so on. Institutions can buy the raw data. But I wonder how many administrators, pressed for funds, will do so, and spend more money to have the data analyzed again. Yet that’s what you’d need to do to know how robust a measure the FSP is.
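To see how much the weighting matters, consider a toy computation with two invented departments: change the book-to-article exchange rate from three to five, and their order flips.

    # Toy sensitivity check: the counts and weights are invented.
    depts = {"A": {"articles": 20, "books": 1},
             "B": {"articles": 6,  "books": 4}}

    def score(d, book_weight):
        # One book counts as `book_weight` articles.
        return d["articles"] + book_weight * d["books"]

    for w in (3, 5):
        ranking = sorted(depts, key=lambda n: score(depts[n], w), reverse=True)
        print(f"one book = {w} articles -> {ranking}")
    # one book = 3 articles -> ['A', 'B']  (A 23, B 18)
    # one book = 5 articles -> ['B', 'A']  (A 25, B 26)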
Unfortunately, the numbers will be used to make distinctions they cannot rightly be said to justify. Academic Analytics claims that “more universities than ever are using FSP on their campus”, and I believe them. What I don’t believe is that the FSP is as objective as they claim. Carnegie Mellon, which ought to know better, highlights its no. 5 ranking in 2007; but a year earlier, as you can see, it wasn’t even in the top ten. Michigan State’s index plunged from 2 to below 1.31 in one year. Were they exhausted after their stellar season?
It’s too bad that the Chronicle is lending its prestige to this dubious enterprise. An antidote to the FSP can be found at the International Mathematical Union, which has produced a report (PDF) on citation statistics. Here are its conclusions:
  • Relying on statistics is not more accurate when the statistics are improperly used. Indeed, statistics can mislead when they are misapplied or misunderstood. Much of modern bibliometrics seems to rely on experience and intuition about the interpretation and validity of citation statistics.
  • While numbers appear to be "objective", their objectivity can be illusory. The meaning of a citation can be even more subjective than peer review. Because this subjectivity is less obvious for citations, those who use citation data are less likely to understand their limitations.
  • The sole reliance on citation data provides at best an incomplete and often shallow understanding of research—an understanding that is valid only when reinforced by other judgments. Numbers are not inherently superior to sound judgments.
The last item, I think, may be irrelevant to the people who are most likely to be customers of Academic Analytics. I have in mind administrators or the people who hire them, people who think that a university should be run like a business. The point of appealing to the FSP and measures like it is to avoid judgment, substituting for it the authority of numbers.
(For more statistical snake oil, see “How to mislead with statistics”, 7 Dec 2004.)
Sources:
“A ‘Nixon going to China’ moment”, Geomblog, 14 June 2008.
Abraham Mahshie, “Former executive spooks some, but not all, faculty”, Columbia Daily Tribune (21 Dec 2007), p. 1A.

June 18, 2008 in Academic Affairs · Society
