Monday, January 10, 2011

Fun with the h-index

The h-index is a widely used measure of a scientist's productivity and impact, somewhat more sophisticated than just counting publications. It is the greatest positive integer h such that the scientist has h papers each of which has been cited at least h times. If you're wondering how relevant the h-index is in practice, I have no way of telling in general. I do know, however, that I've been on committees where the h-index evidently was an interesting point of reference for some of the members, and I have also been asked a few times what my h-index is. (Before you ask, according to SPIRES my h-index is either 14 or 16, depending on whether you count all or only published papers.) The absolute number isn't of much importance in most cases; what matters instead is how you compare to others in your particular field - as Einstein taught us, everything is relative ;-)
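
If the definition sounds more convoluted than it is, here's a minimal sketch of the computation (in Python, purely for illustration - this is of course not how SPIRES or Google Scholar actually implement it): sort your papers by citation count and find the last rank at which the count still matches or exceeds the rank.

    # Minimal sketch: compute the h-index from a list of citation counts,
    # one entry per paper. Purely illustrative, not any database's actual code.
    def h_index(citations):
        ranked = sorted(citations, reverse=True)  # most-cited papers first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:  # the rank-th paper still has at least rank citations
                h = rank
            else:
                break
        return h

    # Example: five papers cited 10, 8, 5, 4 and 3 times give an h-index of 4.
    print(h_index([10, 8, 5, 4, 3]))  # -> 4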

Next time somebody asks for my h-index, I'll refer them to this hilarious paper by Cyril Labbé from the Laboratoire d'Informatique de Grenoble at the Université Joseph Fourier:
    "Ike Antkare, One of the Great Stars in the Scientific Firmament"
22nd newsletter of the International Society for Scientometrics and Informetrics (June 2010)
    PDF here

Labbé created a fictional author, Ike Antkare, and pimped Ike's h-index up to 94. To do so, he generated 102 "publications" with SciGen, a piece of software that works like a dada generator for computer science papers, and wove them into a net of self-citations. Labbé's paper contains an exact description of the procedure. His spoof works for tools that compute the h-index from Google Scholar's data; the best known of these is probably Publish or Perish.
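
To see why a net of self-citations is so effective, consider the extreme case - a back-of-the-envelope sketch, not Labbé's actual citation graph - in which each of the 102 fake papers cites all the others: every paper then collects 101 citations, so the h-index can climb as high as 101. That Ike "only" reached 94 presumably reflects what Google Scholar actually indexed.

    # Back-of-the-envelope sketch (assumed complete citation clique, not
    # Labbé's actual graph): n fake papers that all cite each other give
    # every paper n-1 citations.
    def clique_h_index(n_fake_papers):
        citations = [n_fake_papers - 1] * n_fake_papers
        return h_index(citations)  # reuses h_index() from the sketch above

    print(clique_h_index(102))  # -> 101, the ceiling for 102 mutually citing papers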

What lesson do we learn from that?

First, Labbé's method works mainly because the h-index is computed from a rather unreliable database, Google Scholar, to which it is comparatively easy to add "fake" papers. The arXiv, for example, also contains unpublished papers, but it has some amount of moderation which I doubt 102 dada-generated papers by the same author would get past. In addition, SPIRES offers the h-index for published papers only. (Considering however that I know more and more people - all tenured of course - who don't bother with journals, restricting to published papers only might in some cases give a very misleading result.)

Second, and maybe more importantly, I doubt that any committee faced with Ike's amazing h-index would be fooled, since it only takes a brief look at his publications to set the record straight.

Nevertheless, Labbé's paper is a warning not to use automatically generated measures of scientific success without at least giving the results a look. Since the use of metrics to evaluate departments and universities is becoming more and more common, it's an important message indeed, and an excellent example of how secondary criteria (a high h-index) can deviate from primary goals (good research).

For more on science metrics, see my posts Science Metrics and Against Measure. For more on the dynamics of optimization in the academic system, and the mismatch between primary goals and secondary criteria, see The Marketplace of Ideas and We have only ourselves to judge each other.

Thanks to Christine for drawing my attention to this study.

17 comments:

  1. Hi Bee,

H-index fabrication sounds like a replay of the Sokal incident on Quantum Gravity?

    Best,

2. I agree it has its issues, but if you were to pick a single number that best captures your productivity, I would think the h-index would be the winner. I can't think of a single number that would be better.

Maybe your h-index based only on papers from your last 5 years, in order to gauge current productivity.

  3. Cited by at least h different authors would be an extra safeguard.

    With historical data, it should be easy to see how well h correlates with future citations.

  4. Much ado about credit and salesmanship.

    When did quantity trump quality in terms of importance?

    Probably when computer (not Physics) "metrics" became the reason to hire/fire/grade people.

    And when was that, exactly?

  5. Hi Bee,

    I think the relevant point of that experiment can be found in the conclusion itself, with attention to my boldface:

    "This experiment shows how easily and to what extent computed values can be distorted. It is worth noting that this distortion could have been easily achieved using names of real people, thus helping them discretely (sp?) or discrediting them noisily."

Notice, there is probably a spelling error. It should be "discreetly".

    So that experiment is of course an extreme of a situation that nevertheless could be happening discreetly, building up so that someone could gain an advantage without being noticed or, alternatively, to discredit someone else by the same mechanism.

    Best,
    Christine

  6. Hi Plato,

It has some similarities, but the point of Labbé's study was less to embarrass the community with nonsensical papers than to show how easy it is to manipulate software-generated measures of scientific success. Best,

    B.

  7. Hi Christine,

Yes, that's the risk. Whenever there's some software "measuring" scientific success there will be possibilities to cheat. The question is just how much effort it takes, and how easily it can be recognized. In this case, it doesn't seem to take much effort, but it's also easy to recognize. However, one might argue that exactly this sort of cheating is actually quite common already! Granted, people don't use dada-generated papers, but they take apart their papers into as many small pieces as possible and repeat themselves over and over again, thereby increasing their number of publications. Then they cite themselves whenever possible. These tactics are so common already that nobody even mentions them anymore. Best,

    B.

  8. Just received the following email and thought it might be of interest for one or the other reader:

    Dear colleague,

    Your blog links to Publish or Perish, a free software program that I developed. It retrieves and analyzes academic citations. Therefore, I thought you might be interested in the book that I have written to accompany the software.

    The Publish or Perish Book: Your Guide to Effective and Responsible Citation Analysis.

    The book contains sixteen chapters (250 pages, 90,000 words) providing readers with a wealth of information relating to citation analysis. It is targeted at individual academics making their case for tenure or promotion, librarians and bibliometric researchers, deans and other academic administrators and anyone wanting to know more about citation analysis, Google Scholar and the Web of Science.

    The book is available both as an electronic version and as a paperback version. For more information on the book's content and how to order it, see:

    http://www.harzing.com/popbook.htm

    If you would like to order the paperback version, you can also go directly to the Amazon product page:

    http://www.amazon.com/Publish-Perish-Book-effective-responsible/dp/0980848512

    Don't hesitate to contact me if you have any remaining questions.

    Best wishes,
    Anne-Wil


  9. Hi Steven,

Every decent measure has its origin in an actual correlation with scientific success, otherwise people wouldn't be using it. The problem is that once the measure gains relevance the correlation dwindles, because people start aiming for the wrong goal, which is now the measure rather than good research. That's what I mean by the deviation of secondary criteria from primary goals. You could say it's some sort of measurement problem ;-) There's no such thing as a non-invasive measurement, and the problem here is that using the measure itself makes it less useful. Best,

    B.

  10. Hi Bee,

An interesting metric, yet like all metrics it depends first on one's frame of reference :-) That is, one must consider when the data collection begins, what it is that counts as data and how it is treated as such. I would think it better to first ask why we should find such a metric relevant to begin with, as science's task is to find what stands as the best solution in respect to moving forward to have nature understood.

So it's first to ask: is this best served by a measure of its quantity or its quality? That is, for me, until they find an algorithm which can definitively measure quality, all such assessments serve to do is justify decisions rather than lend reason to them. As far as I'm concerned this methodology of assessment has even less relevance in respect to the betterment of science than the rankings of the most purchased or recommended recordings do to music.


    Albert Einstein 18
    http://inspirebeta.net/search?ln=en&p=Albert+Einstein&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Roger Penrose 14
    http://inspirebeta.net/search?ln=en&p=Roger+Penrose&f=&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Stephen Hawking 16
    http://inspirebeta.net/search?ln=en&p=Stephen+Hawking&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Richard Feynman 41
    http://inspirebeta.net/search?ln=en&p=Richard+Feynman&f=&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Lisa Randall 47
    http://inspirebeta.net/search?ln=en&p=Lisa+Randall&f=&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Ed Witten 50
    http://inspirebeta.net/search?ln=en&p=Ed+Witten&f=&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

Brian Greene 35
    http://inspirebeta.net/search?ln=en&p=Brian+Greene&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Gerard t’hooft 45
    http://inspirebeta.net/search?ln=en&p=Gerard+t'hooft&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Lee Smolin 37
    http://inspirebeta.net/search?ln=en&p=Lee+Smolin&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Leonard Susskind 78
    http://inspirebeta.net/search?ln=en&p=+Leonard+Susskind&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    John S. Bell 5
    http://inspirebeta.net/search?ln=en&p=John+S.+Bell&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    David Bohm 6
    http://inspirebeta.net/search?ln=en&p=David+Bohm&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Erwin Schrodinger 4
    http://inspirebeta.net/search?ln=en&p=E.+Schrodinger&f=author&action_search=Search&sf=&so=d&rm=&rg=25&sc=0&of=hcs

    Best,

    Phil

  11. No exact match found for Gerard t'hooft, using Gerard t hooft instead...

    Even Spires can't spell his name right! And no, I'm not going to try myself, me big sissy today (but I think the apostrophe comes before the t, and the H is capitalized).

    Hi Bee,

    There's no such thing as a non-invasive measurement ;-)

    There isn't? Don't ever say "there's no such thing ... " to an Engineer, we take that personally! :-p (But at least it'll keep us busy for a few days)

    Didn't Lee Smolin discuss some young grad students investigating non-invasive measurements in either TRtQG or TTwP? I forget which and I'm not sure I saw a followup. Jus' kidding, I got the joke.

    Congrats to yourself for getting a paper in just under the wire btw, good timing and we're looking forward to your next one.

12. I was going to extend Phil's list with the famous Quantum 10, listed below in what I believe is the order of their world-changing accomplishments. But starting with the first name on the list, I got zero papers, which makes me question the accuracy of Spires, unless there's a cutoff date/century. So I stopped before I started, so to speak.

    Max Planck has ZERO papers?! Well, I don't know why he should, all he did was father quantum mechanics!

    Max Planck
    Albert Einstein
    Niels Bohr

    Louis de Broglie
    Wolfgang Pauli
    Max Born
    Pascual Jordan
    Werner Heisenberg
    Erwin Schrödinger
    Paul Dirac

    I would throw in Hermann Weyl as a quantum 11th, and Feynman too as many such lists exclude Jordan for reasons I will not debate nor agree with. In any event Feynman belongs to the next, also incredible generation.

13. This whole h-index stuff is very similar to what goes on in technical analysis in the financial world. There they try to decipher hidden correlations strictly in the patterns of the markets or in individual securities. The intent is to find patterns of accumulation prior to actual large price moves. It has little to do with the actual fundamentals of the market. And like Bee says, once a pattern is widely recognized as having been useful in the past, it is pretty much useless after that. I would even go farther and say, at least in the financial world, that immediately in the years following the discovery of a major recognizable pattern something unusual happens: rather than becoming something you can bet "on", it becomes something you should bet "against". This period endures until the new consensus about its value once again becomes neutral.

Something to think about. There are all kinds of correlations between the financial world and strictly mathematical physics, i.e., physics divorced from empirically observed nature. Empirically observed physics is much closer to what could be described as "fundamental analysis" in the financial world.

    So now all of you know how to predict (somewhat) when both physics and the economic predictors are going off the tracks.

  14. Hi. I think that in the definition of h, the word "a" should be replaced by "greatest".
    Best,
    Charles

  15. Hi Charles,

    Thanks, I've fixed that. Best,

    B.

16. Hello, I wanted to tell you about a plugin for calculating the h-index from Google Scholar in Firefox: https://addons.mozilla.org/en-US/firefox/addon/scholar-h-index-calculator/

17. The only reason the h-index exists is that academic administrators are lazy. They would rather rely on a number than develop an understanding of research. Have you ever met a professor who voluntarily became a chairperson, dean, president, whatever, without having quit research? Rarely, if ever. Take a look at this Citation Statistics paper.

