PLoS Biology finally has an impact factor!
Measures of Impact, from PLoS Biology
…why did anyone submit great work to a journal that didn’t even exist yet, from a publisher with no established reputation? The answer is that it was on the strength of promises made by our in-house editors and academic editorial board to uphold high standards and rigorous peer review, to launch an open-access alternative to the best journals, and to drive a transformation in scholarly publishing. On that promise, more than 250 authors published the 30 research articles that composed our first three issues. And it is on the basis of those first three issues that Thomson ISI has calculated a 2004 preliminary impact factor for PLoS Biology of 13.9.
Interestingly, an IF of 13.9 blows every journal in LIS out of the water; according to the latest (2003) numbers from ISI, the LIS publication with the greatest IF is ARIST, weighing in at 2.864. I don’t have a good frame of reference for this though, since I don’t know whether PLoS Bio’s IF is high compared to non-LIS print journals, or more specifically compared to other bio journals. Someone who knows more about citation analysis have any insight into this? Either way, though, 13.9 seems pretty impressive to me.
Though it shouldn’t surprise anyone that an open-access journal has a high IF: Lawrence tells us that articles freely available online are more highly cited. And since IF is essentially a measure of citation frequency, the line from open access to a high IF is a fairly straight one. Still, it is nice to see it demonstrated.
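For the curious, ISI’s standard impact factor is just a ratio: citations received in a given year to a journal’s articles from the previous two years, divided by the number of citable articles published in those two years. Here’s a minimal sketch of that arithmetic in Python; the citation count is made up purely for illustration (though the 30 articles matches PLoS Biology’s first three issues), not ISI’s actual data:

```python
# Sketch of the standard two-year impact factor ratio.
# The citation count below is hypothetical, not ISI's real figure.

def impact_factor(citations_this_year, citable_items_prior_two_years):
    """IF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the count of citable items from those years."""
    return citations_this_year / citable_items_prior_two_years

# A journal whose 30 articles drew 417 citations the following year:
print(round(impact_factor(417, 30), 1))  # -> 13.9
```

A “preliminary” IF like PLoS Biology’s presumably works the same way, just with a citation window shortened to however long the journal has existed.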
I’ve been waiting for a long time for a PLoS journal to be around long enough that an IF could be computed. One of the major (though certainly not the only) reasons that scholars publish is that the reward system in academia requires it. Journal publication is a large (arguably the largest) part of the reputation economy in academia. And there’s a hierarchy of status among journals. If a journal doesn’t have an IF, then there’s no easy (read: quantitative) way to assess the karma generated by publishing in it. In that case, a journal has to rely on having a good reputation, but even that will only get you so far: my colleagues may know that I, for example, am publishing in well-regarded journals without IFs, but if the Provost, for example, has never heard of them & doesn’t know how to evaluate them, it won’t do me much good come tenure review-time. So, why don’t more academics publish in open-access journals? Cynical answer: because most open-access journals don’t have impact factors.
Well, now one does, & it blows many established journals out of the water. It’s only a matter of time before more open-access journals have IFs computed. (Of course, D-Lib has had an IF for a few years.) Seems to me that this must be a big part of any argument for open-access publishing: it operates in the same reputation economy as traditional publishing (operates better, in fact, than most traditional publishing), and can do so with rigorous peer review.
What follows is from John MacMullen, who for reasons passing understanding, still can’t get my authimage authentication to work in any of his 4 browsers:
“the LIS publication with the greatest IF is ARIST”
…which is completely paper-based; no online access (purchase or OS), no abstracts, nothing. The irony becomes even greater when you consider that it is a review publication which means a significant percentage of its page count goes to, yep – lists of citations….
“Someone who knows more about citation analysis have any insight into this? [biomed IFs] Either way, though, 13.9 seems pretty impressive to me.”
It is impressive, relative to LIS. But then, just about anything is. 🙂
Here are some examples of IFs of well-known biomed journals, and some less well-known to laypeople:
Title                  IF      Total cites (2004)
New England J of Med   38.570  159,498
Science                31.853  332,803
Nature                 32.182  363,374
JAMA                   24.831   88,864
Ann Rev Immunol        52.431   14,357
Cell                   28.389  136,472