This issue's Nature editorial: the overstated "impact factor".




Posted by: Latino2 on 2005-6-27, 10:49:43:

The author analysed the citations received by papers published in Nature over the previous two years and argues that:

1. Only a small minority of the papers published in Nature are genuinely highly cited; most of the rest contribute little to the impact factor.

2. A high citation count does not necessarily mean the science is of exceptional quality; last year's most cited item, the mouse genome paper, was essentially the summing-up of a large collaborative project.

3. Citation rates differ widely between disciplines, so no single yardstick fits all fields.

Conclusion: research administrators have exaggerated the importance of the "impact factor".

========================================
Not-so-deep impact

Research assessment rests too heavily on the inflated status of the impact factor.

Every year at the end of June, scientific publishers’ eyes turn to Philadelphia, where the Institute for Scientific Information (ISI) releases a snippet of data that they crave: the impact factor of each journal. In due course, bureaucrats in research agencies will roll the impact figures into their performance indicators, and those scientists who worry about such things will quietly note which journal’s number wins them the most brownie points.

Attempts to quantify the quality of science are always fraught with difficulty, and the journal impact factors are among the few numbers to persist. The result is an overemphasis of what is really a limited metric.

To obtain the latest impact factors, which were released last week, the ISI number-crunchers added the total number of citations from all the monitored journals during 2004 to items in the journal of interest that were published in 2002 and 2003. They then divided
that total by the number of ‘citable items’ — loosely, papers and review articles — that were published in the journal during those same two years.
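
To make the arithmetic concrete, here is a minimal sketch of the calculation described above; the citation total is a hypothetical figure chosen only so that, with the roughly 1,800 citable items mentioned later in the editorial, it reproduces an impact factor of about 32.2.

```python
# Minimal sketch of the impact-factor arithmetic described above.
# The citation total below is hypothetical, picked only to reproduce
# the 32.2 figure quoted in the editorial; it is not real ISI data.

def impact_factor(citations_2004, citable_items_2002_2003):
    """Citations received in 2004 by items published in 2002-03,
    divided by the number of 'citable items' (papers and reviews)
    published in those same two years."""
    return citations_2004 / citable_items_2002_2003

print(round(impact_factor(57960, 1800), 1))  # -> 32.2
```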

The impact factor is taken by some administrators as a measure of the typical citation rate for the journal. But for many journals, it isn’t ‘typical’ at all. Nature’s latest impact factor is 32.2, an increase on last year and a high number that we’re proud of, but it’s one that merits a closer look.

For example, we have analysed the citations of individual papers in Nature and found that 89% of last year’s figure was generated by just 25% of our papers.

The most cited Nature paper from 2002–03 was the mouse genome, published in December 2002. That paper represents the culmination of a great enterprise, but is inevitably an important point of reference rather than an expression of unusually deep mechanistic insight. So far it has received more than 1,000 citations. Within the
measurement year of 2004 alone, it received 522 citations. Our next most cited paper from 2002–03 (concerning the functional organization of the yeast proteome) received 351 citations that year. Only
50 out of the roughly 1,800 citable items published in those two years received more than 100 citations in 2004. The great majority of our papers received fewer than 20 citations.

These figures all reflect just how strongly the impact factor is influenced by a small minority of papers — no doubt to a lesser extent in more specialized journals, but significantly nevertheless.
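
That skew can be made concrete with a toy calculation; the numbers below are invented purely for illustration and are not Nature's actual citation counts.

```python
# Toy illustration (invented numbers) of how one highly cited item pulls
# a journal-wide average far above what a 'typical' paper receives.
import statistics

citations = [300] + [10] * 9             # one blockbuster, nine ordinary papers

print(sum(citations) / len(citations))   # mean, impact-factor-like -> 39.0
print(statistics.median(citations))      # the 'typical' paper      -> 10.0
```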

However, we are just as satisfied with the value of our papers in the ‘long tail’ as with that of the more highly cited work. The citation rate of our papers also varies sharply between disciplines.

Many of Nature’s papers in immunology published in 2003
have since received between 50 and 200 citations. Significant proportions of those in cancer and molecular and cell biology have been in the 50–150 range. But papers in physics, palaeontology and climatology typically achieved fewer than 50 citations. Clearly, these
reflect differences in disciplinary dynamics, not in quality.

The impact factor also mixes citations to diverse types of content: unsurprisingly, review articles are typically the most highly cited, but citations of our Commentaries, News Features and News & Views articles also contribute in a minor way to the numerator (although
these items are not counted in the denominator).
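
A rough sketch of that asymmetry, using made-up counts rather than any real ISI figures:

```python
# Hypothetical counts showing how citations to front-matter items
# (Commentaries, News Features, News & Views) enter the numerator
# even though those items are excluded from the denominator.
citations_to_papers_and_reviews = 1500
citations_to_front_matter = 100    # added to the numerator...
citable_items = 50                 # ...but front matter is not counted here

with_front_matter = (citations_to_papers_and_reviews + citations_to_front_matter) / citable_items
papers_only = citations_to_papers_and_reviews / citable_items

print(with_front_matter, papers_only)   # 32.0 vs 30.0
```
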
The net result of all these variables is a conclusion that impact factors don’t tell us as much as some people may think about the respective quality of the science that journals are publishing. Neither do most scientists judge
journals using such statistics; they rely instead on their own assessment of what they actually read.

None of this would really matter very much, were it not for
the unhealthy reliance on impact factors by administrators and researchers’ employers worldwide to assess the scientific quality of nations and institutions, and often even to judge individuals. There is no doubt that impact factors are here to stay. But these figures illustrate why they should be handled with caution.


