The Movement Away From the Journal Impact Factor (JIF)

by Brian Wu, PhD, MD | May 11, 2018

The journal impact factor (JIF) was first devised in 1955, and for decades it has been used to rank scientific journals by how often their articles are cited in the literature. However, the JIF has been labeled misleading, misnamed, flawed, confusing, and skewed, and a growing number of researchers are pointing out that the metric has limitations and biases inherent to its calculation. In response to this criticism, the scientific community is moving away from the ranking system.
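For context, the standard two-year JIF is a simple ratio: the citations a journal receives in a given year to the items it published in the two preceding years, divided by the number of citable items it published in those two years. The minimal Python sketch below illustrates the arithmetic; the journal figures are hypothetical.

```python
# Minimal sketch of the standard two-year JIF calculation.
# All figures below are hypothetical.

def journal_impact_factor(citations: int, citable_items: int) -> float:
    """Two-year JIF: citations received in year Y to items published in
    years Y-1 and Y-2, divided by the citable items (articles and
    reviews) published in those two years."""
    return citations / citable_items

# A journal whose 2016-2017 papers drew 1,200 citations in 2018,
# across 400 citable items, has a 2018 JIF of 3.0.
print(journal_impact_factor(1200, 400))  # 3.0
```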

Scientists Seek Alternatives

The scientific community is beginning to mobilize against the continued influence of the JIF. For example, the American Society for Microbiology (ASM) announced in 2016 that it would remove information on the JIF from its website and journals and would no longer use the metric for marketing purposes. “To me, what’s essential is to purge the conversation of the impact factor,” declared Stefano Bertuzzi, the chief executive officer of the ASM. “We want to make it so tacky that people will be embarrassed just to mention it.”

Various groups have proposed alternative metrics to replace the JIF. One of the first groups to do this was the American Society for Cell Biology. Members met in San Francisco in 2012 and later published the San Francisco Declaration on Research Assessment (DORA), which called for a fundamental change in research evaluation, urging that research be judged on its scientific merit rather than by using JIFs as “proxies of quality.”

Another alternative was outlined on bioRxiv in the article “A simple proposal for the publication of journal citation distributions.” This paper, authored by senior staff at leading scientific journals, called for “a move towards greater transparency, one that should help refocus attention on individual pieces of work and counter the inappropriate usage of JIFs during the process of research assessment.”

The proposal advocated downplaying the JIF in favor of publishing the citation distribution of the individual articles in each journal. The authors believe that wider awareness and use of citation distributions would give a truer gauge of a journal’s impact.
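To make the contrast concrete, the short sketch below (with hypothetical per-article citation counts) shows what a citation distribution reveals that a single average hides: a handful of highly cited papers can inflate the mean while most articles sit far below it.

```python
# Sketch of the citation-distribution approach: instead of reporting a
# single mean (the JIF), tally how citations spread across articles.
# The per-article counts are hypothetical.
from collections import Counter

citations_per_article = [0, 0, 1, 1, 1, 2, 2, 3, 5, 48]

distribution = Counter(citations_per_article)
for n_citations in sorted(distribution):
    print(f"{distribution[n_citations]} article(s) cited {n_citations} time(s)")

# The mean is 6.3, yet 8 of the 10 articles were cited 3 times or fewer:
# the skew that the distribution makes visible.
mean = sum(citations_per_article) / len(citations_per_article)
print(f"mean citations per article: {mean:.1f}")
```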

Other scientific groups have also proposed alternatives to the JIF. At a 2015 meeting of the Royal Society in the United Kingdom, for example, members proposed that scientific journals collaborate with one another and use the Thomson Reuters citation database to calculate their own JIFs in order to increase transparency.

One of the most detailed alternatives to the JIF is outlined in the Leiden Manifesto, published in the journal Nature in 2015. The Manifesto lays out ten principles to guide research evaluation and promotes a combination of quantitative indicators and qualitative judgment to assess research quality more effectively. Its principles include:

  • Protecting locally relevant research. The JIF’s bias towards English-language journals puts journals published in other languages at a disadvantage, especially since the impact factor is often tied to funding. The writers of the Leiden Manifesto believe it is important to preserve studies of significant local issues, such as research on HIV in sub-Saharan Africa.
  • Promoting transparency in data collection and analysis. The authors believe these statistical methods should follow clearly stated rules and requirements that researchers and other stakeholders can consult, keeping the process transparent.
  • Evaluating individual researchers based on portfolios. Rather than using a single metric as a proxy for an individual researcher’s quality, the authors believe a more qualitative format, such as a portfolio, should be used to judge experience, skills, and knowledge.
  • Allowing independent analysis of bibliometric studies. The Leiden Manifesto also points out the need for independent review of these metrics, noting that letting those included in the studies verify the results, along with third-party audits, will help keep the findings verifiable.


JIFs have been a prominent means of ranking journals for decades. However, the scientific community is beginning to replace this methodology with measures that better capture a journal’s true impact.
