Press "Enter" to skip to content

Journal impact factor gets a sibling that adjusts for scientific field


Critics have long bashed Clarivate Analytics’s journal impact factor, complaining that the metric, which reports average citations per article, has methodological flaws that invite misleading comparisons of journals and researchers. Today, the company unveiled an alternative metric that addresses some of those flaws by allowing more accurate comparisons of journals across disciplines.

The new Journal Citation Indicator (JCI) accounts for the substantially different rates of publication and citation in different fields, Clarivate says. But the move is drawing little praise from the critics, who say the new metric remains vulnerable to misunderstanding and misuse.

The announcement comes as part of the company’s 2021 release of its Journal Citation Reports database, which includes the latest journal impact factors and other journal analytics. Among these is the new JCI, which averages citations gathered by a journal over 3 years of publications, compared with just 2 years for the impact factor. What’s more, Clarivate says the JCI includes journals not covered by the impact factor, including some in the arts and humanities, as well as regional journals and those from “emerging” scientific fields.
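To make the windowing difference concrete, the sketch below computes a simple impact-factor-style average: citations counted in a census year to items published in the preceding window of years, divided by the number of those items. The function name and the journal data are invented for illustration; Clarivate’s actual calculations involve further rules (for example, about which document types count as citable) that are not modeled here.

```python
# Illustrative sketch only: a simplified, impact-factor-style average
# citation rate. Real JIF/JCI calculations apply document-type rules
# and (for the JCI) field normalization not modeled here.

def average_citations(citations_per_year, items_per_year, census_year, window):
    """Citations received in `census_year` to items published in the
    previous `window` years, divided by the number of those items."""
    years = range(census_year - window, census_year)
    cites = sum(citations_per_year[census_year].get(y, 0) for y in years)
    items = sum(items_per_year.get(y, 0) for y in years)
    return cites / items if items else 0.0

# Hypothetical journal: citations counted in 2020 to earlier volumes.
citations_per_year = {2020: {2017: 150, 2018: 220, 2019: 180}}
items_per_year = {2017: 100, 2018: 110, 2019: 90}

print(average_citations(citations_per_year, items_per_year, 2020, window=2))
# 2.0  -> (220 + 180) / (110 + 90), impact-factor-style 2-year window
print(average_citations(citations_per_year, items_per_year, 2020, window=3))
# ~1.83 -> (150 + 220 + 180) / (100 + 110 + 90), JCI-style 3-year window
```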

The JCI is “a step forward” and “better late than never,” says Henk Moed, a bibliometrician and editor-in-chief of the journal Scholarly Assessment Reports. Its main advance, he explains, is not new at all: For decades, researchers in bibliometrics have been developing methods to compare citation impact across disciplines. For instance, where math papers might cite just a handful of sources, biochemistry papers commonly have citation lists with dozens or even hundreds of entries. So, “It’s not a sign of quality that biochemistry papers are cited more,” Moed says. Impact factors, which simply total up citations without accounting for the norm in a field, miss this fact.
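A minimal sketch of the normalization idea Moed describes, under the assumption that one divides a paper’s citation count by its field’s average; this is the general approach behind metrics like the JCI and SNIP, not necessarily either metric’s exact formula, and the field averages below are invented for illustration. A value above 1.0 means “more cited than is typical for the discipline,” whatever the discipline’s citation habits.

```python
# Sketch of field normalization; the exact JCI and SNIP formulas differ,
# and these field averages are invented for illustration.

field_average_citations = {"mathematics": 3.0, "biochemistry": 30.0}

def normalized_impact(citations, field):
    """Citations relative to the field's average; 1.0 = typical paper."""
    return citations / field_average_citations[field]

# A math paper with 6 citations outperforms its field more than a
# biochemistry paper with 40, even though 40 > 6 in raw counts.
print(normalized_impact(6, "mathematics"))    # 2.0: twice the field norm
print(normalized_impact(40, "biochemistry"))  # ~1.33: a third above norm
```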

For that reason, in 2010, Moed developed the methodology for a different metric, the Source Normalized Impact per Paper (SNIP), which was adopted by a Clarivate competitor, the publishing giant Elsevier, in its Scopus bibliometric database.

Now, Clarivate’s JCI, which uses a different methodology, provides a metric similar to the SNIP for the journals in its Journal Citation Reports database. That will strengthen Clarivate’s position in the market, Moed says.

But Clarivate’s announcement leaves a lot to be desired, including transparency, says Marianne Gauffriau, a research librarian at Copenhagen University Library. The company’s white paper describing the new metric does not cite any of the substantial literature published by bibliometricians over the years, she says. The problem goes beyond giving credit where it is due: Without those citations, it is impossible to know which past lessons and findings Clarivate has baked into the JCI. “There are a lot of people working to try to improve these indicators,” she says. “You should use that knowledge.”

There is also the risk that, like the impact factor, the JCI will be used inappropriately, Gauffriau says. Evaluators frequently use such metrics to judge the scholarly output of researchers, institutions, and individual publications when making tenure and other decisions, a practice bibliometricians often criticize as a flawed way to judge quality.

Clarivate has tried to head off misuse of the JCI from the start, says Martin Szomszor, director of the company’s Institute for Scientific Information. It has made clear in describing the JCI that the metric is not designed for assessing individual researchers. “If you’re using this in a research evaluation setting, this is probably a bad thing,” he says. “Don’t do it.” The intended use, he says, is for those who manage portfolios of journals, including publishers, university presses, and scholarly societies, to see how their journals stack up across fields. But, Moed counters, the analytics industry should put more effort into explaining the limits of the indicators it offers.

The JCI is unlikely to elbow out the widely used journal impact factor any time soon, Szomszor says. Clarivate will wait to see how widely the metric is adopted, giving it a chance to develop as a parallel option.

But despite its many strong points, the JCI may not, in fact, gain much traction, says publishing consultant Phil Davis. Judging by the lukewarm uptake of other, similar metrics, such as Elsevier’s SNIP, the new indicator could have a difficult path ahead, he says: “I believe the JCI will be largely ignored, like SNIP.”


Source: Science Mag