Pharmacy Practice (Granada)

On-line version ISSN 1886-3655; Print version ISSN 1885-642X

Pharm Pract (Granada) vol. 16, no. 2, Redondela, Apr./Jun. 2018

https://dx.doi.org/10.18549/pharmpract.2018.02.1282 

Editorial

Differences and similarities between Journal Impact Factor and CiteScore

Fernando Fernandez-Llimos (orcid: 0000-0002-8529-9595)1   

1PhD, PharmD, MBA. Editor-in-chief, Pharmacy Practice. Lisbon (Portugal).

Institute for Medicines Research (iMed.ULisboa), Department of Social Pharmacy, Faculty of Pharmacy, Universidade de Lisboa. Lisbon (Portugal)

ABSTRACT

Two major journal-based metrics are in competition: the Journal Impact Factor and CiteScore. Although both are based on the same principle of measuring impact through citations, they differ in several respects, including the years used to calculate the metric, access to the underlying data, and the number of journals covered. One of the most relevant differences for pharmacy journals is the recognition in CiteScore of Pharmacy as an independent Subject Area, whereas pharmacy appears merged with pharmacology in the Journal Citation Reports. The immediate consequence of this differentiation is that pharmacy journals remain confined to the third and fourth quartiles of the Journal Impact Factor distribution, while a true quartile distribution exists in CiteScore.

Key words: Periodicals as Topic; Databases, Bibliographic; Journal Impact Factor; Publication Bias; Pharmacy

Journals are usually evaluated by means of citation metrics. These metrics are highly criticized because of the lack of correlation between their values and the real importance of the articles.1 Alternative metrics have not improved this situation, likely because they measure the impact of science on social media rather than the impact of science on science.2,3 It is also very important to remember that the General Recommendation of the San Francisco Declaration advises that people “not use journal-based metrics ...//... to assess an individual scientist’s contributions...”.4

The Journal Impact Factor is a well-known citation metric created in the 1950s.5 It is published through the Journal Citation Reports and calculated from data compiled in the Web of Science database, which covers approximately 11,000 journals and 2.2 million indexed articles.6 Elsevier launched a new citation metric in 2016: CiteScore.7 CiteScore is calculated from the 22,800 journals indexed in Scopus, which contains approximately 70 million articles.8 Beyond these figures, there are other differences between these two journal-based citation metrics.

Both metrics are based on the same principle: the number of citations received by a journal in a given year to papers published in a given period of time, divided by the number of papers published by that journal in that period. A primary difference between the two metrics is the period used for the calculation: the Journal Impact Factor counts citations to papers published in the two previous years, whereas CiteScore uses a three-year window. The length of this window matters because citation half-lives differ across scientific disciplines. Areas such as Immunology or Genetics and Molecular Biology cite a substantially greater proportion of recently published articles (i.e., within the two-year window) than Arts and Humanities or Social Sciences do.9 Citations to papers published outside the computing window are ignored by both metrics. This means that a paper published in 2018 citing articles published in or before 2015 will not influence the Journal Impact Factor, and one citing articles published in or before 2014 will not influence the CiteScore. While “today’s newspaper wraps tomorrow’s fish”, should we accept that today’s articles will be ignored in two or three years? Journal Citation Reports provided a partial solution with the 5-year Impact Factor, which uses a five-year window. Even this window may not be sufficient for pharmacy, where cited half-lives exceed 7.5 years (e.g., Am J Health Syst Pharm 7.8; J Am Pharm Assoc 9.3).
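
In schematic form, and using year Y as a placeholder, the two calculations described above can be written as follows. This is only a restatement of the definitions given in the text; the providers' official formulations include additional rules about which document types count.

```latex
% Journal Impact Factor for year Y (two-year window)
\[
\mathrm{JIF}_{Y} = \frac{\text{citations received in } Y \text{ to items published in } Y-1 \text{ and } Y-2}
                        {\text{citable items published in } Y-1 \text{ and } Y-2}
\]

% CiteScore for year Y (three-year window, as described in the text)
\[
\mathrm{CiteScore}_{Y} = \frac{\text{citations received in } Y \text{ to documents published in } Y-1,\ Y-2 \text{ and } Y-3}
                              {\text{documents published in } Y-1,\ Y-2 \text{ and } Y-3}
\]
```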

Update frequency is something the two metrics have in common: both are calculated and published yearly. A point in favor of CiteScore is the CiteScore Tracker, a monthly release of a provisional calculation. However, publication frequency should not be confused with the possibility of correcting the published figures: CiteScore data cannot be modified until the next release, even when an error is identified.

A major difference between the two metrics is the transparency of their calculations. The Journal Impact Factor has been criticized as opaque because its calculations are “based on hidden data”.10 The quality of the data used to calculate the Impact Factor has been questioned even by high-Impact Factor journals.11 The most intriguing issue is that one cannot easily access the complete list of cited and citing records used to calculate a given Impact Factor, even with full access to the Web of Science. CiteScore solved this issue: citing and cited documents are just one link away for any Scopus-subscribing institution or individual.

In addition to the lack of transparency, another major difference between the two metrics is the famous denominator: the number of articles published by the journal in the given period of time. It might sound strange that scientometricians have difficulty identifying the number of articles a journal publishes in a year; the difficulty appears when we add the adjective “citable” to the term “item”. What is a citable item? As far as we know, journals’ instructions to authors do not prevent citing any kind of contribution. Since the 1980s, Journal Impact Factor calculations have defined the denominator by using an algorithm to identify what they called “meaty” items that “contain substantive research”. This algorithm allocates “points” to each contribution according to a number of criteria, as a proxy for “the amount and type of information the article contains”. Regardless of the contribution’s content, the number of authors, references, and pages is used to decide what counts in the denominator of the Impact Factor calculation. The consequence of this correction to the simple calculation is that journals’ publishing policies have a tremendous impact on the Impact Factor.12 Let us use The Lancet as an example. As published in the 2016 Journal Citation Reports, The Lancet published 309 citable items in 2015 and 271 in 2014, and these were cited 13,983 and 13,759 times, respectively, by articles published in 2016, resulting in a 47.831 Impact Factor. The figures are different if we search PubMed for articles published in The Lancet: 1,992 articles in 2015 and 1,770 in 2014. With these alternative data, the Impact Factor would be 7.374. In contrast, CiteScore includes all the documents published in a journal. Data for The Lancet in the 2016 CiteScore showed 5,886 documents (for the three-year window) that received 40,789 citations, resulting in a CiteScore of 6.93. There are more articles in the denominator, but also fewer citations per article, which reinforces the previously mentioned concerns about the transparency of the Journal Impact Factor.
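
For readers who want to retrace the arithmetic, the figures quoted above combine as follows; all numbers are taken directly from the example, and small differences are due to rounding.

```latex
% 2016 Journal Impact Factor of The Lancet, using the JCR citable-item counts
\[
\frac{13\,983 + 13\,759}{309 + 271} = \frac{27\,742}{580} \approx 47.831
\]

% The same citations divided by the PubMed article counts
\[
\frac{27\,742}{1\,992 + 1\,770} = \frac{27\,742}{3\,762} \approx 7.374
\]

% 2016 CiteScore of The Lancet, using all documents in the three-year window
\[
\frac{40\,789}{5\,886} \approx 6.93
\]
```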

Journal coverage is another difference between the two metrics. Almost 11,000 journals versus almost 23,000 journals are figures that speak for themselves. The Web of Science bases its rationale for “why to be selective” on Bradford’s Law.13 Restricting the journal list may have been a reasonable procedure in the 1950s, when computers and computing power had serious limitations, but limiting coverage on the grounds that “this is a well-covered category” is not acceptable in the big-data age. What does “a well-covered category” mean? The 2016 Journal Citation Reports comprises 228 Subject Categories with a median of 62 journals per category. Should Andrology, with 5 journals, be considered a “well-covered category” when Economics has 347? Despite the greater coverage behind CiteScore, its journal selection criteria are not perfect either. Apart from several quality-related criteria (e.g., journal policy, content, regularity), Scopus uses other criteria to re-evaluate journals. Among these, a citation rate or a rate of clicks on scopus.com below 50% of the average in the field would be grounds for exclusion from the Scopus catalog and, subsequently, from the CiteScore analysis.14

This leads to another major difference between the two metrics, one of enormous relevance for our research area: the definition of the Subject Area, as CiteScore calls it, or Subject Category, in Journal Citation Reports terminology. The definition of sister journals is crucial because bibliometric indexes can only be compared among analogous journals, for instance by using the quartile distribution as a simple metric. To do this, each of these metrics classifies journals into categories that represent areas of knowledge and research. The Journal Citation Reports includes pharmacy journals in the Pharmacology & Pharmacy Subject Category, which comprises 257 journals ranging from a 57.000 to a 0.035 Impact Factor in the 2016 edition. This Subject Category has been criticized as heterogeneous because it combines three different research areas (i.e., basic pharmacology, clinical pharmacology, and pharmacy) with very different numbers of journals in each.15 Additionally, journals such as Res Soc Admin Pharm are classified under other subject categories (in this case two: Public, Environmental & Occupational Health; and Social Sciences, Biomedical). Conversely, CiteScore and Scopus created a specific Pharmacy Subject Area, which includes 24 journals ranging from a 3.14 to a 0.00 CiteScore in the 2017 edition. Unfortunately, CiteScore also presents inconsistencies in the definition of this Subject Area. J Pharm Anal, an Elsevier journal with a declared scope covering “all aspects of pharmaceutical analysis”, is classified under six categories, including Pharmacy; meanwhile, Am J Health Syst Pharm is classified under Health Policy and Pharmacology, and Eur J Hosp Pharm Sci Pract under General Pharmacology, Toxicology and Pharmaceutics, but neither is classified under Pharmacy. Due to the yearly update, these errors will not be corrected until the 2019 release of the 2018 CiteScore, and only then if we are able to convince Scopus managers. It seems that pharmacy practice researchers should devote efforts to describing clearly what pharmacy practice research is, and prioritizing pharmacy journals for their publications could be a necessary first step. A quartile distribution such as the one shown in Table 1 can be derived as sketched below.
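
The following is a minimal sketch, in Python, of how journals within a subject category can be ranked on a citation metric and split into quartiles. The journal names and scores are hypothetical, and the simple rank-based split shown here does not reproduce the percentile and tie-handling rules actually applied by Scopus or Clarivate.

```python
# Minimal sketch of a rank-based quartile assignment within a subject category.
# Journal names and scores below are illustrative only; the real Scopus and
# Clarivate procedures rely on percentile ranks and tie-handling rules that
# this simplification does not reproduce.

def assign_quartiles(journals):
    """journals: dict mapping journal title -> metric value (e.g., CiteScore).
    Returns a list of (rank, title, value, quartile), best value first."""
    ranked = sorted(journals.items(), key=lambda item: item[1], reverse=True)
    n = len(ranked)
    results = []
    for position, (title, value) in enumerate(ranked, start=1):
        quartile = 1 + (4 * (position - 1)) // n  # top 25% -> 1, ..., bottom 25% -> 4
        results.append((position, title, value, quartile))
    return results

if __name__ == "__main__":
    example = {  # hypothetical values, not taken from Table 1
        "Journal A": 2.18,
        "Journal B": 1.35,
        "Journal C": 0.87,
        "Journal D": 0.07,
    }
    for rank, title, value, quartile in assign_quartiles(example):
        print(f"#{rank} {title}: {value:.2f} -> Q{quartile}")
```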

As a consequence of the definition of the Pharmacy Subject Area in CiteScore, and the correction of the inappropriate classification of journals in previous years, Pharmacy Practice became a first-quartile journal in the Pharmacy Subject Area (Table 1).16 This is not a merit of the Editorial Board, but rather the result of the combined effort of authors, reviewers, and citers in what we have called a collaborative publishing project.17 So, thank you very much indeed.

Table 1. Pharmacy Subject Area in 2017 CiteScore 

Rank order | Journal | CiteScore | Quartile | 2016 Impact Factor
#1 | Journal of Pharmaceutical Analysis* | 3.14 | 1st quartile | -
#2 | Research in Social and Administrative Pharmacy | 2.18 | 1st quartile | 2.403
#3 | Journal of Managed Care & Specialty Pharmacy | 2.11 | 1st quartile | 1.114
#4 | International Journal of Clinical Pharmacy | 1.58 | 1st quartile | 1.555
#5 | Pharmacy Practice | 1.35 | 1st quartile | -
#6 | International Journal of Pharmacy Practice | 1.02 | 1st quartile | -
#7 | Journal of the American Pharmacists Association | 1.01 | 2nd quartile | 1.241
#8 | GaBI Journal | 0.98 | 2nd quartile | -
- | American Journal of Health-System Pharmacy** | 0.97 | - | 1.969
#9 | Canadian Pharmacists Journal | 0.87 | 2nd quartile | -
#10 | Currents in Pharmacy Teaching and Learning | 0.67 | 2nd quartile | -
#11 | Canadian Journal of Hospital Pharmacy | 0.56 | 2nd quartile | -
#12 | Hospital Pharmacy | 0.40 | 2nd quartile | -
- | European Journal of Hospital Pharmacy** | 0.37 | - | 0.718
#13 | Pharmacy Education | 0.29 | 3rd quartile | -
#14 | Journal of Pharmacy of Istanbul University | 0.28 | 3rd quartile | -
#15 | Journal of Pharmacy Practice and Research | 0.21 | 3rd quartile | -
#16 | Farmatsija | 0.15 | 3rd quartile | -
#17 | Regulatory Rapporteur | 0.08 | 3rd quartile | -
#18 | Clinical Pharmacist | 0.07 | 4th quartile | -
#18 | Klinicka Farmakologie a Farmacie | 0.07 | 4th quartile | -
#18 | Revista Cubana de Farmacia | 0.07 | 4th quartile | -
#21 | U.S. Pharmacist | 0.06 | 4th quartile | -
#22 | Australian Journal of Pharmacy | 0.02 | 4th quartile | -
#23 | Pharmacy Times | 0.01 | 4th quartile | -
#24 | Journal of the Malta College of Pharmacy Practice | 0.00 | 4th quartile | -
- | Journal of Research in Pharmacy Practice | N/A | - | -
- | Journal of Advanced Pharmacy Education and Research | N/A | - | -
- | European Journal of Parenteral and Pharmaceutical Sciences | N/A | - | -
- | Bulletin of Faculty of Pharmacy, Cairo University | N/A | - | -

*Should not be included in this category;

**Should be included in this category

References

1 Borchardt R, Moran C, Cantrill S, Chemjobber, Oh SA, Hartings MR. Perception of the importance of chemistry research papers and comparison to citation rates. PLoS One. 2018;13(3):e0194903. doi: 10.1371/journal.pone.0194903

2 Hall N. The Kardashian index: a measure of discrepant social media profile for scientists. Genome Biol. 2014;15(7):424. doi: 10.1186/s13059-014-0424-0

3 Bornmann L, Haunschild R. Do altmetrics correlate with the quality of papers? A large-scale empirical study based on F1000Prime data. PLoS One. 2018;13(5):e0197133. doi: 10.1371/journal.pone.0197133

4 San Francisco Declaration on Research Assessment. Available at: https://sfdora.org/read/ (accessed Jun 1, 2018).

5 Garfield E. Citation indexes for science; a new dimension in documentation through association of ideas. Science. 1955;122(3159):108-111.

6 Journal Citation Reports. Available at: https://clarivate.com/products/journal-citation-reports/ (accessed Jun 1, 2018).

7 CiteScore: a new metric to help you track journal performance and make decisions. Available at: https://www.elsevier.com/editors-update/story/journal-metrics/citescore-a-new-metric-to-help-you-choose-the-right-journal (accessed Jun 1, 2018).

8 Scopus Content at-a-glance. Available at: https://www.elsevier.com/solutions/scopus/content (accessed Jun 1, 2018).

9 Lancho-Barrantes BS, Guerrero-Bote VP, Moya-Anegón F. What lies behind the averages and significance of citation indicators in different disciplines? J Inf Sci. 2010;36(3):371-382. doi: 10.1177/0165551510366077

10 Rossner M, Van Epps H, Hill E. Show me the data. J Cell Biol. 2007;179(6):1091-1092. doi: 10.1083/jcb.200711140

11 Errors in citation statistics. Nature. 2002;415(6868):101. doi: 10.1038/415101a

12 Liu XL, Gai SS, Zhou J. Journal Impact Factor: Do the Numerator and Denominator Need Correction? PLoS One. 2016;11(3):e0151414. doi: 10.1371/journal.pone.0151414

13 Journal Selection Process. Available at: https://clarivate.com/essays/journal-selection-process/ (accessed Jun 1, 2018).

14 Content Policy and Selection. Available at: https://www.elsevier.com/solutions/scopus/content/content-policy-and-selection (accessed Jun 1, 2018).

15 Minguet F, Salgado TM, Santopadre C, Fernandez-Llimos F. Redefining the pharmacology and pharmacy subject category in the journal citation reports using medical subject headings (MeSH). Int J Clin Pharm. 2017;39(5):989-997. doi: 10.1007/s11096-017-0527-2

16 Health Professions: Pharmacy. Available at: https://www.scopus.com/sources?subject=pharmacy&asjcs=3611 (accessed Jun 1, 2018).

17 Fernandez-Llimos F. Collaborative publishing: the difference between 'gratis journals' and 'open access journals'. Pharm Pract (Granada). 2015;13(1):593. doi: 10.18549/PharmPract.2015.01.593

Received: June 05, 2018; Accepted: June 14, 2018; Published: June 14, 2018

This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY-NC-ND 3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.