
Revista Española de Salud Pública

Online version ISSN 2173-9110 | Print version ISSN 1135-5727

Rev. Esp. Salud Publica vol.90  Madrid  2016  Epub 07-Jun-2021



Metrics in academic profiles: a new addictive game for researchers?

Enrique Orduna-Malea1, Alberto Martín-Martín2, Emilio Delgado López-Cózar2

1 Universitat Politècnica de València. Valencia. Spain.

2 Universidad de Granada. Granada. Spain.


This study aims to promote reflection on, and draw attention to, the potential adverse effects of academic social networks on science. These academic social networks, where authors can display their publications, have become new scientific communication channels, accelerating the dissemination of research results, facilitating data sharing, and strongly promoting scientific collaboration, all at no cost to the user.

One of the features that make them extremely attractive to researchers is the possibility to browse through a wide variety of bibliometric indicators. Going beyond publication and citation counts, they also measure usage, participation in the platform, social connectivity, and scientific, academic and professional impact. Using these indicators they effectively create a digital image of researchers and their reputations.

However, although academic social platforms are useful applications that can help improve scientific communication, they also hide a less positive side: they are highly addictive tools that lend themselves to abuse. By gamifying scientific impact using techniques originally developed for videogames, these platforms may get users hooked, turning academics into addicts and transforming what should only be a means into an end in itself.

Keywords: Bibliometrics; Academic Profiles; Addiction; Gamification; Social Networks; Video Games; Adverse Effects; Research Ethics; Research Behavior.


The number of researchers who use academic profiles and social networks is rapidly increasing.1,2 Since the launch of AMiner in 2006, a pioneering academic profile service, many other actors have released their own platforms: ResearcherID and ResearchGate in 2008, Microsoft Academic Search's profiles in 2009, ImpactStory and Google Scholar Citations in 2011, ORCID in 2012, the new Scopus Author Profiles in 2014, and, most recently, the profiles available in Semantic Scholar, a promising new academic search engine developed by the Allen Institute for Artificial Intelligence and launched in November 2015. These platforms have already started shaping science communication, and they may very well also influence future academic impact assessment. Their current numbers of users (Table 1) bear witness to their widespread acceptance among the global community of researchers.

Table 1 Multidisciplinary academic platforms: documents and profiles

Platform                     Documents      Profiles
Google Scholar               200,000,000    1,500,000
ResearchGate                 100,000,000    10,000,000
Microsoft Academic Search    80,000,000     Deprecated
Mendeley                     114,000,000    4,000,000
Academia.edu                 14,983,516     41,531,184

All data are approximate as of September 2016 and were obtained directly from the official information published by the platforms, with the exception of Google Scholar, whose document and profile counts were estimated through year queries and direct scraping, respectively.
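The year-query estimation mentioned in the note above can be illustrated with a minimal sketch: issue one query per publication year, record the hit count each query reports, and sum the counts. The per-year figures below are hypothetical placeholders, not actual Google Scholar results.

```python
def estimate_total_documents(hits_per_year):
    """Estimate a database's size by summing the hit counts reported
    for one query per publication year (e.g. 1950 through 2016)."""
    return sum(hits_per_year.values())

# Hypothetical hit counts for three publication years:
sample = {2014: 2_500_000, 2015: 2_600_000, 2016: 1_900_000}
print(estimate_total_documents(sample))  # 7000000
```

Such estimates are necessarily rough: reported hit counts fluctuate between queries and duplicate records inflate the totals, which is why all figures in the table are labelled approximate.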

Each of these platforms, while providing the usual synchronous and asynchronous social network features, also specializes to fit its users' interests. One of the most common features of academic social networks is enabling users to upload and share their academic contributions, whether or not they have been formally published and regardless of their type and source. They also make a handful of social tools available to their users (Figure 1): personalized alerts, open peer review, networking through contacts, the possibility to ask and answer academic questions, public and private messaging, and, last but not least, access to a comprehensive monitoring and technological surveillance system. In short, these platforms are a new way to communicate academic activity. They also speed up the dissemination of results, facilitate research data sharing, and encourage widespread scientific collaboration, all at no cost to the user (so far).

Figure 1 New communication features in the new academic platforms (example of some ResearchGate features) 

Most academic profile services and social networks offer a wide battery of author-level metrics (Figure 2), which they usually showcase prominently on their interfaces. These may be divided into five categories: bibliometrics (publications and citations), usage, participation, rating, and social connectivity.3 All user interactions (views, downloads, reads, links, shares, mentions, reviews, embeds, labels, discussions, bookmarks, votes, follows, ratings, citations, etc.) are tracked by the platform and transformed into a variety of indicators, from which researchers can get an idea, in near real time, of the impact their work is having on the scientific and professional communities and the media at large. The impact reflected depends, of course, on the degree to which users engage with the platform and on the variety of metrics available. Authors may see a different reflection of themselves in each platform; each platform may thus be considered an "academic mirror".
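How a platform might collapse those tracked interactions into a single indicator can be sketched as a weighted sum. The event types and weights below are entirely hypothetical; none of the platforms mentioned publishes its actual scoring formula.

```python
# Hypothetical weights, for illustration only; real platforms keep
# their scoring formulas private.
WEIGHTS = {"reads": 0.05, "downloads": 0.1, "citations": 1.0, "follows": 0.25}

def composite_score(events):
    """Collapse per-type interaction counts into one author-level figure.

    `events` maps an interaction type (e.g. "reads") to its count;
    unknown types contribute nothing.
    """
    return sum(WEIGHTS.get(kind, 0.0) * count for kind, count in events.items())

print(composite_score({"reads": 100, "citations": 12, "follows": 4}))  # 18.0
```

The opacity of the real formulas is precisely what makes such scores hard to audit, while their single-number form makes them easy to chase.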

Figure 2 Author metric display in the most important academic profiles (Mendeley, ResearchGate, ResearcherID, and Google Scholar Citations) 

Policy makers, in their eagerness to find objective quantitative measures that relieve them of responsibility for their decisions, may be tempted to use and endorse these metrics.4 We already know how sensitive scientists are to evaluation policies inasmuch as they affect the promotion and reward systems, essential cogwheels in the clockwork of Science.5 The consecration of bibliometric indexes, with the Journal Impact Factor at their head, as the preferred criteria in the evaluation of scientists in Spain since 1989 constitutes a good case in point regarding the behavioural changes induced by these policies.5,6

The compulsive obsession with using the Journal Impact Factor as the sole and indisputable measure of the quality of a scientific work quickly spread through many countries, giving rise to a veritable new disease: impactitis7. This recently led both to a declaration against its use (DORA: the San Francisco Declaration on Research Assessment)8 and to a Manifesto9 setting out best practices for the fair use of bibliometric indicators.


Although these "mirrors" come loaded with metrics for nearly everything, they may also bring about negative effects: they are highly addictive tools that can be abused as if they were drugs.

The first recognizable "bibliometric drug", as we understand it today, was the Journal Impact Factor. Other metric-based drugs, such as the h-index and all its derivatives, came later, and now a new generation of synthetic designer drugs has sprung from academic social networks. These new narcotics, like their predecessors, thrive on satisfying their users' egotistical needs by continuously activating their internal reward mechanisms, as any other addictive drug would. However, the substance has evolved from a single metric into an entire entertaining and immersive environment, similar to a videogame.
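Unlike the opaque platform scores, the h-index mentioned above has a simple public definition: the largest h such that the author has at least h papers with at least h citations each. A minimal sketch:

```python
def h_index(citations):
    """Return the largest h such that at least h of the given papers
    have at least h citations each."""
    h = 0
    # Rank papers from most to least cited; h is the last rank whose
    # paper still has at least `rank` citations.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
```

Its very simplicity is part of its addictive appeal: a single integer that can only go up, inviting constant checking.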

By gamifying researchers' impact through persuasive videogame techniques (scores, achievements, competition, unlocked features, and, soon perhaps, stages, enemies, and extra lives), these platforms aim to get users hooked on scoring reputation "points", competing against one another and against themselves.10

Addicted scholars will not only be more willing to game the system11, but will also find themselves in such a state of dependence and self-absorption that their creativity and productivity (professional dimension) and social relations (personal dimension) might be severely affected. Users may even experience episodes of depression if they feel their metrics are not as good as expected, or when a rival surpasses them or achieves a particularly high score.

The appearance of a new mental disorder (similar to those detected in young people hooked on social networks) should not be ruled out. Researchers compulsively accessing academic social platforms anytime and anywhere, hoping for new downloads, citations, or likes, is a clear symptom that a new academic illness has been born: scholar-ache.

Faced with this scenario, should we warn scientists against the (ab)use of these platforms, or try to prevent it? Should research institutions learn how to treat this new social disease?


1. Van Noorden R. Online collaboration: Scientists and the social network. Nature. 2014;512(7513):126-9.

2. Kramer B, Bosman J. 101 Innovations in Scholarly Communication: the Changing Research Workflow [online]. (Accessed 11-09-2016). Available at: https://101innovations.wordpress.com

3. Orduna-Malea E, Martín-Martín A, Delgado López-Cózar E. The next bibliometrics: ALMetrics (Author Level Metrics) and the multiple faces of author impact. El profesional de la información. 2016;25(3):485-96.

4. Jiménez-Contreras E, Moya Anegón F, Delgado López-Cózar E. The evolution of research activity in Spain: The impact of the National Commission for the Evaluation of Research Activity (CNEAI). Res Policy. 2003;32(1):123-42.

5. Jiménez-Contreras E, Delgado López-Cózar E, Ruiz-Pérez R, Fernández VM. Impact-factor rewards affect Spanish research. Nature. 2002;417(6892):898.

6. Delgado López-Cózar E, Ruiz-Pérez R, Jiménez-Contreras E. Impact of the impact factor in Spain. BMJ. 2007;334.

7. Camí J. Impactolatría, diagnóstico y tratamiento. Med Clin (Barc). 1997;109:515-24.

8. American Society for Cell Biology. San Francisco Declaration on Research Assessment [online]. (Accessed 15-09-2016).

9. Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I. Bibliometrics: The Leiden Manifesto for research metrics. Nature. 2015;520:429-31.

10. Hammarfelt B, de Rijcke S, Rushforth AD. Quantified academic selves: the gamification of research through social networking services. Information Research. 2016;21(2). (Accessed 11-09-2016).

11. Delgado López-Cózar E, Robinson-García N, Torres-Salinas D. The Google Scholar experiment: How to index false papers and manipulate bibliometric indicators. Journal of the Association for Information Science and Technology. 2014;65(3):446-54.

Suggested citation: Orduna-Malea E, Martín-Martín A, Delgado López-Cózar E. Metrics in academic profiles: a new addictive game for researchers? Rev Esp Salud Publica. 2016 Sep 22;90:e1-5.

Received: September 21, 2016; Accepted: September 21, 2016

Correspondence: Emilio Delgado López-Cózar. Universidad de Granada, Campus Cartuja s/n, 18071 Granada, Spain.

Creative Commons License: This is an open-access article published under a Creative Commons license.