Overview

The visibility of scientific articles plays an important role in how discoveries are disseminated and how they impact the field by shaping current thinking and future priorities. Academic journals have been major drivers of article visibility and impact since the 17th century. While the number of mentions an article receives in the scientific literature—reflected by citation counts—is one obvious metric of article impact, over the last three decades the impact factor (IF) has emerged as the most common metric used to evaluate the quality of journal content and the impact of individual articles and their authors. The IF metric now has a major influence on many fields of science, including those germane to the mission of the American College of Neuropsychopharmacology (ACNP).

Since 1975, IFs have been calculated annually for journals listed in Clarivate Analytics’ Journal Citation Reports. A journal’s IF reflects the average number of citations that articles recently published within that journal have received. For a given year, the IF is calculated by dividing the total number of citations received in that year for articles published in the two previous years (numerator) by the total number of articles (source materials) published in those two years (denominator). As an example, the 2017 IF for Neuropsychopharmacology (NPP) was 6.544, which indicates that articles published in 2015 or 2016 received an average of 6.544 citations in 2017. The NPP impact factor is a blended representation of source materials that appear in NPP as well as those that appear in Neuropsychopharmacology Reviews (NPPR), a special issue that is published once per year and features thematically linked review articles. The NPP “master brand” includes both of these publishing venues.
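To make the arithmetic concrete, the following minimal sketch (in Python) computes an IF from hypothetical counts; the citation and article totals are placeholders chosen for illustration, not NPP’s actual 2015–2016 publication or citation figures.

```python
# Minimal sketch of the impact-factor arithmetic described above.
# Both counts are hypothetical placeholders, not NPP's actual figures.

citations_2017_to_2015_2016_items = 1500   # numerator: citations received in 2017
source_items_2015_2016 = 229               # denominator: source materials from 2015-2016

impact_factor_2017 = citations_2017_to_2015_2016_items / source_items_2015_2016
print(round(impact_factor_2017, 3))        # ~6.55 with these made-up counts
```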

The IF metric is proprietary, and some elements of the calculation are not public knowledge. As an example, the definition of what is counted as an article—more formally, what qualifies as source material for inclusion in the denominator—remains somewhat opaque. A general assumption is that short articles that lack abstracts and have small numbers of references (e.g., Editorials, Research Highlights, News and Views) are not counted as source materials in the denominator, even though citations to them are counted in the numerator. As such, these “front-half” materials are often considered a way of diversifying content, and they are becoming increasingly prominent features of many journals. The strengths and limitations of IF as a metric and its influence on investigator career trajectories have become the focus of considerable debate in ACNP-related fields (e.g., https://www.ncbi.nlm.nih.gov/pubmed/30647909), but these topics are beyond the scope of this article.

Altmetrics—a term proposed in 2010—has emerged as a newer, less traditional metric that can be used to reflect an article’s impact. Altmetrics scores are calculated for individual articles (rather than for the journal that published the article); scores are based upon an automated (and proprietary) algorithm that weights online attention derived from a variety of sources. These sources range from news platforms to blogs, Twitter, Facebook, Wikipedia, F1000, YouTube, Reddit, LinkedIn, Google+, and others. The contributions of various sources are assigned weights based on attention indices, and can be further tiered based on prominence; in the case of news outlets, for example, a mention in “The New York Times” will increase a score more than a mention in “The Marijuana High Times”. The contributions of other sources, such as Wikipedia, are static, such that a mention always increases the score by a fixed amount (e.g., a mention in Wikipedia increases the Altmetrics score by 3). While still evolving, Altmetrics can be conceptualized as a movement toward evaluating the actual usage of an individual article, whereas IF is less reflective of the usage of an individual article and instead indicates how collections of articles published in a particular journal perform as a unit.
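As an illustration of how such a weighted score might be assembled, the sketch below sums source-specific weights over an article’s mention counts. Except for the fixed Wikipedia increment of 3 noted above, the weights are assumptions made for the example; the real Altmetrics algorithm is proprietary.

```python
# Illustrative weighted-attention score in the spirit of the Altmetrics
# algorithm described above. All weights except the Wikipedia increment
# of 3 are assumptions; the tiering of news outlets is collapsed into
# two example weights.

ASSUMED_WEIGHTS = {
    "news_major_outlet": 8.0,   # e.g., a national newspaper (assumed)
    "news_minor_outlet": 3.0,   # e.g., a niche outlet (assumed)
    "blog": 5.0,                # assumed
    "twitter": 0.25,            # assumed
    "wikipedia": 3.0,           # fixed increment stated in the text
}

def attention_score(mentions: dict) -> float:
    """Sum source-specific weights over one article's mention counts."""
    return sum(ASSUMED_WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

# Example: one major news story, 40 tweets, and one Wikipedia mention.
print(attention_score({"news_major_outlet": 1, "twitter": 40, "wikipedia": 1}))  # 21.0
```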

In recognition of the growing importance of comprehensively quantifying the performance of investigators, their articles, and the journals in which they are published, NPP recently embedded article metrics on its individual article webpages. NPP also established a Social Media Editor role in 2017 to promote journal content, engage the NPP readership and ACNP community, and nurture dialogs among the journal, investigators, and the lay community.

In the case of NPP, the relationships among Altmetrics scores, social media engagement, and citation counts are still poorly characterized and have not been explored in depth. Here, we investigated whether Altmetrics scores—or the online attention they track—predict or otherwise affect citation counts. As a starting point in our efforts to characterize these relationships quantitatively, we examined content published in print in NPP in 2017, the most recent year for which a full data set is available while still allowing sufficient time for citation counts to mature (i.e., all papers have had at least one full year from the time of publication in print to accumulate citations). Our goal was to determine whether Altmetrics, or some of the individual sources that drive it, are influential in promoting article visibility and citations, in order to guide strategies for strengthening the journal and maximizing readership engagement.

Methods

Data analysis was restricted to NPP content published in print in the calendar year 2017. The data were collected using two procedures designed to provide maximal insight into the relationship between online attention (defined here as Altmetrics scores, mentions on Twitter, or mentions in the news media) and citation counts. In the first procedure, articles were sorted by Altmetrics score, and online attention metrics were collected for the top 50% highest-scoring NPP articles. The investigation was limited to the top 50% (118 of 235 articles) because Altmetrics scores and citation counts trended toward 0 for articles below this cutoff, potentially introducing skew that could confound our analyses. In addition, the top 50% is a data set that can be collected for journals of any size—as opposed to the top 100 articles, for example—enabling year-to-year tracking within NPP or comparisons with other journals. Individual article titles were then cross-referenced against the Institute for Scientific Information (ISI) Web of Science search engine (http://www.webofknowledge.com) to determine their total number of citations.

In a second, separate procedure, we instead sorted articles by citation count, in recognition that there might not be complete (or even substantial) overlap between the two sorted populations. Using the ISI Web of Science search engine, the top 50% most-cited NPP articles published in 2017 were identified, and individual article titles were then cross-referenced against the NPP search feature (https://www.nature.com/npp/#search-menu) to collect online attention metrics (Altmetrics scores, Twitter mentions, and news media mentions).
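Schematically, the two sorting procedures reduce to the steps sketched below. This is a hedged illustration only: it assumes the 2017 records sit in a small table with hypothetical column names (doi, altmetric_score, citations), whereas the actual collection was performed manually against the NPP site and the ISI Web of Science.

```python
# Sketch of the two sorting procedures on a tiny made-up table; the real
# 2017 data set contained 235 source materials, of which the top 118 were kept.

import pandas as pd

articles = pd.DataFrame({
    "doi":             ["a", "b", "c", "d", "e", "f"],   # hypothetical identifiers
    "altmetric_score": [120,  45,  30,  12,   5,   1],
    "citations":       [ 14,   3,  22,   7,   1,   0],
})

def top_half(df: pd.DataFrame, by: str) -> pd.DataFrame:
    """Return the top 50% of rows ranked by the given column, rounding up."""
    n = (len(df) + 1) // 2   # e.g., 118 of 235 source materials
    return df.sort_values(by, ascending=False).head(n)

top_by_altmetrics = top_half(articles, "altmetric_score")   # procedure 1
top_by_citations  = top_half(articles, "citations")         # procedure 2

# Overlap between the two sorted populations (58 of 118 articles in the 2017 data)
overlap = set(top_by_altmetrics["doi"]) & set(top_by_citations["doi"])
print(sorted(overlap))
```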

In the initial iteration of the collection processes, only articles qualifying as source materials (i.e., original content for which an abstract was available) were included. The data for NPP “front-half” content (i.e., content that is not associated with an abstract, including Editorials, Perspectives, Commentaries, Research Highlights, and “Hot Topics” pieces—36 in total) were then collated separately using identical procedures. Because these metrics accumulate continuously over time, Altmetrics scores and citation counts were harvested within a single 24-h window.

For both sorting procedures, Pearson’s product-moment correlation was used to quantify the relationships between each online attention metric (Altmetrics scores, Twitter mentions, and news media mentions) and citation counts.
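In code, this analysis step amounts to the minimal sketch below, shown here on small made-up vectors rather than the actual top-50% data sets; SciPy is used only as one convenient implementation of the Pearson product-moment correlation.

```python
# Pearson product-moment correlation on made-up vectors, standing in for
# the per-article online attention metrics and citation counts.

from scipy.stats import pearsonr

tweets    = [42, 5, 17, 88, 3, 60, 12, 29]   # hypothetical Twitter mentions
citations = [10, 2,  6, 15, 1, 11,  4,  7]   # hypothetical citation counts

r, p = pearsonr(tweets, citations)
print(f"r = {r:.2f}, variance explained (r^2) = {r * r:.2f}, p = {p:.4f}")
```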

Results

Citations when sorted by top 50% Altmetrics scores of NPP source materials

When source materials were sorted by Altmetrics scores, there was no statistically significant relationship between their Altmetrics score and the number of citations they received (Fig. 1a). In contrast, mentions on Twitter significantly correlated with total number of citations, accounting for ~26% of the variance in citation count (Fig. 1b). Mentions on news media platforms had no statistically significant relationship with the number of citations (Fig. 1c).

Fig. 1

Citations and online attention metrics when content is sorted by top 50% Altmetrics-scoring NPP source materials (Original reports and Reviews) published in print in 2017. a Correlation between Altmetrics scores and the total number of citations, as identified on the ISI Web of Science. b Correlation between mentions on Twitter and the total number of citations. c Correlation between mentions on news platforms and the total number of citations

Altmetrics scores when sorted by top 50% most-cited NPP source materials

Of 118 NPP source materials examined, only 58 articles (49%) were represented in both the top 50% Altmetrics-scoring and top 50% most-cited lists. Despite this partial overlap, similar correlational relationships emerged for both data sets. There was no statistically significant relationship between Altmetrics score and citation counts when source materials were sorted by the total number of citations (Fig. 2a). However, a statistically significant relationship emerged between Twitter mentions and citation counts, such that Twitter mentions accounted for ~26% of the variance in the total number of citations (Fig. 2b). No significant relationship was observed between news media mentions and citation counts, as was the case with the other sorting procedure (Fig. 2c).

Fig. 2

Citations and online attention metrics when content is sorted by top 50% most-cited NPP source materials (Original reports and Reviews) published in print in 2017. a Correlation between Altmetrics scores and the total number of citations, as identified on the ISI Web of Science. b Correlation between mentions on Twitter and the total number of citations. c Correlation between mentions on news platforms and the total number of citations

Citations and Altmetrics scores for NPP front half

Similar to source materials, there was no significant relationship between Altmetrics scores and citations for front-half content (Fig. 3a), although the total number of citations for front-half content was far lower overall. In contrast to source materials, there was no significant relationship between Twitter mentions and citation counts for front-half content (Fig. 3b). However, a statistically significant relationship did emerge between news mentions and the total number of citations (Fig. 3c), such that news mentions accounted for ~13% of the variance in citation counts.

Fig. 3

Citations and online attention metrics when content is restricted to front-half NPP materials (Editorials, Perspectives, Commentaries, and Hot Topics) published in print in 2017. a Correlation between Altmetrics scores and the total number of citations, as identified on the ISI Web of Science. b Correlation between mentions on Twitter and the total number of citations. c Correlation between mentions on news platforms and total number of citations

Summary and path forward

As of 2017—a time point that might be considered still early in the uptake of Altmetrics—there was no statistically significant correlation between Altmetrics online attention scores and citations for NPP content of any kind (source materials or front-half materials). There were, however, statistically significant relationships between Twitter mentions of NPP source content and citation counts, regardless of whether content was sorted by top 50% Altmetrics scores or top 50% most-cited articles. As is inherent in the use of correlation statistics, the direction of any cause–effect relationship cannot be established. However, two possible narratives quickly emerge: (1) Twitter mentions predict citation counts that would occur regardless (i.e., no causal relationship exists), or (2) Twitter mentions serve as a driving force behind citation counts (i.e., a causal relationship exists). Considering that Altmetrics tend to accumulate soon after articles become available, whereas time lags are required before articles become integrated into the literature and begin generating citations, a meaningful reverse relationship—citations driving Altmetrics or Twitter mentions—seems unlikely. Narratives (1) and (2) both seem plausible, and in fact both may contribute to outcomes in a bidirectional manner. However, our findings raise the possibility that NPP social media activities may be increasing content visibility and citation counts, a by-product of which would be an elevated journal IF.

Surprisingly, there was no statistically significant correlation between news reports and citations of NPP source materials, suggesting that the main benefit of news coverage is to enhance name recognition of the ACNP, its members, and its journal among the general public, rather than to serve as a feature that can affect article visibility and IF within the scientific community. In general, the influence of front-half materials appears modest, and these article types may therefore be better conceptualized as ways of increasing readership engagement with the journal, cultivating loyalty, and providing venues for professional development for early-career investigators. While there is a statistically significant relationship between news mentions and citations for front-half materials, the range of citation counts is narrow and the number of data points is small, complicating interpretation.

We acknowledge that the relationships among online attention metrics—particularly social media—and citations may differ for other journals and may change over time as Altmetrics become more established. Going forward, we will continue to monitor the NPP data sets over the course of the next year to determine if and how the passage of time affects the relationships among online attention metrics, citations, and impact factor, both within and between individual years. We anticipate that as Altmetrics and social media continue to evolve, their true influence on journal function will become clearer.

Funding and disclosure

All of the authors have roles at NPP: CJJ is an Editorial Intern, GNN is the Social Media Editor, and WAC is the Editor-in-Chief.