Altmetrics: an overview
and evaluation
Ann E. Williams
Department of Communication, Georgia State University, Atlanta, Georgia, USA
Abstract
Purpose – The purpose of this paper is to provide an overview and critique of altmetrics, an understudied yet increasingly important arena of study for scholars, academics, and professional researchers.
Design/methodology/approach – The paper is organized into six parts: the first defines altmetrics; the second examines how altmetrics work; the third presents multiple typologies under which altmetrics can be classified and studied; the fourth details the technological capabilities of altmetrics; the fifth presents a critical evaluation of the "pros and cons" of altmetrics; and, the sixth outlines some directions for future and ongoing research.
Findings – The conclusions detail the strengths and limitations of altmetrics and point toward avenues for continued research and development.
Originality/value – This paper is among the first to provide a substantive review and evaluation of altmetrics for academics to consider when adopting, utilizing, and researching these tools.
Keywords Altmetrics, Research impact, Academic networks
Paper type Research paper
Altmetrics: an overview
The primary purpose of this paper is to provide a description, overview, and evaluation of
altmetrics, an understudied yet increasingly important arena of study for scholars,
academics, and professional researchers. The paper is organized into six parts: the first
defines altmetrics and clarifies the concept of altmetrics in its various forms; the second
examines how altmetrics work; the third presents multiple typologies under which
altmetrics can be classified and studied; the fourth details the technological capabilities of
altmetrics; the fifth presents a critical evaluation of the "pros and cons" of altmetrics; and,
the sixth outlines some directions for future and ongoing research.
What are altmetrics?
Altmetrics are measurements of how people interact with a given scholarly work. They aim
to measure web-driven scholarly interactions, such as how often research is tweeted,
blogged about, or bookmarked (Howard, 2012; Robinson-García et al., 2014). Altmetrics.org
and Altmetric.com are web-based sites that promote the use of altmetrics.
Altmetrics.org is a site created, developed, and maintained by scholars and app designers who share a commitment to the creation and study of new metrics "based on the social web for analyzing, and informing scholarship" (Priem et al., 2010; http://altmetrics.org/about).
Altmetrics.org is concerned with the promotion of altmetric apps (i.e. ImpactStory,
ReaderMeter, ScienceCard, PLoS Impact Explorer, PaperCritic, and Crowdometer).
Altmetric.com is a commercial website, partnered with major publishers, which functions
as an open tool and data provider to supply qualitative and quantitative data that
complements traditional, citation-based measurements. Altmetric.com is concerned with the
promotion and circulation of its products in connection with major academic publishers,
institutions, and funders (e.g. Taylor & Francis, Wiley, The London School of Economics,
and the Smithsonian).
The author would like to thank Seifu Adem for his research assistance.
The rise of altmetrics
Technological changes of the 1990s and mid-2000s, characterized by the development of the
social web and internet-based social networking, enabled librarians to make many scholarly
works more openly and widely accessible to researchers and higher education communities
(Dutta, 2016). The creation and popularization of altmetrics, both the measures and the websites, is a result of the growth of communication technology, especially social networking
sites such as Facebook and Twitter. As social networking provided new opportunities for
scholars to disseminate their research, new methods for capturing and calculating the
networked impact of scholarly publications became increasingly important (Dutta, 2016).
Proponents of altmetrics identify several factors that necessitated their emergence and will
precipitate their continued use. According to Galligan and Dyas-Correia (2013), one factor is the
limitation of existing measures of social, public, and/or "real-world" impact of research. For example, traditional measures such as bibliometrics that rely on peer review, citation counts, and journal impact factors capture "only the most academically significant and theoretically relevant material from the huge volume of scholarly literature produced" (Galligan and Dyas-Correia, 2013,
p. 56). A primary factor underlying the creation and continuance of altmetrics is that traditional
citation-based measures focus solely on journals and articles, but do not account for other work
outputs such as blogs and datasets. Altmetrics, in contrast, fill this void by measuring the impact
of journal articles through social media activity, while also accounting for other forms of
significant research output that fall outside the parameters of traditional peer-reviewed
publications. In this way, altmetrics enable the discovery of new information about research
impact that was previously difficult to obtain; and, they allow researchers to discern the impact of
their work at a faster rate than traditional metrics, such as citation counts and journal impact
factors that accrue more slowly. Some even suggest that the benefits of altmetrics may be far
greater than the benefits of other metrics. Most notably, altmetrics allow for valuable forms of
crowdsourcing that tap the immediate value of research within networked environments. Galligan
and Dyas-Correia (2013) note this trend, suggesting that through crowdsourcing "an article's impact can almost immediately be assessed by multiple bookmarks and conversations" (p. 57).
How do altmetrics work?
Supported by Digital Science, altmetrics aggregate information from different sources. Via
Altmetric.com, these include peer reviews, references on Wikipedia, public policy documents,
discussions on research blogs, mainstream media coverage, bookmarks on reference managers
like Mendeley, and social media. According to Melero (2015), Altmetric.com gathers data from
three main sources: social media, traditional media, and online reference managers such as
Mendeley. While Altmetric.com does not collect data from all online platforms, they do draw
from a wide array of sources, including blogs, news, Reddit, Facebook, Google Plus, Pinterest,
Twitter, Stack Exchange, CiteULike, Connotea, Mendeley, F1000, YouTube, LinkedIn Groups,
Research Highlights, and miscellaneous others (Robinson-García et al., 2014).
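As a rough illustration of how these aggregated data can be retrieved programmatically, the short Python sketch below queries Altmetric.com's public endpoint for a single DOI (here, the DOI of this article) and prints a few of the returned counts. It is a minimal sketch only: the endpoint path and the JSON field names (score, cited_by_tweeters_count, readers, and so on) are assumptions based on the publicly documented API rather than details reported in this paper, and heavier use requires an API key and respect for rate limits.

    import requests

    def fetch_altmetric_record(doi):
        """Query the public Altmetric endpoint for one DOI.

        Returns the parsed JSON record, or None when the DOI has no
        recorded attention (the endpoint answers 404 in that case).
        """
        url = "https://api.altmetric.com/v1/doi/{}".format(doi)
        response = requests.get(url, timeout=10)
        if response.status_code == 404:
            return None  # no mentions tracked for this output
        response.raise_for_status()
        return response.json()

    if __name__ == "__main__":
        record = fetch_altmetric_record("10.1108/OIR-10-2016-0294")
        if record is None:
            print("No attention data recorded for this DOI.")
        else:
            # Field names below are assumptions about the JSON payload.
            print("Attention score:", record.get("score"))
            print("Tweets:", record.get("cited_by_tweeters_count", 0))
            print("News stories:", record.get("cited_by_msm_count", 0))
            print("Blog posts:", record.get("cited_by_feeds_count", 0))
            print("Mendeley readers:", record.get("readers", {}).get("mendeley", 0))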
Typologies/classifications of altmetrics
As measurement tools, altmetrics are classified based on the function they provide and the type
of engagement users have with a given research output. For example, Robinson-García et al.
(2014) categorize different types of altmetrics by their primary functions: discussions, mentions,
readers, reviews, video, and citations. Sources that host discussions include blogs, news, Reddit,
and Stack Exchange. Sites such as Facebook, Twitter, Google Plus, Pinterest, and LinkedIn Groups are sources of mentions. Outlets such as CiteULike, Connotea, and F1000 provide reviews. YouTube illustrates impact through videos. And Research Highlights provides full citations.
In extension of typologies based on functionality, others have conceptualized altmetrics in terms of use (Wouters and Costas, 2012) and engagement (Lin and Fenner, 2013). The following two tables summarize the classifications by Lin and Fenner (2013) and Wouters and Costas (2012), respectively (Tables I and II).

Table I. Lin and Fenner's engagement-based classification (Source: Lin and Fenner, 2013)
Viewed: accessing the article online
Saved: saving articles in online bibliography managers, which helps researchers to organize papers
Discussed: discussion of the research described in an article, ranging from a short comment shared on Twitter to more in-depth comments on blog postings
Recommended: endorsing the research article via a platform such as an online recommendation channel
Cited: formation of a citation to an article published in a scientific journal

Table II. Wouters and Costas' use-based classification (Source: Wouters and Costas, 2012)
Diversity of channels analyzed: altmetrics enable the user to analyze different types of materials, such as books, blogs, Facebook postings, or tweets
Speed of acquiring/retrieving data: unlike traditional citation-based metrics, altmetrics are instantly available for analysis
Openness of method: altmetric data are open to download and free to use
Ability to measure impact beyond the scholarly realm: altmetrics' accessibility to different groups of readers, such as researchers, professionals, students, and interested publics, helps research escape the judgment of scientific readers alone
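To make the functional typology concrete, the following Python sketch encodes the Robinson-García et al. (2014) source-to-function mapping described above as a plain dictionary and uses it to label individual sources. The helper function and the placement of Mendeley under "readers" are illustrative assumptions, not part of any existing altmetrics tool.

    # Functional typology of altmetric sources, after Robinson-García et al. (2014).
    # Mendeley is placed under "readers" as an assumption; the text above does not
    # name the sources for that category explicitly.
    SOURCE_FUNCTIONS = {
        "blogs": "discussion",
        "news": "discussion",
        "reddit": "discussion",
        "stack exchange": "discussion",
        "facebook": "mention",
        "twitter": "mention",
        "google plus": "mention",
        "pinterest": "mention",
        "linkedin groups": "mention",
        "mendeley": "readers",
        "citeulike": "review",
        "connotea": "review",
        "f1000": "review",
        "youtube": "video",
        "research highlights": "citation",
    }

    def classify_source(source_name):
        """Return the primary function of a mention source, or 'other'."""
        return SOURCE_FUNCTIONS.get(source_name.strip().lower(), "other")

    print(classify_source("Twitter"))   # mention
    print(classify_source("F1000"))     # review
    print(classify_source("Tumblr"))    # other (not covered, as noted later)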
What are the technological capabilities of altmetrics?
Altmetric.com assigns scores to an article by calculating how often the work is mentioned on different media platforms. The popularity of the article is, therefore, based on how often it is referenced in these sources. In addition to frequency of mentions, the altmetrics provided via Altmetric.com include a record of attention, a measure of dissemination, and an indicator
of influence and impact. As a record of attention, Altmetric.com provides information on the
reach of a scholarly work, i.e., how many people discuss the piece of research. As a measure
of dissemination, Altmetric.com maps the location of the mention (where), and the reason
(why) an article has been shared and discussed. As an indicator of influence and impact,
Altmetric.com also provides a vehicle to capture how research can influence society-at-large.
In these ways, Altmetric.com has unique capabilities to measure the impact of different
research outputs, in terms of usage (downloads and views), peer-review (expert opinion),
citations, storage, links, bookmarks, and conversations.
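The weighting Altmetric.com applies to each source is not reported in this paper, so the sketch below illustrates only the general frequency-of-mention logic with deliberately invented weights: each source type's mention count is multiplied by a placeholder weight and summed. It shows the shape of such a calculation, not Altmetric.com's actual scoring.

    # Hypothetical, frequency-based attention score. The weights are invented
    # for illustration; Altmetric.com's real weighting is not described here.
    HYPOTHETICAL_WEIGHTS = {
        "news": 8.0,
        "blogs": 5.0,
        "twitter": 1.0,
        "facebook": 0.25,
    }

    def toy_attention_score(mention_counts):
        """Sum mentions per source, scaled by the invented source weights."""
        return sum(
            HYPOTHETICAL_WEIGHTS.get(source, 0.0) * count
            for source, count in mention_counts.items()
        )

    # Example: 2 news stories, 1 blog post, 30 tweets, 10 Facebook posts.
    print(toy_attention_score({"news": 2, "blogs": 1, "twitter": 30, "facebook": 10}))
    # prints 53.5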
Key functionalities
In order to track and collate the amount of attention a scholarly output receives, Altmetric.com
examines three essential components: an output (journal article, dataset, etc.); an identifier
attached to the output (DOI, RePEc, etc.); and mentions in a source. Once the algorithm collects
mentions of an article, the site displays the information on its display page.
The display page also has several functionalities that provide information about an article.
The following are descriptions of the key functionalities that appear on the display page:
(1) The donut and attention score: this provides information on the types and the
amount of attention the research output has received.
(2) Summary counts: this shows how many people from each source type (e.g. Facebook, Twitter, F1000) have mentioned or shared the work (see the sketch after this list).
(3) Bibliographic details: this provides information on the names of authors, titles, dates
of publication, and other identifiers.
(4) Browse the original mentions: this function enables users to get access to sources
that mention the primary research, i.e., links to news stories, blogs, etc.
(5) Sign up for alerts: this asks users to register in order to receive e-mail alerts
regarding the articles they select.
(6) Access the published research: this function enables users to get access to the
research output.
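The sketch below shows how several of the display-page elements just listed (bibliographic details, summary counts per source, and the link to the full details page) might be assembled in plain text from a previously fetched Altmetric record, for example the one returned by the fetch helper sketched earlier. The field names are again assumptions about the JSON payload rather than details documented in this paper.

    def summarize_display_page(record):
        """Render a plain-text version of the display-page elements above.

        `record` is assumed to be the JSON payload returned by the Altmetric
        endpoint; the field names used here are assumptions.
        """
        lines = [
            "Title:   {}".format(record.get("title", "unknown")),
            "Journal: {}".format(record.get("journal", "unknown")),
            "Score:   {}".format(record.get("score", 0)),
            "Summary counts:",
            "  tweets: {}".format(record.get("cited_by_tweeters_count", 0)),
            "  news:   {}".format(record.get("cited_by_msm_count", 0)),
            "  blogs:  {}".format(record.get("cited_by_feeds_count", 0)),
            "Details page: {}".format(record.get("details_url", "n/a")),
        ]
        return "\n".join(lines)

    # Usage, reusing the fetch helper sketched earlier:
    # print(summarize_display_page(fetch_altmetric_record("10.1108/OIR-10-2016-0294")))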
The details page also provides demographics and rankings as well as a map of
the geographical locations of users that illustrates the amount of attention a
research article receives in comparison with other research outputs published within a
similar time period and topical domain. The following screenshot of the display page
shows how Altmetric.com summarizes the attention a piece of research receives from
audiences (Figure 1).
Figure 1. Altmetric.com data map
The pros and cons of altmetrics
Pros
Although altmetrics are not a replacement for traditional measures in discerning the impact of
research, they do complement traditional measures. Speed is an asset of altmetrics that
traditional measures lack. Altmetrics enable users to have a quick view of the impact of a
research output (Melero, 2015). They allow users to get fast information about how many
times an article is mentioned or discussed by people in different sources. In addition to the
quick view of a research output's impact, altmetrics also enable the swift dissemination of an article. Due to the speed at which people use social media such as Twitter, it is easier to follow and organize references to scholarly work immediately after publication (Melero, 2015).
Altmetrics are also advantageous in that they have a wide range of applications that help track researchers' scholarly outputs, including data sharing, software, and presentations. Furthermore, altmetrics provide opportunities that researchers cannot get through other measures, including discussion of works in progress and unpublished articles.
Cons
Although altmetrics are a useful data source, there are some limitations as well.
First, they are only complementary to, not a replacement for, traditional data such as
citation-based measurements.
Second, altmetrics can be compromised by subversive means or gaming, i.e. "the practice of unscrupulously exploiting a system or set of data in order to produce results that fit a user's desired outcome" (Roemer and Borchardt, 2015, p. 20). This concern arises in environments where data can be artificially manipulated. For example, one can get a "Like" on Facebook from close friends, family members, etc. to promote one's work, which may reflect a form of personal or public impact but may not be a valid measure of scholarly impact.
The third limitation of altmetrics concerns the lack of a demonstrated correlation between bibliometric and altmetric data (Roemer and Borchardt, 2015): there is no conclusive research evidence that altmetric indicators correlate with citation-based indicators.
Fourth, Altmetric.com's inclusion of data from public social media like Facebook and
Twitter is a potential problem. According to Roemer and Borchardt (2015):
This concern leads to what is perhaps an even more relevant criticism of the inclusion of metrics
from non-academic-peer networks that networks primarily populated by members of the general
public are much less likely to be interested in esoteric fields of research than in research that
connects to popular topics of discussion like climate change.
A fifth limitation is that Altmetric.com does not currently include all possible sources in
which a scholarly work is mentioned and may omit or misidentify scientific research
(Robinson-García et al., 2014). For example, while Altmetric.com gathers data from Twitter,
it does not include mentions in Tumblr.
A sixth limitation is that Altmetric.com's circulation data only include impact scores for articles published in English. For example, while Altmetric.com gathers mentions on Facebook, it does not collect mentions on the Spanish network Tuenti (Robinson-García et al., 2014).
The seventh issue with Altmetric.com relates to its representativeness. For instance, Robinson-García et al. (2014) observed that 95.5 percent of the data gathered are derived from only five sources, namely Twitter, Mendeley, Facebook, CiteULike, and blogs.
The eighth limitation of Altmetric.com is a lack of clarity in the definition and
interpretation of what the metrics mean and what they do. For instance, it is difficult to
claim that mentions, recommendations by experts, reader counts, likes, and citations on
Twitter, F1000, Mendeley, Facebook, and blog posts have a common and universally
understood meaning (Haustein, 2016).
Table III summarizes some of the limitations, i.e. the "cons," of altmetrics.

Table III. The limitations of altmetrics
Not citation-based: altmetrics are only complementary to traditional citation metrics and do not replace citation-based data such as bibliometrics
Gaming: data can be manipulated to fit a user's desired outcome
Lack of significant correlation with bibliometric data: there is no conclusive research evidence that documents a correlation between altmetric indicators and citation-based indicators
Inclusion of public social media: general publics may be less interested in academic research outputs and more interested in popular topics
Lack of common definitions: it is difficult to define activities such as mentions on Twitter, "likes" on Facebook, and recommendations by experts on F1000 as sharing similar meaning
Heterogeneity of social media platforms and users' motivations: platforms such as Facebook, Twitter, and F1000 host a wide array of users, with different motivations and use behaviors, that may not be directly comparable and/or uniformly impactful
Lack of conceptual frameworks and theories: scholars have yet to fully theorize and conceptualize altmetrics
Data quality: unlike other measures such as bibliometrics, where data can be triangulated, altmetric data are dynamic, in that they can be deleted or altered, and may therefore lack consistency, accuracy, and replicability
Lack of inclusiveness: altmetrics do not include data from all digital media platforms
Language bias: Altmetrics.org only collects data on research written in English; for example, while data are collected from Facebook, mentions on the Spanish network Tuenti are not collected
Directions for future research
This research report provides fundamental conceptualizations and critiques of altmetrics
that open the door for continued exploration and research in this domain. Two areas that are
of particular and immediate importance include:
(1) Technical issues. Altmetric.com, and the apps available via Altmetrics.org, do not
currently have the capability of collecting data on scholarly outputs produced in
languages other than English. In the future, design elements of these sites could be
enhanced by the inclusion of data from a broader range of sources and the creation
of apps that gather data from multilingual texts.
(2) Theoretical issues. Since the study of altmetrics is a new field, with development ongoing,
there is not a definitive theory-based explanation as to the mechanisms underlying their
emergence and continued growth. In the future, theory-driven approaches to studying
altmetrics will be of value to academic and scholarly communities.
In connection with the foundational conceptualizations of altmetrics presented in this study,
research that explores academics' adoption and use of altmetrics over time will be essential
to uncovering the long-term impact and value that alt-measurements bring to scholarly
communities. From a user perspective, the advantages of altmetrics identified in this report
can be leveraged to heighten the dissemination and reach of academic research, particularly
in domains that provide research to the general public. From a design perspective, the
current limitations of altmetrics should be considered as new and improved tools are
developed and unveiled. The future of altmetrics is a fruitful area of study for designers,
academics, and researchers to continue to explore.
References
Dutta, B. (2016), "Altmetric manifesto completes five years (2010-2015)", Current Science, Vol. 110 No. 1, p. 17.
Galligan, F. and Dyas-Correia, S. (2013), "Altmetrics: rethinking the way we measure", Serials Review, Vol. 39 No. 1, pp. 56-61.
Haustein, S. (2016), "Grand challenges in altmetrics: heterogeneity, data quality and dependencies", Scientometrics, Vol. 108 No. 1, pp. 413-423, doi: 10.1007/s11192-016-1910-9.
Howard, J. (2012), "Scholars seek better ways to track impact online", The Chronicle of Higher Education, January 29, available at: www.chronicle.com/article/As-Scholarship-Goes-Digital/130482/ (accessed October 1, 2016).
Lin, J. and Fenner, M. (2013), "Altmetrics in evolution: defining and redefining the ontology of article-level metrics", Information Standards Quarterly, Vol. 25 No. 2, pp. 20-26.
Melero, R. (2015), "Altmetrics: a complement to conventional metrics", Biochemia Medica, Vol. 25 No. 2, pp. 152-160.
Priem, J., Taraborelli, D., Groth, P. and Neylon, C. (2010), "Altmetrics: a manifesto", available at: http://altmetrics.org/manifesto (accessed October 1, 2016).
Robinson-García, N., Torres-Salinas, D., Zahedi, Z. and Costas, R. (2014), "New data, new possibilities: exploring the insides of altmetric.com", El Profesional de la Información, Vol. 23 No. 4, pp. 359-366, doi: 10.3145/epi.2014.jul.03.
Roemer, R.C. and Borchardt, R. (2015), "Issues, controversies, and opportunities for altmetrics", Library Technology Reports, Vol. 51 No. 5, pp. 20-30.
Wouters, P. and Costas, R. (2012), Users, Narcissism and Control: Tracking the Impact of Scholarly Publications in the 21st Century, SURF-foundation, Utrecht.
Corresponding author
Ann E. Williams can be contacted at: annwilliams@gsu.edu