Managing Scientific Information and Research Data
Copyright © 2015, Svetla Baykoucheva. Published by Elsevier Ltd. All rights reserved.
14 Measuring attention: social media and altmetrics
Very simple was my explanation, and plausible enough—as most
wrong theories are!
H.G. Wells (The Time Machine, 1895)
14.1 Introduction
We have heard of alternative music, alternative medicine, alternative energy, and many
other “alternative” things. There are even “alternative mutual funds.” In the academic
and publishing world, we now often hear about alternative metrics, or altmetrics. Why
are so many people and organizations interested in this new field? Altmetrics monitors
attention to scholarly output in the form of mentions on social media sites, scholarly
activity in online libraries and reference managers, and comments posted on scientific
blogs. Who talked about that paper? How did they engage? How can the impact of
their activity be measured?
Traditional methods of evaluating the impact of research were based on citations in
peer-reviewed journals. The journal impact factor (IF), the most widely used measure
of scientific impact, was developed in a print environment, but alternative indicators
have also been studied in fields such as webometrics and bibliometrics (Baynes,
2012; Corrall et al., 2013).
After scholarly communication shifted mostly online, a new approach to evaluating
research was needed. As stated in a document published by the NISO
Alternative Assessment Metrics (Altmetrics) Project, “While citations will remain an
important component of research assessment, this metric alone does not effectively mea-
sure the expanded scope of forms of scholarly communication and newer methods of on-
line reader behavior, network interactions with content, and social media” (NISO, 2014).
Authors like to know who is looking at their work and what other people think
about it. Why make the effort to publish if you do not care what others think of
your work, or whether they see its impact? This interest in other people's
opinions has reached new heights with the development of advanced Internet technologies
and the emergence of social media. What people say about an article is
increasingly viewed as an indicator of interest. Many journals now display
alternative metric information at the article level. Criteria such as the number of "likes" and
the number of Twitter followers are viewed by some as measures of research impact
(Holmberg and Thelwall, 2014; Kraker, 2014; Osterrieder, 2013; Sud and Thelwall,
2014). An article that examined how often Twitter was used to disseminate infor-
mation about journal articles in the biomedical field concluded that the correlation
between tweets and citations was very low (Haustein et al., 2014b).
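The strength of such a relationship is typically reported as a rank correlation. As a purely illustrative aside, the sketch below computes Spearman's rank correlation for invented tweet and citation counts; the data are hypothetical, and studies such as Haustein et al. (2014b) of course work with far larger samples:

```python
def rank(values):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with values[order[i]].
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        average_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = average_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: the Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article counts (invented for illustration).
tweets = [120, 3, 0, 45, 7, 1, 0, 15]
citations = [4, 12, 9, 2, 30, 6, 11, 5]
print(round(spearman(tweets, citations), 2))  # -0.59
```

A coefficient near zero (or, as here, negative) is what "very low correlation" means in practice: attention on Twitter does not track later citations.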
The open-access movement, widespread use of social media, availability of free
content, and new forms of scholarly output such as datasets led to the development
of new alternative metrics to analyze research. The tools and the data that can be col-
lected using them are grouped under the collective term “altmetrics.”
14.2 Measuring attention
The term “altmetrics” is an abbreviation of the phrase “alternative metrics”
(Galligan and Dyas-Correia, 2013). This new field and its supporters have the am-
bitious goal of providing an alternative to or an enhancement of traditional citation
metrics by measuring scholarly interactions taking place mainly in the social me-
dia (Galloway et al., 2013; Hoffmann et al., 2014; Kwok, 2013; Piwowar, 2013;
Piwowar and Priem, 2013; Sud and Thelwall, 2014; Taylor, 2012; Viney, 2013;
Wang et al., 2013; Wilson, 2013). These interactions may take the form of article
views, downloads, and tweets. They could also be collaborative annotations using
such tools as social bookmarking and reference managers and comments on blog
posts. Altmetrics companies obtain data from many different sources and gather
metrics for such digital artifacts as articles, blog posts, book chapters, books, cases,
clinical trials, conference papers, datasets, figures, grants, interviews, letters, media,
patents, posters, presentations, source code, theses/dissertations, videos, and
even web pages.
Plum Analytics (Plum Analytics, 2014a,b), a major player in the field of altmetrics,
separates collected data into the following five categories:
● Usage (e.g. downloads, views, book holdings, ILL, and document delivery).
● Captures (favorites, bookmarks, saves, readers, and groups).
● Mentions (citations from blog posts, news stories, Wikipedia articles, comments, and
reviews).
● Social media (tweets, +1’s, likes, shares, and ratings).
● Citations (retrieved from publicly available sources such as PubMed, Scopus, and patent
databases).
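As a purely illustrative sketch, this five-way grouping can be expressed as a lookup table. The category names follow the list above; the event names and the helper functions are hypothetical and not part of any Plum Analytics product:

```python
# Hypothetical event-type vocabulary grouped into Plum Analytics'
# five categories (category names from the text; event names invented).
CATEGORIES = {
    "usage":        {"download", "view", "book_holding", "ill", "document_delivery"},
    "captures":     {"favorite", "bookmark", "save", "reader", "group"},
    "mentions":     {"blog_post", "news_story", "wikipedia_link", "comment", "review"},
    "social_media": {"tweet", "plus_one", "like", "share", "rating"},
    "citations":    {"pubmed_citation", "scopus_citation", "patent_citation"},
}

def categorize(event_type):
    """Return the category an event type belongs to, or 'unknown'."""
    for category, events in CATEGORIES.items():
        if event_type in events:
            return category
    return "unknown"

def tally(events):
    """Count recorded events per category, e.g. for a single article."""
    counts = {category: 0 for category in CATEGORIES}
    for event_type in events:
        category = categorize(event_type)
        if category != "unknown":
            counts[category] += 1
    return counts
```

Keeping the raw counts separated by category, rather than collapsing them into one number, is what lets a reader distinguish a much-downloaded article from a much-discussed one.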
14.3 Altmetrics companies, applications, and tools
Companies and organizations involved in altmetrics collect data from many sources,
including social media outlets (Baynes, 2012; Brown, 2014; Buschman and Michalek,
2013; Cheung, 2013; Haustein et al., 2014a; Konkiel, 2013; Kwok, 2013; NISO, 2014;
Piwowar, 2013; Piwowar and Priem, 2013; Sud and Thelwall, 2014; Thelwall et al.,
2013; Wang et al., 2013; Wilson, 2013). While most altmetrics companies are us-
ing similar metrics, categories, and sources of data to those shown above for Plum
Analytics, there are also some differences in the approaches used by these companies.
This section provides information about the applications and tools currently used in
this field and the major players in it.
14.3.1 Academic Analytics
www.academicanalytics.com
Academic Analytics provides business intelligence data and solutions for research
universities in the United States and the United Kingdom, allowing them to compare
themselves with other academic institutions and to better understand their strengths
and the areas in which they need to improve.
14.3.2 altmetric.com
altmetric.com is a fee-based service that provides altmetrics tools to publishers, institutions,
and researchers (Figure 14.1). Publishers subscribing to this service display
Article-Level Metrics (ALMs), which draw more visitors to their sites (Huggett and
Taylor, 2014).
The company gathers data about mentions of academic papers on social media sites
(e.g. Twitter, Facebook, Pinterest, and Google+), science blogs, mainstream media
outlets such as The New York Times and The Guardian, non-English-language publications
like Die Zeit and Le Monde, the peer-review platform Publons, and information
about references saved in collaborative reference managers.
altmetric.com weights sources differently in its data analysis. For example, news items
carry more weight than blog posts, and blog posts carry more weight than tweets. The
algorithm also takes into account how authoritative the authors are. Results are presented
visually with a donut that shows the proportional distribution of mentions by source
type, with each source type displaying a different color—blue (for Twitter), yellow
(for blogs), and red (for mainstream media sources). Elsevier displays the altmetric.
com donuts for a journal’s three top-rated articles on the Elsevier.com homepages of
many Elsevier titles. The donut also provides links to news and social media mentions.
Figure 14.1 The home page of altmetric.com.
Reproduced with permission from altmetric.com.
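The weighting described above can be sketched as a simple scoring function. The weights and the authority multiplier below are invented for illustration; altmetric.com's actual algorithm and its values are proprietary:

```python
# Hypothetical source weights: news outranks blogs, blogs outrank tweets,
# echoing the ranking described in the text (the values are invented).
SOURCE_WEIGHTS = {"news": 8.0, "blog": 5.0, "tweet": 1.0}

def attention_score(mentions):
    """Sum weighted mentions; each mention is a (source_type, authority)
    pair, where authority is a multiplier around 1.0 reflecting how
    authoritative the mentioning author is."""
    score = 0.0
    for source_type, authority in mentions:
        # Unrecognized source types get a small default weight.
        score += SOURCE_WEIGHTS.get(source_type, 0.5) * authority
    return score

mentions = [("news", 1.0), ("blog", 1.2), ("tweet", 1.0), ("tweet", 0.5)]
print(attention_score(mentions))  # 8.0 + 6.0 + 1.0 + 0.5 = 15.5
```

The same structure, with per-source subtotals retained, would also drive the donut: each source type's share of the mentions determines its colored segment.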
A free Bookmarklet that can be installed in Firefox, Chrome, and Safari allows users
to see details of the use of a particular article (Figure 14.2).
14.3.3 Article-level Metrics (ALMs)
http://article-level-metrics.plos.org
ALMs (Figure 14.3) is a new approach introduced by PLOS to measure the impact
of published research at the article level. It looks at an individual article’s impact and
Figure 14.2 The altmetric.com Bookmarklet (www.altmetric.com/bookmarklet.php).
Reproduced with permission from altmetric.com.
Figure 14.3 PLOS Article-level Metrics website.
separates it from the impact of the journal it was published in. ALMs incorporates
altmetrics sources along with traditional measures to present a bigger picture of how
individual articles are being discussed and used.
14.3.4 CiteULike
www.citeulike.org
CiteULike is a free collaborative bibliographic management program that allows users
to store, organize, share, and discover scholarly references.
14.3.5 Impactstory
https://impactstory.org
Impactstory (Figure 14.4) is an open-source, web-based tool focused on researchers.
It tracks journal articles, preprints, datasets, presentation slides, research
code, and other research outputs. Impactstory is known to aggregate data from
Mendeley, GitHub, and Twitter, but it does not disclose all of its sources. Impactstory
was founded by Heather Piwowar and Jason Priem, pioneers in altmetrics and also
prolific writers in the area of new publishing models and alternative approaches for
evaluating research (Piwowar, 2013; Piwowar and Vision, 2013; Piwowar et al., 2011).
14.3.6 InCites
http://researchanalytics.thomsonreuters.com/incites/
InCites from Thomson Reuters uses bibliographic and citation data from the
Web of Science to provide tools and solutions for assessing scholarly output.
InCites measures and benchmarks the research performance of individuals, organi-
zations, programs, and peers.
Figure 14.4 The home page of Impactstory (https://impactstory.org).
Reproduced with permission from Impactstory.
Figure 14.5 Plum Analytics home page.
Reproduced with the permission of Plum Analytics.
14.3.7 Mendeley
www.mendeley.com
Mendeley is a free collaborative bibliographic management tool owned by Elsevier.
When searching Scopus, users can see the demographics, disciplines, and geographic
locations of people who have saved a particular article in Mendeley (Habib, 2014).
This kind of crowdsourcing might be of interest to many users, but there are some
researchers (especially those working in competitive fields) who are reluctant to use a
bibliographic management program that monitors their reading habits.
14.3.8 PaperCritic
www.papercritic.com
PaperCritic allows researchers to get feedback about their articles. Tags, summaries,
and in-text notes from an individual researcher’s Mendeley library are available to
PaperCritic users if they want to rate and review any publication.
14.3.9 Plum Analytics
www.plumanalytics.com; https://plu.mx
Plum Analytics (Plum Analytics, 2014a) (Figure 14.5) is a major player in the field
of altmetrics. Recently acquired by the aggregator EBSCO, it is a commercial service
targeting mainly libraries.
While most of the altmetrics companies do not disclose their sources of data,
Plum Analytics makes an exception by publishing the full list of metrics it supports
(Figure 14.6).
In June 2013, the Alfred P. Sloan Foundation awarded the National Information
Standards Organization (NISO) a grant “to undertake a two-phase initiative to explore,
identify, and advance standards and/or best practices related to a new suite of potential
metrics in the community” (NISO, 2015). The project has the goal of developing and
promoting new assessment metrics, such as usage-based metrics and social media
statistics.
14.4 Altmetrics and data provenance
If you have posted a work on a publicly available website, you may get regular
updates on how many times it has been viewed and downloaded. These reports
come from different places: the original site where your work has been published,
altmetrics companies monitoring such outlets, or academic social websites, such as
academia.edu and ResearchGate, where you have posted your works.
I had not been paying much attention to such reports until I suddenly noticed that a
report sent to me by an altmetrics company for a tutorial I had posted on a social
website showed around 1000 more views than a report for the same work that the
original company had sent me the day before. The altmetrics company confirmed that
its statistics were correct and matched the counts shown on the public site of the
company where the original work was posted.
Figure 14.6 Screen capture of Plum Analytics' list of sources. The full table is available at
www.plumanalytics.com/metrics.html.
Reproduced with the permission of Plum Analytics.
Why would a company send authors results that were significantly lower than what
the same company displays publicly? An FAQ page of the company where my tutorial
was posted offered a possible explanation for this discrepancy. It turned out that new
analytics had been used to correct the total counts by excluding views from bots
(search engine crawlers) that had previously been included. These analytics, I was
told by the company, had not yet been applied to the public site. This explains the
difference between the results sent to authors (which had been normalized for bots)
and the counts displayed to the general public and collected by the altmetrics
companies. There is also no way to know who is viewing or downloading the works:
whether these are people who are genuinely interested in them, or the authors themselves,
repeatedly viewing and downloading their own works to increase their counts.
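The bot correction the company described can be sketched as a filter over recorded views. The user-agent patterns and the log format here are illustrative, not those of any particular provider:

```python
import re

# Common crawler markers; real providers maintain much longer lists.
BOT_PATTERN = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

def corrected_view_count(view_log):
    """Count views whose user-agent string does not look like a crawler.

    view_log: list of user-agent strings, one entry per recorded view.
    """
    return sum(1 for user_agent in view_log
               if not BOT_PATTERN.search(user_agent))

views = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Googlebot/2.1 (+http://www.google.com/bot.html)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)",
    "Sogou web spider/4.0",
]
print(corrected_view_count(views))  # 2: the Googlebot and spider views are dropped
```

Applying such a filter to one report but not another would produce exactly the kind of discrepancy described above.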
14.5 Conclusion
While it takes years for traditional citations to accrue, the number of downloads,
views, and mentions in social media is reported in a matter of days, even hours (Wang
et al., 2013). Researchers engaging in social networks now rely significantly on rec-
ommendations from their peers about newly published articles. They can see what
their peers are finding, saving, and bookmarking. There is a general consensus that
having an article mentioned in social media could lead to future citations.
With budgets tightening and funding sources becoming more limited, scientific
research is becoming very competitive. When applying for grants, researchers need
to show that their research will have an impact. They can demonstrate such potential
impact for their past work, but not for recently published papers that might be the most
relevant to the grant proposal. If researchers could show that their recent work is gen-
erating a lot of interest, this could give them an advantage in getting funded.
Academic libraries are now looking at this new field as an opportunity to play a
more prominent role in their organizations (Galloway et al., 2013). They organize
seminars, invite representatives from altmetrics companies, and engage in different
initiatives aimed at evaluating opportunities for adopting alternative metrics and at
the same time educating researchers about this new field (Brown, 2014; Corrall et al.,
2013; Lapinski et al., 2013; Wilson, 2013). Professional organizations such as the
American Chemical Society devote technical sessions and panel discussions to al-
ternative metrics, thus allowing subject librarians and researchers to get acquainted
with the field. The altmetrics companies targeting academic institutions require paid
subscriptions, and librarians are in a good position to get involved in the selection,
introduction, and promotion of such services.
There are significant differences between disciplines in their use of social media,
which affects whether alternative metrics can be applied in certain fields (Haustein
et al., 2014a; Holmberg and Thelwall, 2014; Liu et al., 2013; Zahedi et al., 2014).
Scholars in the humanities and the social sciences are very interested in altmetrics.
The research output in these disciplines is mainly in the form of books and book chap-
ters, and traditional citation analysis, which has been applied most often to disciplines
where journal articles have been the main research product, does not serve them well
when they are considered for promotion.
It is difficult to predict how the field of altmetrics will develop in the near future,
but one thing is certain: its acceptance will be much slower in disciplines where
citations in peer-reviewed journals with high IFs are the major criteria for researchers'
promotion. Attempts have been made in recent years to find a correlation between
citations and counts of downloads, bookmarks, and tweets, but there is no definitive
conclusion as to whether such a correlation exists (Baynes, 2012; Zahedi et al., 2014).
Data provenance and the interpretation of collected data will be the most important
and challenging issues confronting altmetrics companies in the future. Counts mean
nothing unless they can be interpreted, and altmetrics companies need to explain what the
value of their services is. Although altmetrics offers an interesting insight into how schol-
arly output attracts attention, it will complement rather than replace traditional methods
such as citations in peer-reviewed journals (Brody, 2013; Brown, 2014; Cheung, 2013).
References
Baynes, G., 2012. Scientometrics, bibliometrics, altmetrics: some introductory advice for the
lost and bemused. Insights 25 (3), 311–315. http://dx.doi.org/10.1629/2048-7754.25.3.311.
Brody, S., 2013. Impact factor: imperfect but not yet replaceable. Scientometrics 96 (1), 255–257.
http://dx.doi.org/10.1007/s11192-012-0863-x.
Brown, M., 2014. Is altmetrics an acceptable replacement for citation counts and the impact
factor? Ser. Libr. 67 (1), 27–30. http://dx.doi.org/10.1080/0361526X.2014.915609.
Buschman, M., Michalek, A., 2013. Are alternative metrics still alternative? Retrieved September
3, 2014, from http://www.asis.org/Bulletin/Apr-13/AprMay13_Buschman_Michalek.html.
Cheung, M.K., 2013. Altmetrics: too soon for use in assessment. Nature 494 (7436), 176. http://
dx.doi.org/10.1038/494176d.
Corrall, S., Kennan, M.A., Afzal, W., 2013. Bibliometrics and research data management ser-
vices: emerging trends in library support for research. Libr. Trends 61 (3), 636–674.
Galligan, F., Dyas-Correia, S., 2013. Altmetrics: rethinking the way we measure. Ser. Rev. 39
(1), 56–61. http://dx.doi.org/10.1080/00987913.2013.10765486.
Galloway, L.M., Pease, J.L., Rauh, A.E., 2013. Introduction to altmetrics for science, technol-
ogy, engineering, and mathematics (STEM) librarians. Sci. Technol. Libr. 32 (4), 335–345.
Habib, M., 2014. Mendeley Readership Statistics available in Scopus. Retrieved from http://
blog.scopus.com/posts/mendeley-readership-statistics-available-in-scopus.
Haustein, S., Peters, I., Bar-Ilan, J., Priem, J., Shema, H., Terliesner, J., 2014a. Coverage and
adoption of altmetrics sources in the bibliometric community. Scientometrics 101 (2),
1145–1163. http://dx.doi.org/10.1007/s11192-013-1221-3.
Haustein, S., Peters, I., Sugimoto, C.R., Thelwall, M., Larivière, V., 2014b. Tweeting biomed-
icine: an analysis of tweets and citations in the biomedical literature. J. Assoc. Inf. Sci.
Technol. 65 (4), 656–669. http://dx.doi.org/10.1002/asi.23101.
Hoffmann, C.P., Lutz, C., Meckel, M., 2014. Impact factor 2.0: applying social network analy-
sis to scientific impact assessment. Paper presented at the 2014 47th Hawaii International
Conference on System Sciences (HICSS), Waikoloa, HI.
Holmberg, K., Thelwall, M., 2014. Disciplinary differences in Twitter scholarly communica-
tion. Scientometrics, 101 (2), 1027–42. http://dx.doi.org/10.1007/s11192-014-1229-3.
Huggett, S., Taylor, M., 2014. Elsevier expands metrics perspectives with launch of new altmetrics
pilots, Editors' Update. March 3, 2014. Retrieved June 26, 2014, from http://editorsupdate.elsevier.com/issue-42-march-2014/elsevier-altmetric-pilots-offer-new-insights-article-impact/.
Konkiel, S., 2013. Altmetrics: a 21st-century solution to determining research quality. Online
Searcher 37 (4), 10–15.
Kraker, P., 2014. All metrics are wrong, but some are useful. Retrieved June 26, 2014, from
http://science.okfn.org/tag/altmetrics.
Kwok, R., 2013. Research impact: altmetrics make their mark. Nature 500 (7463), 491–493.
http://dx.doi.org/10.1038/nj7463-491a.
Lapinski, S., Piwowar, H., Priem, J., 2013. Riding the crest of the altmetrics wave: how librari-
ans can help prepare faculty for the next generation of research impact metrics. Coll. Res.
Libr. News 74 (6), 292–294.
Liu, C.L., Xu, Y.Q., Wu, H., Chen, S.S., Guo, J.J., 2013. Correlation and interaction visualiza-
tion of altmetric indicators extracted from scholarly social network activities: dimensions
and structure. J. Med. Internet Res. 15 (11), e259.
NISO, 2014. Alternative Metrics Initiative. Retrieved September 1, 2014, from http://www.niso.
org/topics/tl/altmetrics_initiative/#resources.
NISO, 2015. NISO Alternative Metrics (Altmetrics) Initiative. Retrieved January 8, 2015, from
http://www.niso.org/topics/tl/altmetrics_initiative.
Osterrieder, A., 2013. The value and use of social media as communication tool in the plant
sciences. Plant Methods 9, 26. http://dx.doi.org/10.1186/1746-4811-9-26.
Piwowar, H., 2013. Altmetrics: value all research products. Nature 493 (7431), 159. http://
dx.doi.org/10.1038/493159a.
Piwowar, H., Priem, J., 2013. The power of altmetrics on a CV. Retrieved September, 2014,
from https://asis.org/Bulletin/Apr-13/AprMay13_Piwowar_Priem.html.
Piwowar, H.A., Vision, T.J., 2013. Data reuse and the open data citation advantage. PeerJ 1,
e175. http://dx.doi.org/10.7717/peerj.175.
Piwowar, H.A., Vision, T.J., Whitlock, M.C., 2011. Data archiving is a good investment. Nature
473 (7347), 285. http://dx.doi.org/10.1038/473285a.
Plum Analytics, 2014a. Plum Analytics: Measuring impact. Retrieved from http://www.
plumanalytics.com/.
Plum Analytics, 2014b. Plum Analytics: Metrics. Retrieved from http://www.plumanalytics.
com/metrics.html.
Sud, P., Thelwall, M., 2014. Evaluating altmetrics. Scientometrics 98 (2), 1131–1143.
http://dx.doi.org/10.1007/s11192-013-1117-2.
Taylor, M., 2012. The new scholarly universe: are we there yet? Insights 25 (1), 12–17. http://
dx.doi.org/10.1629/2048-7754.25.1.12.
Thelwall, M., Haustein, S., Larivière, V., Sugimoto, C.R., 2013. Do altmetrics work? Twitter
and ten other social web services. PLoS One 8 (5), 7. http://dx.doi.org/10.1371/journal.
pone.0064841.
Viney, I., 2013. Altmetrics: research council responds. Nature 494 (7436), 176. http://dx.doi.
org/10.1038/494176c.
Wang, X., Wang, Z., Xu, S., 2013. Tracing scientist's research trends realtimely. Scientometrics
95 (2), 717–729. http://dx.doi.org/10.1007/s11192-012-0884-5.
Wilson, V., 2013. Research methods: altmetrics. Evid. Based Libr. Inf. Pract. 8 (1), 126–128.
Zahedi, Z., Costas, R., Wouters, P., 2014. How well developed are altmetrics? A cross-
disciplinary analysis of the presence of 'alternative metrics' in scientific publications.
Scientometrics 101 (2), 1491–1513. http://dx.doi.org/10.1007/s11192-014-1264-0.