Contents lists available at ScienceDirect
Social Science & Medicine
journal homepage: www.elsevier.com/locate/socscimed
Review article
Systematic Literature Review on the Spread of Health-related Misinformation on Social Media
Yuxi Wang a,∗, Martin McKee b, Aleksandra Torbica a, David Stuckler c
a Centre for Research on Health and Social Care, Department of Social and Political Science, Bocconi University, Italy
b London School of Hygiene and Tropical Medicine, United Kingdom
c Department of Social and Political Science, Bocconi University, Italy
ARTICLE INFO
Keywords:
Misinformation
Fake news
Health
Social media
ABSTRACT
Contemporary commentators describe the current period as "an era of fake news" in which misinformation, generated intentionally or unintentionally, spreads rapidly. Although affecting all areas of life, it poses particular problems in the health arena, where it can delay or prevent effective care, in some cases threatening the lives of individuals. While examples of the rapid spread of misinformation date back to the earliest days of scientific medicine, the internet, by allowing instantaneous communication and powerful amplification, has brought about a quantum change. In democracies where ideas compete in the marketplace for attention, accurate scientific information, which may be difficult to comprehend and even dull, is easily crowded out by sensationalized news. In order to uncover the current evidence and better understand the mechanism of misinformation spread, we report a systematic review of the nature and potential drivers of health-related misinformation. We searched PubMed, Cochrane, Web of Science, Scopus and Google databases to identify relevant methodological and empirical articles published between 2012 and 2018. A total of 57 articles were included for full-text analysis. Overall, we observe an increasing trend in published articles on health-related misinformation and the role of social media in its propagation. The most extensively studied topics involving misinformation relate to vaccination, Ebola and Zika virus, although others, such as nutrition, cancer, fluoridation of water and smoking also featured. Studies adopted theoretical frameworks from psychology and network science, while co-citation analysis revealed potential for greater collaboration across fields. Most studies employed content analysis, social network analysis or experiments, drawing on disparate disciplinary paradigms. Future research should examine the susceptibility of different sociodemographic groups to misinformation and understand the role of belief systems on the intention to spread misinformation. Further interdisciplinary research is also warranted to identify effective and tailored interventions to counter the spread of health-related misinformation online.
1. Introduction
The spread of misinformation is not new, dating back at least to the early days of printing. Even the term "fake news", which has achieved considerable contemporary prominence, was first coined in 1925, when an article in Harper's Magazine, entitled "Fake News and the Public", mourned how newswires were allowing misinformation to disseminate rapidly (McKernon, 1925). The growth of the Internet has, however, initiated a fundamental change. In 2013, the World Economic Forum warned that potential "digital wildfires" could cause the viral spread of intentionally or unintentionally misleading information (World Economic Forum, 2013). In the health arena, much concern has focused on the spread of misinformation on immunisation, with social media acting as a powerful catalyst for the "anti-vaxxer" movement. By encouraging individuals not to vaccinate their children, this movement has been linked to recent measles outbreaks in countries such as the UK, the US, Germany and Italy (Datta et al., 2017; Filia et al., 2017). The prevalence and persistence of such misinformation justifies a careful and systematic review of published literature on the nature and the mechanisms by which misinformation spreads.
1.1. Defining terminology: what is misinformation?
We first review the distinctions between various terms that relate to misinformation. Following the 2016 US presidential election, the term "fake news" attracted substantial media and scholarly attention. The term overlaps with other forms of misleading information, and especially misinformation and disinformation, all conveying messages,
https://doi.org/10.1016/j.socscimed.2019.112552
Received 21 January 2019; Received in revised form 29 August 2019; Accepted 12 September 2019
∗ Corresponding author. Department of Social and Political Science, Bocconi University, Via Guglielmo Röntgen 1, 20136 Milan, MI, Italy.
E-mail address: yuxi.wang@phd.unibocconi.it (Y. Wang).
Social Science & Medicine 240 (2019) 112552
Available online 18 September 2019
0277-9536/ © 2019 The Authors. Published by Elsevier Ltd. This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/BY-NC-ND/4.0/).
stories, theories, or opinions that spread rapidly through social contacts or online media. They differ primarily with respect to intent and mode of spread. Misinformation involves information that is inadvertently false and is shared without intent to cause harm, while disinformation involves false information knowingly being created and shared to cause harm (Wardle and Derakhshan, 2017). Although "fake news" is the term that received most popular attention, it is arguably the most problematic one in terms of definitional rigour. Lazer et al. (2018) described it as fabricated information that mimics news media content, but this does not capture the complexity of the phenomenon, which can include both satire and information created deliberately to mislead as a means to achieve a political or other goal (Wardle, 2017). A recent report by a parliamentary committee in the UK concluded that "The term 'fake news' is bandied around with no clear idea of what it means, or agreed definition. The term has taken on a variety of meanings, including a description of any statement that is not liked or agreed with by the reader. We recommend that the Government rejects the term 'fake news', and instead puts forward an agreed definition of the words 'misinformation' and 'disinformation'" (House of Commons, 2019). Since the phrase has also been politicized by powerful figures to discredit certain news media (Vosoughi et al., 2018), we refrain from using the term "fake news" throughout the paper.
While noting these distinctions, in practice it often seems difficult to differentiate these categories because of the problem in ascertaining intent. For example, anti-vaccine propaganda may be spread by those who have a genuine concern, however misguided, about safety, and by those who are using the issue as a tool to undermine trust in particular governments. Thus, unless the intent is clear, we use the term misinformation as an umbrella term to include all forms of false information related to health, thereby giving those generating it the benefit of the doubt.
1.2. Misinformation spread from micro- to macro-level
Before discussing the macro-phenomenon of misinformation spread, we first conceptualize the potential mechanism following Wardle and Derakhshan (2017). Three major components are involved in the creation, production, distribution and re-production of misinformation: agent, message and interpreter (Wardle and Derakhshan, 2017). Our review will look at whether and how existing literature from different disciplines examines the type of actor behind the creation of health-related messages on social media platforms, the descriptive features of the message – the durability and distribution of accurate and misleading information – and, most importantly, the interpreter's response and how it contributes to the reproduction of misinformation. At the micro-level, individuals who receive misinformation form judgements about the believability of the message, depending on information source, narrative and context, while the tendency to spread depends on the degree to which receivers suspect such misinformation (Karlova and Fisher, 2013). At the macro-level, we observe patterns of misinformation cascade and characteristics of networks.
Early literature on the spread of rumours (circulating stories or reports of uncertain or doubtful truth) identified the "basic law of rumour": the amount of rumour in circulation will vary with the importance of the subject to the individuals concerned times the ambiguity of the evidence pertaining to the topic in question (Allport and Postman, 1947). The link between psychological and cultural dimensions generated intriguing questions on what makes misinformation so easy to spread and so hard to debunk.
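Allport and Postman's "basic law of rumour" is commonly rendered as a multiplicative relation (a standard schematic formulation, not an equation appearing in this review):

```latex
R \;\propto\; i \times a
```

where $R$ is the amount of rumour in circulation, $i$ the importance of the subject to the individuals concerned, and $a$ the ambiguity of the evidence. The multiplicative form implies that rumour dies out when either factor approaches zero: an unimportant topic, or one with unambiguous evidence, generates little rumour.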
According to Allport and Postman (1947), the ambiguity of the message may be due to the receipt of conflicting stories, with no one more credible than another. The concept of credibility, as investigated extensively in communications research, encompasses message credibility, source credibility, and media credibility (Metzger et al., 2003). With traditional media, each aspect of information credibility is relatively well understood, although even there some caution is needed. In contrast, with social media, it is particularly challenging to assess source credibility, as users themselves are self-publishers, subject to no form of factual verification or accountability. We do know that people regard information from the internet as being as credible as that from conventional media such as television and radio, but not as credible as that from newspapers (Johnson and Kaye, 1998; Kim and Johnson, 2009). Many studies have thus analysed the credibility of user-generated content and the cognitive processes involved in the decision to spread online information on social and political events (Abbasi and Liu, 2013; Castillo et al., 2011; Lupia, 2013; Swire et al., 2017). This research has highlighted the importance of source credibility and persuasiveness as factors affecting the susceptibility of users to the messages conveyed.
Other relevant studies have focused on important concepts such as misperception and confirmation bias, whereby people's views on factual matters are strongly influenced by prior beliefs (Taber and Lodge, 2006; Nyhan and Reifler, 2010; Jerit and Barabas, 2012); polarization within networks (Lewandowsky et al., 2012); and the combined effects of these phenomena facilitated by social media (Del Vicario et al., 2016; Boutyline and Willer, 2017; Shao et al., 2018). While much of the existing literature has examined social and political issues, we focus on misinformation related to health and wellbeing.
1.3. Misinformation and health: gaps in the evidence base
There is limited understanding of why certain individuals, societies and institutions are more vulnerable to misinformation about health. This is perhaps surprising, as health promotion and public health researchers now pay considerable attention to the potential of the internet as a tool to diffuse health-related information (Chew and Eysenbach, 2010; Ritterband and Tate, 2009; Murray et al., 2009; Scanfeld et al., 2010; Signorini et al., 2011), employing smart phones and other mobile technologies in preventative interventions (Abroms et al., 2013; Eng and Lee, 2013; Free et al., 2013; Steinhubl et al., 2015). Although the internet provides immense opportunities, it also lowers the cost of generating and disseminating information, allowing misinformation and sensationalized stories to propagate. What was once spread locally can rapidly become global, with ideas no longer confined or delayed by geography. This has generated a series of studies of information diffusion (Serrano et al., 2015), rumour spread (He et al., 2015), and consequent behavioural changes (Salathé and Khandelwal, 2011; Wakamiya et al., 2016). These generally employ sophisticated modelling and simulation techniques to identify the rumour propagation dynamics. However, this field is still in its infancy, and one recent systematic review of behavioural change models found that most papers investigating the spread of health-related information and behavioural changes are theoretical, failing to use real-life social media data (Verelst et al., 2016). The literature on misinformation spread is growing, but spans disparate disciplines, including communication studies, epidemiology, psychology, and computational science. We contend that it is now necessary to integrate the different perspectives and methodologies, to understand the characteristics of susceptible populations and to devise interventions that are most effective in countering this spread.
To address this gap and provide a comprehensive view of the available evidence, we undertake what is, to our knowledge, the first systematic review of studies that investigated health-related misinformation content on social media and how it spreads online. We include papers stemming from different disciplines and we analyse them on different dimensions.

First, we identify the main health-related topics where misinformation tends to spread and the descriptive features of misinformation. By focusing on the content and the spread of different health-related misinformation, we reveal a broad landscape of issues that attract actors to espouse misleading claims. The findings shed light on the extent to which different topics are identified and investigated in the literature. This approach can inform those working in these areas.
This seeks to inform social scientists, psychologists, and experts in other fields working to understand this issue, who may otherwise overlook the range of theories that underpin the work of researchers seeking to conceptualize the spread of misinformation. As this is a phenomenon that can be examined from many different perspectives, we have undertaken a co-citation analysis to assess the extent to which different disciplinary paradigms are informing each other, thereby facilitating future interdisciplinary research that can contribute to a more inclusive theoretical framework.

We then explore the existing theories used to explain the phenomenon and undertake a co-citation analysis to ascertain the extent to which ideas spread among disciplinary communities.

We further discuss the different empirical strategies adopted in the analysis. In doing so, we identify the social media platforms where the authors obtain the empirical data, how they incorporate different statistical models to interpret the data, and the empirical progress in our understanding of the mechanism. We conclude by examining the potential for future interdisciplinary research and practical interventions to counter misinformation spread.
2. Methods
2.1. Design and search strategy
Our reporting strategy follows the PRISMA guidelines (Moher et al.,
2009). We searched PubMed, Cochrane, Web of Science (WoS) and
Scopus for records published between January 2012 and November
2018, using the following search terms in title and abstract:
(i) [misinformation OR fake news OR disinformation OR rumo* OR
false OR mislead*]
AND
(ii) [online OR social OR media OR news OR twitter OR Facebook OR
google]
AND
(iii) [spread OR propagate* OR disseminat* OR circulat* OR communicat* OR diffuse OR broadcast]
AND
(iv) [health OR disease OR infectious OR virus OR vaccin* OR Ebola
OR Zika OR measles]
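Programmatically, the four AND-ed term groups above can be assembled into a single boolean query string. A minimal sketch (generic parenthesised syntax; the exact field tags and wildcard handling differ across PubMed, WoS and Scopus):

```python
# Assemble the review's four AND-ed keyword groups into one boolean
# query string (generic syntax; exact field tags differ by database).
groups = [
    ["misinformation", "fake news", "disinformation", "rumo*", "false", "mislead*"],
    ["online", "social", "media", "news", "twitter", "Facebook", "google"],
    ["spread", "propagate*", "disseminat*", "circulat*", "communicat*", "diffuse", "broadcast"],
    ["health", "disease", "infectious", "virus", "vaccin*", "Ebola", "Zika", "measles"],
]

def build_query(term_groups):
    """OR the terms within each group, then AND the groups together."""
    def quote(term):
        # Multi-word terms must be quoted as phrases.
        return f'"{term}"' if " " in term else term
    ors = ["(" + " OR ".join(quote(t) for t in group) + ")" for group in term_groups]
    return " AND ".join(ors)

query = build_query(groups)
print(query)
```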
This yielded 206 records from PubMed, 33 records from Cochrane, 341 records from Web of Science, 51 records from Scopus and 62 records from Google (Fig. 1). We identified and removed duplicates, which resulted in 651 records that were first screened based on title, abstract, and keywords and then using full-text where necessary. All eligible references were uploaded into reference management software (Mendeley) for assessment of eligibility.
2.2. Screening and eligibility assessment
Next, we screened the results of the 651 records based on title and abstract. Articles that were not original, not involving social media, not related to health, not in English or not on human subjects were excluded. At this, and the subsequent stage, we also excluded the very extensive literature on individual cognitive biases, which would be well beyond the scope of a single review. Similarly, we excluded research on static group decision-making, which can create misinformation (e.g. the phenomenon termed "groupthink") that subsequently spreads.
This left 131 potentially eligible papers, which were subject to full-text analysis, applying the following pre-specified eligibility criteria:

Misinformation. Only records that concern misinformation, disinformation, fake news, rumour or any form of information disorder were included.

Social media. Misinformation had to be propagated through online media.

Health. Only records related to disease, treatments, public health and wellbeing were included.

Model or empirical. Modelling (e.g. epidemiological, rumour spread) studies or empirical analyses of the distribution or the dynamic effect of misinformation.

Humans. We are interested in humans and the behaviour of humans, and therefore excluded studies about animals and plants.

Original research. We excluded review articles and editorials.

Language. We excluded articles written in languages other than English.

Finally, we excluded papers that lacked analytic rigour or did not incorporate misinformation as the main component of the analysis, which resulted in 57 articles. The PRISMA flow diagram (Fig. 1) shows the results of these exclusions.
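Applied mechanically, these criteria amount to a conjunction of boolean checks per record. A hypothetical sketch (the record fields are illustrative stand-ins, not the review's actual extraction sheet):

```python
# Hypothetical screening records; the boolean fields are illustrative
# stand-ins for the review's seven eligibility criteria.
records = [
    {"id": 1, "misinformation": True, "social_media": True, "health": True,
     "model_or_empirical": True, "humans": True, "original": True, "english": True},
    {"id": 2, "misinformation": True, "social_media": False, "health": True,
     "model_or_empirical": True, "humans": True, "original": True, "english": True},
]

CRITERIA = ("misinformation", "social_media", "health",
            "model_or_empirical", "humans", "original", "english")

def eligible(record):
    """Retain a record only if it satisfies every criterion."""
    return all(record[c] for c in CRITERIA)

included = [r["id"] for r in records if eligible(r)]
print(included)  # [1]
```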
2.3. Data extraction
For the 57 included studies, we analysed the following elements in the full-text: (i) health-related issues and findings; (ii) theoretical framework (if any) and disciplines; (iii) study design.
2.4. Co-citation analysis
To gain further insights on the disciplines contributing to this growing area of research, we conducted a co-citation analysis of eligible articles to measure the frequency with which two sources are cited together by other documents. Co-citation analysis yields insight into potential disciplinary siloes and theoretical or methodological gaps in the literature. This was possible for 121 of the papers, because 10 articles were not indexed on Scopus, from which we extracted the citation data.
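Co-citation counting itself is simple: two journals are co-cited whenever both appear in one article's reference list, and the pair's count is the number of such articles. A minimal sketch with toy data (the journal names and lists are illustrative, not the review's actual citation records):

```python
from collections import Counter
from itertools import combinations

# Toy reference lists: each citing article contributes one co-citation
# per unordered pair of distinct journals it cites (illustrative data).
reference_lists = [
    ["Vaccine", "Soc Sci Med", "PLoS ONE"],
    ["Vaccine", "PLoS ONE"],
    ["Soc Sci Med", "Vaccine"],
]

co_citations = Counter()
for refs in reference_lists:
    # Sort so each unordered pair has one canonical key.
    for pair in combinations(sorted(set(refs)), 2):
        co_citations[pair] += 1

# Keep only pairs co-cited at least twice, mirroring the review's
# thresholding (there, journals cited at least 5 times were kept).
edges = {pair: n for pair, n in co_citations.items() if n >= 2}
print(edges)
```

The weighted edges then form the input to a network layout such as the LinLog/modularity normalization used in Fig. 4.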
3. Results
Fig. 2 shows the number of potentially eligible articles by year. Not surprisingly, the number of studies that investigated health-related misinformation increased over the years, from 7 in 2012 to 41 in 2018 (to November), with a sharp rise in 2017. The trend implies growing scholarly interest in the social phenomenon, potentially amplified by major political events in 2016. We excluded certain articles (n = 74) due to their lack of analysis or interpretation of misinformation, as mentioned above, and the remainder of this results section relates only to the 57 papers remaining after full-text analysis.
Key features of the included studies are in the web appendix. We first investigated what health-related topics have been studied in relation to misinformation. The largest category relates to communicable diseases (n = 30), including vaccination in general (8) and specifically against Human Papilloma Virus (HPV), Measles, Mumps and Rubella (MMR) and influenza (3, 2 and 1 respectively), as well as infections with Zika virus (9), Ebola (4), influenza (1), Middle East Respiratory Syndrome (1) and West Nile virus (1). Many articles concern chronic non-communicable diseases such as cancer (3), cardiovascular disease (3), psoriasis (1) and bowel disease (1). Some also address issues of diet and nutrition (3), smoking (3) and water safety or quality (2). Five studies cover a broad range of health-related misinformation or rumour online, while the remaining studies were placed in a miscellaneous category, addressing other specific diseases, health problems or medical interventions (Fig. 3). We now briefly describe each of these in turn.
Fig. 1. PRISMA flow diagram.
Fig. 2. Numbers of potentially eligible articles.
3.1. Health-related issues and findings
3.1.1. Vaccines and communicable diseases
Vaccine uptake, especially in children, has fluctuated in recent decades in many developed countries, with marked declines during certain periods. In 2012, the journal Vaccine devoted a special issue to "The Role of Internet Use in Vaccination", analysing some of the communication strategies used by both the anti-vaccination movement and public health professionals. Authors recommended comprehensive, structured, and easily understandable responses to anti-vaccination messages (Betsch and Sachse, 2012; Kata, 2012; Reyna, 2012; Nicholson and Leask, 2012). Although refusal of vaccination and movements opposing vaccines date back to the time of Jenner, publication of fraudulent research linking the MMR vaccine to autism and bowel disease (Wakefield et al., 1998) was a seminal moment. The concerns raised then, although long since discredited, have been widely disseminated on social media and even now are highly influential among some groups. For instance, Basch et al. (2017), Donzelli et al. (2018) and Porat et al. (2018) report high online prevalence and popularity of autism-related discussions in fora on vaccination. Tustin et al. (2018) and Xu and Guo (2018) also reported widespread misinformation about side effects, as well as mistrust in government or pharmaceutical companies, in discussions on vaccination. Krishna's (2017) study of active propagators of these messages found that those who were knowledge-deficient and vaccine-averse exhibited higher levels of activity than those who were not. Aquino et al. (2017) reported a significant inverse correlation between MMR vaccination coverage and online searches and social network activity on autism and the MMR vaccine. Taken as a whole, the research identifies "anti-vaxxers" and members of online communities favouring conspiracy theories as sources or propagators of misinformation, with discussions tending to revolve around rhetorical and personal arguments that induce negative emotions (fear, anger, sadness). Although there is less misinformation than accurate information, the former has greater popularity among viewers.
The Zika epidemic stimulated considerable activity on Twitter (Wood, 2018) and Facebook (Sharma et al., 2017), as well as the spread of news items (Sommariva et al., 2018), images (Seltzer et al., 2017), and videos (Bora et al., 2018) on a range of media. Conspiracy theories directed at institutions feature frequently in these discussions. For instance, the Zika virus was portrayed as a bioweapon, while rumours spread that the Zika vaccine had been developed to depopulate the earth (Sommariva et al., 2018; Wood, 2018). Conspiracist ideation plays a crucial role in one's belief in misinformation (Lewandowsky et al., 2013). However, Bode and Vraga (2018) did not find that belief in conspiracies reduced receptiveness to correction of misinformation on Zika virus, although this research generated several important insights for the design of interventions to address this issue.

The Ebola outbreak also provided much additional material. For instance, Fung et al. (2016) examined the role of Twitter and Sina Weibo (a Chinese microblog, equivalent to Twitter) in spreading rumours and speculating on treatments. Pathak et al. (2015) found numerous misleading videos online concerning Ebola virus disease. Similar to the studies on vaccination, much of this misinformation comes from individuals who are highly active in influencing opinions, and rumours often garner higher popularity than evidence-based information.
3.1.2. Chronic non-communicable diseases
Though most research on misinformation has focused on infectious disease, misinformation on chronic illnesses such as cancer and cardiovascular disease is not uncommon on social media. Okuhara et al. (2017) looked at online discussions with opposing views on cancer screening in Japan, finding that most propagated anti-cancer-screening messages. Staying in Asia, Chen et al. (2018a,b) examined the nature and diffusion of misinformation on gynaecologic cancer in China. Chua and Banerjee (2018) found that individuals are more likely to trust and share cancer-related rumours if the rumours are dreadful rather than wishful, and if one has had previous personal experience.

Studies on other chronic diseases mostly examine content that speculates on or promotes alternative treatments, for example on diabetes (Leong et al., 2017), heart failure (Chen et al., 2013), hypertension (Kumar et al., 2014) and psoriasis (Qi et al., 2016). Again, misleading videos are more influential. In addition, research by Leong et al. (2017) in India found that diabetes videos tailored to South Asians were more misleading than those not culturally targeted.
3.1.3. Others
Unsubstantiated messages regarding diets and nutrition can have detrimental effects on susceptible individuals. For instance, Syed-Abdul et al. (2013) investigated how anorexia is promoted as fashion and linked to ideas of beauty in YouTube videos, gaining high popularity among young female viewers. Bessi et al. (2015), analysing the diffusion of diet, environment and geopolitics-related misinformation, found that active users are more likely to span a range of categories, and that online groups promoting conspiracy theories tend to exhibit
Fig. 3. Topic categories.
polarization. Similar patterns are observed in discussions on water fluoridation, as memorably invoked in the 1964 movie Dr. Strangelove. Seymour et al. (2015) analysed the anti-fluoride network online and found that strong ties within the community are obstacles to the acceptance of expert opinions. This indicates that social homogeneity may well be the primary driver of content diffusion and clustering. The modelling of rumour spread is therefore informative of the cascades' size and of potential intervention designs for countering such spread.
The tobacco industry has a long history of distorting scientific evidence and misleading consumers. Very recently, Albarracin et al. (2018) showed how misleading portrayals of tobacco's health consequences introduce positivity towards smoking. The advent of electronic cigarettes prompted Harris et al. (2014) to examine content and tweet patterns related to an e-cigarette campaign by a local public health department. The misinformation included arguments that divert attention from the products to messages that sought to discredit authorities.

A few studies have investigated specifically the psychology of individuals who believe and share rumours. Chua and Banerjee (2017), in their analysis of epistemic belief and its effect on the decision to share rumour, showed that epistemologically naïve users have a higher propensity to share online health rumours. Li and Sakamoto (2015) discovered that exposing individuals to measures of collective opinion, through counts of retweets and collective truthfulness ratings, could reduce the tendency to share inaccurate health-related messages. Taken as a whole, the evidence indicates that the motivation to believe and share rumours reflects both individual and collective makings, but the consequences are difficult to predict because of the complex psychological factors involved.
Finally, the group of miscellaneous studies mainly examined specific medical interventions or issues such as drugs (Al Khaja et al., 2018), paediatric disease (Strychowsky et al., 2013), abortion (Bryant et al., 2014), dialysis (Garg et al., 2015), suicide (Li et al., 2018) and multiple sclerosis (Lavorgna et al., 2018). The common sources of misinformation included advertisements or comments related to advertisements (Garg et al., 2015) and patients' anecdotal experiences (Strychowsky et al., 2013). Again, misinformation was more popular than factual messages.
3.2. Theoretical frameworks and disciplines (co-citation analysis)
We next investigated the theoretical foundations of the included studies, but it rapidly became clear that there was no widely agreed approach to this phenomenon, reflecting the broad range of disciplines that have investigated it. The more dominant disciplines and research areas, according to the publishing journals, include public health, health policy and epidemiology (n = 14), health informatics (n = 8), communications studies (n = 5), vaccines (n = 4), cyberpsychology (n = 3) and system sciences (n = 3).
Disciplinary approaches adopted to conceptualize the phenomenon are varied, but primarily fall within the fields of psychology (n = 8) and communication (n = 4), as well as network science (n = 7). While theories in psychology focus on individual-level cognitive responses to misinformation and its corrections, frameworks in network and data science characterise the (online) societal mechanisms involved. For instance, Chua and Banerjee (2018), in investigating online behaviour in the face of health rumours, invoked the seminal rumour theory (Allport and Postman, 1947), which views personal involvement as a common perception that dictates one's decision to spread rumour. Moreover, rumours that are repeatedly circulated can be reinforced and accepted as credible (Rosnow, 1991), and the consequent perceived high credibility can in turn increase the intention to trust and share rumours (Shin et al., 2017). This relates to credibility research, which suggests that perceived credibility can heighten persuasive impact, especially for internet users who are not motivated to process information (Metzger, 2007; Metzger et al., 2010). Similarly, Ozturk et al. (2015) explored how different social media settings can reduce rumour spread, based on rumour psychology research. Others have referred to psychological studies around conspiracist ideation, inoculation theory and social conformity in understanding the mechanism behind health misperception on social media (Bode and Vraga, 2018; Bora et al., 2018; Li and Sakamoto, 2015). By contrast, the use of system or network theories aims at explaining the patterns of social influence, social learning, social contagion, and homophily and polarization processes (Bessi et al., 2015; Radzikowski et al., 2016; Schmidt et al., 2018; Sicilia et al., 2017; Wood, 2018). The framework typically assists the subsequent social network analysis.
Two studies borrowed insights from philosophy: Grant et al. (2015) employed a rhetorical framework to examine the persuasive features of pro- and anti-vaccine sites, while Chua and Banerjee (2017) used an epistemological framework to explore the role of epistemic belief in affecting rumour-sharing behaviour. Finally, the situational theory of publics (Grunig, 1997), from public relations studies, was adopted to identify vaccine-negative activists (Krishna, 2017). The remaining articles, from computational studies and clinical perspectives, lack any theoretical underpinning and are purely empirical.
Given that the findings come from disparate disciplines, we conducted a co-citation analysis on all the potentially eligible articles to identify clusters of disciplinary communities. In co-citation network analysis, the unit of analysis is the cited source, and we included the journals cited at least 5 times within the 121 articles. As seen in Fig. 4, the distance between any pair of journals on the map reflects their similarity to each other (van Eck and Waltman, 2010), and we used the LinLog/modularity normalization technique to minimize the distance between connected nodes (Noack, 2009). The size of the nodes represents the number of citations, and a line indicates the presence of citation in either direction. The analysis identified 4 distinct (inter-)disciplinary clusters, which we labelled as follows (with randomly generated colours, from left to right): Social Psychology and Communications (red), General Science and Medicine (blue), Infectious Disease/Vaccine and Public Health (green), and Medical Internet and Biomedical Science (purple). Overall, the literature is concentrated in general science and vaccines/infectious diseases. The psychology and communications literature sits on the periphery, with relatively little cross-citation with the science and medicine literature. Interestingly, we also observe a few sociology journals at the borders between clusters, implying their incipient role in bridging insights across disciplines. There is potential for greater interdisciplinary collaboration.
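The construction behind such a co-citation map is simple: link two journals whenever the same article cites both, keep only journals above a citation threshold, and hand the weighted network to a clustering tool (the authors used VOSviewer). A minimal sketch, with invented journal names and a lowered threshold for the small example:

```python
from itertools import combinations
from collections import Counter

# Illustrative reference lists: each reviewed article maps to the journals
# it cites. Names and counts are invented for demonstration only.
articles = {
    "a1": ["Vaccine", "PLoS One", "J Med Internet Res"],
    "a2": ["Vaccine", "PLoS One", "Health Commun"],
    "a3": ["Health Commun", "J Commun", "Soc Psychol Q"],
    "a4": ["J Commun", "Soc Psychol Q", "PLoS One"],
}
MIN_CITATIONS = 2  # the review's actual threshold was 5 within 121 articles

# Keep only journals meeting the citation threshold.
citation_counts = Counter(j for cited in articles.values() for j in set(cited))
kept = {j for j, n in citation_counts.items() if n >= MIN_CITATIONS}

# Co-citation edge weights: two journals are linked whenever the same
# article cites both; the weight counts how many articles do so.
edges = Counter()
for cited in articles.values():
    for a, b in combinations(sorted(set(cited) & kept), 2):
        edges[(a, b)] += 1

for (a, b), w in edges.most_common(3):
    print(f"{a} -- {b}: co-cited in {w} article(s)")
```

The resulting weighted edge list is what VOSviewer-style tools then cluster with modularity-based methods to produce the disciplinary communities shown in Fig. 4.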
3.3. Study design
Turning to research design, most studies employed content analysis (n = 38), either alone or as one component of the analysis, studying various forms of social media (n = 10), YouTube videos (n = 12), Twitter or equivalents (n = 8), websites (n = 5), images (n = 1) or mobile messengers (n = 2). Authors observed the distribution of useful and misleading information, and the patterns of consumption by different users. Some studies incorporated social network analysis or epidemiological modelling to better explain the dynamics of misinformation spread (Bessi et al., 2015; Ghenai and Mejova, 2017; Harris et al., 2014; Jin et al., 2014; Radzikowski et al., 2016; Wood, 2018). Many designs were also complemented by sentiment measures, for instance of "anti-vaccine" sentiment (Bahk et al., 2016; Xu and Guo, 2018).
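The epidemiological modelling invoked in several of these studies treats a rumour like an infection moving through a population. A minimal SIR-style sketch with illustrative parameters (not fitted to any of the reviewed datasets):

```python
# Minimal SIR-style rumour model: S users never exposed to the rumour,
# I users actively spreading it, R users who have stopped sharing it.
# beta and gamma are illustrative, not estimated from real data.
def simulate(beta=0.3, gamma=0.1, s=0.99, i=0.01, r=0.0, steps=100):
    """Euler-stepped SIR dynamics over `steps` unit time steps."""
    history = []
    for _ in range(steps):
        new_spreaders = beta * s * i   # exposure through contact with spreaders
        stopped = gamma * i            # spreaders lose interest
        s -= new_spreaders
        i += new_spreaders - stopped
        r += stopped
        history.append((s, i, r))
    return history

final_s, final_i, final_r = simulate()[-1]
print(f"never exposed: {final_s:.2f}, spreading: {final_i:.2f}, stopped: {final_r:.2f}")
```

With spread rate above the stopping rate, the rumour sweeps through most of the population before burning out, which is the qualitative pattern such models are used to fit against observed sharing curves.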
Seven studies used experimental designs. Bode and Vraga, in three different papers, manipulated Facebook's "related news" function to confirm or correct (or both) misinformation about the purported link between vaccines and autism, as well as the unfounded link between genetically modified organisms (GMOs) and health (Bode and Vraga, 2015; Vraga and Bode, 2017). They also simulated Twitter feeds with false information about Zika virus to evaluate the ability of corrective responses to reduce misperception (Vraga and Bode, 2017). Chua and Banerjee (2017, 2018) undertook web-based experiments with
Y. Wang, et al. Social Science & Medicine 240 (2019) 112552
6
participants exposed to combinations of rumours and counter-rumours. Ozturk et al. (2015) explored different ways to reduce rumour spread on Twitter using Amazon's Mechanical Turk, an online crowdsourcing platform. Albarracin et al. (2018) used the same platform to evaluate the effects of YouTube videos on viewer attitudes to tobacco products.
A few studies used survey instruments to understand how social media can spread misconceptions about Ebola in West Africa (Adebimpe et al., 2015) and inflammatory bowel disease in the USA (Groshek et al., 2017), and to explore the relationship between knowledge deficiency and negative attitudes towards vaccines (Krishna, 2017). One case study adopted an anthropological approach and used thick description to review the rhetorical features of both pro-vaccine and vaccine-sceptical websites (Grant et al., 2015).
4. Discussion
4.1. Findings
We found that, while there have been studies of the spread of misinformation on a wide range of topics, the literature is dominated by studies of infectious disease, including vaccines. Overall, existing research finds that misinformation is abundant on the internet and is often more popular than accurate information. Several of the studies address areas where state action challenges individual autonomy. The classic example is vaccination, where effective protection of the population requires levels of uptake sufficient to achieve herd immunity. This review confirms that misconceptions about the MMR vaccine and autism, in particular, remain prevalent on social media (Aquino et al., 2017; Chen et al., 2018a,b). Other topics share scientific uncertainty, with the authorities unable to provide confident explanations or advice, as with newly emerging virus infections such as Ebola and Zika (Basch et al., 2017; Fung et al., 2016; Sommariva et al., 2018).
The agents that create misinformation are mostly individuals with no official or institutional affiliations. This relates to our initial discussion of credibility: what makes a source trustworthy for readers? Formal institutions are increasingly challenged by the rise of, for instance, the "expert patient", blurring the boundaries between authority and quasi-proficiency (Seymour et al., 2015). Traditional vertical health communication strategies are eroded by the horizontal diffusion of conspiracy-like messages. The narratives of misinformation are dominated by personal, negative and opinionated tones, which often induce fear, anxiety and mistrust in institutions (Bessi et al., 2015; Panatto et al., 2018; Porat et al., 2018). When people are frightened and doubtful, they can be more susceptible to misinformation. Once false information gains acceptance in such circumstances, it is difficult to correct, and the effectiveness of interventions varies according to each individual's personal involvement, literacy and socio-demographic characteristics, features that tend to be under-explored in existing research.
The included articles adopted disparate theoretical approaches in conceptualizing the phenomenon, with the dominant frameworks coming from the fields of psychology and network science. Theories employed in psychology aimed to explain individual-level cognitive responses to misinformation and rumour online (Bode and Vraga, 2018; Bora et al., 2018; Chua and Banerjee, 2018; Li and Sakamoto, 2015; Ozturk et al., 2015), whereas network theories focus on the social mechanisms and patterns of misinformation spread (Bessi et al., 2015; Radzikowski et al., 2016; Schmidt et al., 2018; Sicilia et al., 2017; Wood, 2018). Further co-citation analysis of all articles that investigated the phenomenon revealed that the disciplinary landscape concentrates around general science and vaccines/infectious disease, while psychology and communication studies have less cross-citation with the science and medicine literature. The discipline of sociology has great potential to bridge these different communities.
Researchers have employed increasingly sophisticated analytic techniques for empirical analysis, such as sentiment analysis of social media data. The majority of the included articles performed a content analysis of information on social media, spanning text, images and videos. Several studies employed complexity and network theories to model the dynamics of rumour spread and opinion polarization (Bessi et al., 2016; Jin et al., 2014). Other studies adopted psychological and linguistic perspectives (Fung et al., 2016; Li et al., 2018; Waszak et al., 2018). While we have excluded research on both individual and group biases, we feel it is important to note that several studies invoked the concept of confirmation bias, concluding that it plays an important role in creating online echo chambers (Bessi et al., 2015; Donzelli et al., 2018). This highlights the need for much more research on the socio-psychological characteristics of those who believe and propagate misinformation. In particular, there is a need to better understand the roles of both ideology and belief systems (Jost et al., 2018) and what might be termed "lazy thinking" (Pennycook and Rand, 2018). For instance, although literacy and cues to credibility are critical concepts in the design of experiments, they should also be explored in empirical studies, especially those that use big data from social media platforms.

Fig. 4. Co-citation analysis. We extracted citation data from Scopus and analysed citation patterns using network-clustering algorithms in VOSviewer 1.6.8. The network map shows co-citation patterns of 121 journals cited at least 5 times within the potentially eligible studies. Node size represents the number of citations, and lines represent the presence of citation in either direction. We restricted the minimum cluster size to 20, which resulted in 4 disciplinary clusters and 2367 links. Ten articles could not be identified because they were not indexed on Scopus; we therefore excluded them from the co-citation analysis.
4.2. Gaps and potential for future research
Although sociology and psychology pioneered research to understand rumour (Allport and Postman, 1947; Bartlett, 1932; Kirkpatrick, 1932), psychologists are only beginning to study the implications of the explosion in internet use (Stone and Wang, 2018). While we conclude from the co-citation analysis that studies on misinformation in health cover a wide range of disciplines, there is a marked lack of interdisciplinary research. Such research could, for example, allow hypotheses generated by social scientists using rumour theory to be tested through quantitative analysis of social media data.
While most of the studies recommended courses of action based on their results, only a handful of papers proposed specific, tested interventions to reduce the spread of misinformation. For instance, Ozturk et al. (2015) discovered that rumour-countering warnings such as "this tweet may contain misinformation" did decrease participants' likelihood of sharing a rumour, consistent with findings in the psychological literature (Bordia and Difonzo, 2004). Bode and Vraga (2018) showed that algorithmic correction (by a platform) and social correction (by peers) are equally effective in correcting misinformation, and they call for campaigns to encourage users to refute false or misleading information. The same authors have shown how an expert organization can correct misinformation without damaging its own credibility, presenting an appealing intervention to reduce misinformation spread (Vraga and Bode, 2017). Finally, there is a need to characterise the scale and nature of the phenomenon much better, for example with studies of which socio-demographic characteristics make social media users more susceptible to, and therefore likely to share, health-related misinformation.
4.3. Limitations
Before concluding, we note several limitations of this systematic review. First, although we have attempted to define the phenomenon we are studying, our search strategy may not capture the terminology used by others. This is not just a problem of language. There are many related phenomena, such as denialism, groupthink and fearmongering, and equivalents in other languages, such as Lügenpresse ("lying press") in German, and it is possible that these or others may be used, in some circumstances, to describe some elements of what we are studying. Second, even where terms such as misinformation and fake news are agreed, the meanings adopted by authors can vary. Third, as noted at the outset, it is very difficult to ascertain the motives of those spreading particular rumours and myths, leaving us unable to answer the old question "mad or bad?". Fourth, while our focus has been on messages concerning health-related issues, misinformation about other issues can have health consequences. For instance, a man from North Carolina travelled to Washington in 2016 and opened fire at a pizzeria following the spread of what became termed the "Pizzagate" theory, whereby it was alleged that the pizzeria was the site of a paedophile ring organised by Democratic Party leaders. Even though comprehensively debunked, subsequent polls showed that this allegation was still widely believed. Finally, since we excluded articles not published in English, we may have omitted relevant papers published in other languages.
5. Conclusion
Social media platforms, although providing immense opportunities for people to engage with each other in beneficial ways, also allow misinformation to flourish. Without filtering or fact-checking, these online platforms enable communities of denialists to thrive, for instance by feeding into each other's feelings of persecution by a corrupt elite (McKee and Diethelm, 2010). The accumulation of individual beliefs in unfounded stories, conspiracy theories and pseudoscience can give rise to social movements, such as the anti-vaccination movement, with profound consequences for public health. This is further exacerbated by the perception that it is politically incorrect to question or criticize the beliefs of others, while the fight for truth runs against a flow of true believers armed with ignorance and misinformation (Kaufman et al., 2018).
We have shown that the academic literature on this social phenomenon mainly revolves around vaccination and infectious disease, drawing on various disciplines, frameworks and empirical methods. Among the articles examined, there is broad consensus that misinformation is highly prevalent on social media and tends to be more popular than accurate information, while its narratives often induce fear, anxiety and mistrust in institutions. The severity of its deleterious effects on society is hard to quantify, but evidence abounds that we need more research on identifying susceptible populations and on understanding the socio-demographic and ideological asymmetries in the intention to spread misinformation.
Finally, since the persistence of misinformation owes both to psychological responses and to the social contexts in which misinformation spreads, potential interventions should target both fronts. At the individual level, although interventions to correct misperceptions have proven effective at times, efforts to retract misinformation need to be carried out with caution in order to prevent backfiring. This requires a profound understanding of how epistemic and ideological beliefs act as obstacles to accepting scientific evidence. A more constructive approach may be to cultivate critical thinking and to improve health and media literacy, thereby equipping individuals with the faculty to assess the credibility of information critically. At the system level, how we can amend our information ecosystem to reduce selective exposure and opinion polarization is not a challenge for academics and policy-makers alone. We therefore hope that our review can stimulate social scientists, psychologists, computer scientists and medical professionals not only to collaborate with each other, but also to engage with industry and internet consumers to understand and counter the effects of this increasingly important social phenomenon.
Authors' contributions
YW collected the data, performed the review and drafted the
manuscript. All authors contributed to the interpretation, writing and
editing of the manuscript.
Acknowledgements
YW is funded by the EU's Research and Innovation Horizon 2020 Framework under grant agreement 721402.
DS is funded by a Wellcome Trust Investigator Award and ERC
HRES 313590.
Appendix Table
Characteristics of Included Studies

| Health Issue (by category) | Setting/Geographic Focus | Theory/Framework | Study Design | Platforms | Authors |
|---|---|---|---|---|---|
| Communicable Disease | | | | | |
| Ebola | Nigeria | N/A | Survey | Facebook, Twitter | Adebimpe et al. (2015) |
| Ebola | Chinese and English-speaking Countries | N/A | Content Analysis (Social Media) | Sina Weibo, Twitter | Fung et al. (2016) |
| Ebola | US | Epidemiological Models | Content Analysis (Twitter); Epidemiological Modelling | Twitter | Jin et al. (2014) |
| Ebola | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Pathak et al. (2015) |
| Influenza | China | Rumour Theory (Psychology) | Content Analysis (Social Media) | Sina Weibo, Tencent Weibo | Chen et al. (2018a,b) |
| Middle East Respiratory Syndrome | South Korea | N/A | Social Network Analysis | Twitter, Blogs, Online Community Text, Discussion Board Text, News Sites | Song et al. (2017) |
| West Nile Virus Infection | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Dubey et al. (2014) |
| Vaccination (general) | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Basch et al. (2017) |
| Vaccination (general) | Italy | N/A | Content Analysis (Video) | YouTube | Donzelli et al. (2018) |
| Vaccination (general) | US | Situational Theory of Publics (Public Relations) | Survey | Websites | Krishna (2017) |
| Vaccination (general) | Spain | N/A | Content Analysis (Twitter) | Twitter | Porat et al. (2018) |
| Vaccination (general) | English-speaking Countries | Network Theory | Social Network Analysis | Facebook | Schmidt et al. (2018) |
| Vaccination (general) | Canada | N/A | Content Analysis (Facebook) | Facebook, Instagram, Twitter, YouTube | Tustin et al. (2018) |
| Vaccination (general) | English-speaking Countries | N/A | Sentiment Analysis | Google | Xu and Guo (2018) |
| Vaccination (general) + GMOs | US | Psychological Theory | Experiment | Facebook | Bode and Vraga (2015) |
| Vaccination (HPV) | US | Theory of Rhetorical Situation (Philosophy) | Qualitative Case Study (Anthropology) | Websites | Grant et al. (2015) |
| Vaccination (HPV) | US | N/A | Content Analysis (Social Media) | Twitter | Mahoney et al. (2015) |
| Vaccination (influenza) | Italy | N/A | Content Analysis (Website) | Websites, Blogs | Panatto et al. (2018) |
| Vaccination (MMR) | Italy | N/A | Content Analysis (Social Media) | Google Trends, Twitter, Facebook; National Institute of Health (Istituto Superiore di Sanità) | Aquino et al. (2017) |
| Vaccination (MMR) | US | Network Theory | Content Analysis (Twitter); Social Network Analysis | Twitter | Radzikowski et al. (2016) |
| Vaccination (Polio and HPV) | Pakistan, US | N/A | Sentiment Analysis | Mainstream Media, Twitter, Vaccine Sentimeter | Bahk et al. (2016) |
| Zika Virus | US | Psychological Theory | Experiment | Facebook, Twitter | Bode and Vraga (2018) |
| Zika Virus | English-speaking Countries | Inoculation Theory (Psychology) | Content Analysis (Video) | YouTube | Bora et al. (2018) |
| Zika Virus | Global | N/A | Content Analysis (Twitter); Social Network Analysis | Twitter | Ghenai and Mejova (2017) |
| Zika Virus | Global | N/A | Content Analysis (Image) | Instagram, Flickr | Seltzer et al. (2017) |
| Zika Virus | US | N/A | Content Analysis (Social Media) | Facebook | Sharma et al. (2017) |
| Zika Virus | English-speaking Countries | Network Theory | Social Network Analysis | Twitter | Sicilia et al. (2017) |
| Zika Virus | English-speaking Countries | N/A | Content Analysis (Social Media) | Facebook, Twitter, LinkedIn, Pinterest, GooglePlus, Web Links | Sommariva et al. (2018) |
| Zika Virus | US | Psychological Theory | Experiment | Twitter | Vraga and Bode (2017) |
| Zika Virus | English-speaking Countries | Network Theory | Content Analysis (Twitter); Social Network Analysis | Twitter | Wood (2018) |
| Chronic Non-communicable Disease: Cancer | | | | | |
| Cancer + Diet and Nutrition | US | Rumour Theory (Psychology) | Experiment | Websites | Chua and Banerjee (2018) |
| Cancer Screening | Japan | N/A | Content Analysis (Website) | Websites, Blogs, Facebook | Okuhara et al. (2017) |
| Gynaecologic Cancer | China | N/A | Content Analysis (Social Media) | Sina Weibo, Tencent Weibo | Chen et al. (2018a,b) |
| Chronic Non-communicable Disease: Cardiovascular Disease | | | | | |
| Diabetes + Diet | India | N/A | Content Analysis (Video) | YouTube | Leong et al. (2017) |
| Heart Failure | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Chen et al. (2013) |
| Hypertension | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Kumar et al. (2014) |
| Other Chronic Non-communicable Disease | | | | | |
| Inflammatory Bowel Disease | English-speaking Countries | N/A | Survey | Facebook, Twitter, YouTube, Websites | Groshek et al. (2017) |
| Psoriasis + Diet | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Qi et al. (2016) |
| Others: Diet and Nutrition | | | | | |
| Anorexia | English, Spanish, Italian and Portuguese-speaking Countries | N/A | Content Analysis (Video) | YouTube, Vimeo and Veoh | Syed-Abdul et al. (2013) |
| Diet and Health-related Information | Arabic-speaking Countries | N/A | Content Analysis (Twitter) | Twitter | Alnemer et al. (2015) |
| Diet and Health-related Rumour | Italy | Network Theory | Content Analysis (Social Media); Social Network Analysis | Facebook, Twitter and YouTube | Bessi et al. (2015) |
| Others: Smoking | | | | | |
| e-cigarette | US (Chicago) | Network Theory | Content Analysis (Twitter); Social Network Analysis | Twitter | Harris et al. (2014) |
| Hookah Tobacco Smoking | US | N/A | Content Analysis (Website) | Websites, Facebook, MySpace, Twitter | Primack et al. (2012) |
| Tobacco | US | N/A | Experiment | YouTube | Albarracin et al. (2018) |
| Others: Water Safety/Quality | | | | | |
| Water Fluoridation | US | N/A | Content Analysis (Social Media) | Facebook, Twitter, YouTube, Websites | Mertz and Allukian (2014) |
| Water Fluoridation | US | Network Theory | Social Network Analysis; Sentiment Analysis | Facebook, Websites | Seymour et al. (2015) |
| Others: General Health (Rumour Psychology) | | | | | |
| Health-related Rumours | Southeast Asia | Epistemology (Philosophy) | Experiment | Websites | Chua and Banerjee (2017) |
| Health-related Rumours | US | Rumour Theory (Psychology) | Experiment | Twitter | Li and Sakamoto (2015) |
| Health-related Rumours | China | N/A | Content Analysis (WeChat) | WeChat | Li et al. (2017) |
| Health-related Rumours | US | Psychological Theory | Experiment | Twitter, Websites | Ozturk et al. (2015) |
| Health-related Rumours | Poland | N/A | Content Analysis (Social Media) | Facebook, Twitter, LinkedIn, Pinterest | Waszak et al. (2018) |
| Others: Miscellaneous | | | | | |
| Abortion | US | N/A | Content Analysis (Website) | Websites | Bryant et al. (2014) |
| Dialysis | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Garg et al. (2015) |
| Drug | Bahrain | N/A | Content Analysis (WhatsApp) | WhatsApp | Al Khaja et al. (2018) |
| Multiple Sclerosis | Italy | N/A | Content Analysis (Website) | Websites | Lavorgna et al. (2018) |
| Paediatric Tonsillectomy + Diet | English-speaking Countries | N/A | Content Analysis (Video) | YouTube | Strychowsky et al. (2013) |
| Suicide | China | N/A | Content Analysis (Weibo) | Sina Weibo | Li et al. (2018) |
References
Abbasi, M.-A., Liu, H., 2013. Measuring user credibility in social media. In: Greenberg,
A.M., Kennedy, W.G., Bos, N.D. (Eds.), Social Computing, Behavioral-Cultural
Modeling and Prediction. Springer Berlin Heidelberg, pp. 441448.
Abroms, L.C., Lee Westmaas, J., Bontemps-Jones, J., Ramani, R., Mellerson, J., 2013. A
content analysis of popular smartphone apps for smoking cessation. Am. J. Prev. Med.
45 (6), 732736. https://doi.org/10.1016/j.amepre.2013.07.008.
Adebimpe, W.O., Adeyemi, D.H., Faremi, A., Ojo, J.O., Efuntoye, A.E., 2015. The re-
levance of the social networking media in Ebola virus disease prevention and control
in Southwestern Nigeria. Pan Afr. Med. J. 22 (Suppl. 1). https://doi.org/10.11694/
pamj.supp.2015.22.1.6165.
Al Khaja, K.A.J., AlKhaja, A.K., Sequeira, R.P., 2018. Drug information, misinformation,
and disinformation on social media: a content analysis study. J. Public Health Policy.
https://doi.org/10.1057/s41271-018-0131-2.
Albarracin, D., Romer, D., Jones, C., Hall Jamieson, K., Jamieson, P., 2018. Misleading
claims about tobacco products in YouTube videos: experimental eects of mis-
information on unhealthy attitudes. J. Med. Internet Res. 20 (6), e229. https://doi.
org/10.2196/jmir.9959.
Allport, G.W., Postman, L., 1947. The psychology of rumor. In: The Psychology of Rumor.
Henry Holt, Oxford, England.
Alnemer, K.A., Alhuzaim, W.M., Alnemer, A.A., Alharbi, B.B., Bawazir, A.S., Barayyan,
O.R., Balaraj, F.K., 2015. Are Health-Related Tweets Evidence Based? Review and
Analysis of Health-Related Tweets on Twitter. Journal of Medical Internet Research
17 (10), e246. https://doi.org/10.2196/jmir.4898.
Aquino, F., Donzelli, G., De Franco, E., Privitera, G., Lopalco, P.L., Carducci, A., 2017. The
web and public condence in MMR vaccination in Italy. Vaccine 35 (35 Pt B),
44944498. https://doi.org/10.1016/j.vaccine.2017.07.029.
Bahk, C.Y., Cumming, M., Paushter, L., Mado, L.C., Thomson, A., Brownstein, J.S., 2016.
Publicly available online tool facilitates real-time monitoring of vaccine
Y. Wang, et al. Social Science & Medicine 240 (2019) 112552
10
conversations and sentiments. Health A. 35 (2), 341347. https://doi.org/10.1377/
hltha.2015.1092.
Bartlett, F.C., 1932. Remembering: a study in experimental and social psychology. In:
Remembering: A Study in Experimental and Social Psychology. Cambridge University
Press, New York, NY, US.
Basch, C.H., Zybert, P., Reeves, R., Basch, C.E., 2017. What do popular YouTubeTM vi-
deos say about vaccines? Child Care Health Dev. 43 (4), 499503. https://doi.org/10.
1111/cch.12442.
Bessi, A., Zollo, F., Vicario, M.D., Scala, A., Caldarelli, G., Quattrociocchi, W., 2015. Trend
of narratives in the age of misinformation. PLoS One 10 (8), e0134641. https://doi.
org/10.1371/journal.pone.0134641.
Bessi, A., Petroni, F., Vicario, M.D., Zollo, F., Anagnostopoulos, A., Scala, A.,
Quattrociocchi, W., 2016. Homophily and polarization in the age of misinformation.
Eur. Phys. J. Spec. Top. 225 (10), 20472059. https://doi.org/10.1140/epjst/e2015-
50319-0.
Betsch, C., Sachse, K., 2012. Dr. Jekyll or Mr. Hyde? (How) the Internet inuences vac-
cination decisions: recent evidence and tentative guidelines for online vaccine com-
munication. Vaccine 30 (25), 37233726. https://doi.org/10.1016/j.vaccine.2012.
03.078.
Bode, L., Vraga, E.K., 2015. In related news, that was wrong: the correction of mis-
information through related stories functionality in social media. J. Commun. 65 (4),
619638. https://doi.org/10.1111/jcom.12166.
Bode, L., Vraga, E.K., 2018. See something, say something: correction of global health
misinformation on social media. Health Commun. 33 (9), 11311140. https://doi.
org/10.1080/10410236.2017.1331312.
Bora, K., Das, D., Barman, B., Borah, P., 2018. Are internet videos useful sources of in-
formation during global public health emergencies? A case study of YouTube videos
during the 2015-16 Zika virus pandemic. Pathog. Glob. Health 112 (6), 320328.
https://doi.org/10.1080/20477724.2018.1507784.
Bordia, P., Difonzo, N., 2004. Problem solving in social interactions on the internet:
rumor as social cognition. Soc. Psychol. Q. 67 (1), 3349. https://doi.org/10.1177/
019027250406700105.
Boutyline, A., Willer, R., 2017. The social structure of political echo chambers: variation
in ideological homophily in online networks. Political Psychol. 38 (3), 551569.
https://doi.org/10.1111/pops.12337.
Bryant, A.G., Narasimhan, S., Bryant-Comstock, K., Levi, E.E., 2014. Crisis pregnancy
center websites: information, misinformation and disinformation. Contraception 90
(6), 601605. https://doi.org/10.1016/j.contraception.2014.07.003.
Castillo, C., Mendoza, M., Poblete, B., 2011. Information credibility on twitter. In:
Proceedings of the 20th International Conference on World Wide Web, pp. 675684.
https://doi.org/10.1145/1963405.1963500.
Chen, H.-M., Hu, Z.-K., Zheng, X.-L., Yuan, Z.-S., Xu, Z.-B., Yuan, L.-Q., Liao, X.-B., 2013.
Eectiveness of YouTube as a source of medical information on heart transplantation.
Interactive J. Med. Res. 2 (2), e28. https://doi.org/10.2196/ijmr.2669.
Chen, B., Shao, J., Liu, K., Cai, G., Jiang, Z., Huang, Y., et al., 2018a. Does eating chicken
feet with pickled peppers cause Avian inuenza? Observational case study on Chinese
social media during the Avian inuenza a (H7N9) outbreak. JMIR Publ. Health Surv.
4 (1), e32. https://doi.org/10.2196/publichealth.8198.
Chen, L., Wang, X., Peng, T.-Q., 2018b. Nature and diusion of gynecologic cancer-re-
lated misinformation on social media: analysis of tweets. J. Med. Internet Res. 20
(10), e11515. https://doi.org/10.2196/11515.
Chew, C., Eysenbach, G., 2010. Pandemics in the age of twitter: content analysis of tweets
during the 2009 H1N1 outbreak. PLoS One 5 (11), e14118. https://doi.org/10.1371/
journal.pone.0014118.
Chua, A.Y.K., Banerjee, S., 2017. To share or not to share: the role of epistemic belief in
online health rumors. Int. J. Med. Inform. 108, 3641. https://doi.org/10.1016/j.
ijmedinf.2017.08.010.
Chua, A.Y.K., Banerjee, S., 2018. Intentions to trust and share online health rumors: an
experiment with medical professionals. Comput. Hum. Behav. 87, 19. https://doi.
org/10.1016/j.chb.2018.05.021.
Datta, S.S., O'Connor, P.M., Jankovic, D., Muscat, M., Ben Mamou, M.C., Singh, S., Butler,
R., 2017. Progress and challenges in measles and rubella elimination in the WHO
European Region. Vaccine. https://doi.org/10.1016/j.vaccine.2017.06.042.
Del Vicario, M.D., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Quattrociocchi,
W., 2016. The spreading of misinformation online. Proc. Natl. Acad. Sci. 113 (3),
554559. https://doi.org/10.1073/pnas.1517441113.
Donzelli, G., Palomba, G., Federigi, I., Aquino, F., Cioni, L., Verani, M., Lopalco, P., 2018.
Misinformation on vaccination: a quantitative analysis of YouTube videos. Hum.
Vaccines Immunother. 14 (7), 16541659. https://doi.org/10.1080/21645515.2018.
1454572.
Dubey, D., Amritphale, A., Sawhney, A., Dubey, D., Srivastav, N., 2014. Analysis of
YouTube as a Source of Information for West Nile Virus Infection. Clinical Medicine &
Research 12 (34), 129132. https://doi.org/10.3121/cmr.2013.1194.
Eng, D.S., Lee, J.M., 2013. The promise and peril of mobile health applications for dia-
betes and endocrinology. Pediatr. Diabetes 14 (4), 231238. https://doi.org/10.
1111/pedi.12034.
Filia, A., Bella, A., Del Manso, M., Baggieri, M., Magurano, F., Rota, M.C., 2017. Ongoing
outbreak with well over 4,000 measles cases in Italy from January to end August
2017 what is making elimination so dicult? Euro Surveill. 22 (37). https://doi.
org/10.2807/1560-7917.ES.2017.22.37.30614.
Free, C., Phillips, G., Galli, L., Watson, L., Felix, L., Edwards, P., Haines, A., 2013. The
eectiveness of mobile-health technology-based health behaviour change or disease
management interventions for health care consumers: a systematic review. PLoS Med.
10 (1), e1001362. https://doi.org/10.1371/journal.pmed.1001362.
Fung, I.C.-H., Fu, K.-W., Chan, C.-H., Chan, B.S.B., Cheung, C.-N., Abraham, T., Tse,
Z.T.H., 2016. Social media's initial reaction to information and misinformation on
Ebola, August 2014: facts and rumors. Public Health Rep. (Washington D. C: 1974)
131 (3), 461473. https://doi.org/10.1177/003335491613100312.
Garg, N., Venkatraman, A., Pandey, A., Kumar, N., 2015. YouTube as a source of information
on dialysis: a content analysis. Nephrology (Carlton, Vic.) 20 (5), 315–320.
https://doi.org/10.1111/nep.12397.
Ghenai, A., Mejova, Y., 2017. Catching Zika Fever: Application of Crowdsourcing and
Machine Learning for Tracking Health Misinformation on Twitter. ArXiv:1707.03778
[Cs]. Retrieved from. http://arxiv.org/abs/1707.03778.
Grant, L., Hausman, B.L., Cashion, M., Lucchesi, N., Patel, K., Roberts, J., 2015.
Vaccination persuasion online: a qualitative study of two provaccine and two vac-
cine-skeptical websites. J. Med. Internet Res. 17 (5), e133. https://doi.org/10.2196/
jmir.4153.
Groshek, J., Basil, M., Guo, L., Parker Ward, S., Farraye, F.A., Reich, J., 2017. Media
consumption and creation in attitudes toward and knowledge of inflammatory bowel
disease: web-based survey. J. Med. Internet Res. 19 (12), e403. https://doi.org/10.
2196/jmir.7624.
Grunig, J., 1997. A Situational Theory of Publics: conceptual history, recent challenges
and new research. In: Grunig, J. (Ed.), Public Relations Research: an International
Perspective, pp. 3–47. Retrieved from. https://succeed.stir.ac.uk/bbcswebdav/
courses/PREP86_201314_Autumn_A/Digitised_extracts/PREP86GrunigSituational.
pdf.
Harris, J.K., Moreland-Russell, S., Choucair, B., Mansour, R., Staub, M., Simmons, K.,
2014. Tweeting for and against public health policy: response to the Chicago
Department of Public Health's electronic cigarette Twitter campaign. J. Med. Internet
Res. 16 (10), e238. https://doi.org/10.2196/jmir.3622.
He, Z., Cai, Z., Wang, X., 2015. Modeling propagation dynamics and developing opti-
mized countermeasures for rumor spreading in online social networks. In: 2015 IEEE
35th International Conference on Distributed Computing Systems, pp. 205–214.
https://doi.org/10.1109/ICDCS.2015.29.
House of Commons. Disinformation and 'fake news': final report published – news from
parliament. Retrieved March 27, 2019, from UK Parliament Website. https://www.
parliament.uk/business/committees/committees-a-z/commons-select/digital-
culture-media-and-sport-committee/news/fake-news-report-published-17-19/.
Jerit, J., Barabas, J., 2012. Partisan perceptual bias and the information environment. J.
Politics 74 (3), 672–684. https://doi.org/10.1017/s0022381612000187.
Jin, F., Wang, W., Zhao, L., Dougherty, E., Cao, Y., Lu, C.T., Ramakrishnan, N., 2014.
Misinformation propagation in the age of Twitter. Computer 47 (12), 90–94. https://
doi.org/10.1109/MC.2014.361.
Johnson, T.J., Kaye, B.K., 1998. Cruising is believing?: comparing internet and traditional
sources on media credibility measures. Journal. Mass Commun. Q. 75 (2), 325–340.
https://doi.org/10.1177/107769909807500208.
Jost, J.T., van der Linden, S., Panagopoulos, C., Hardin, C.D., 2018. Ideological asym-
metries in conformity, desire for shared reality, and the spread of misinformation.
Curr. Opin. Psychol. 23, 77–83. https://doi.org/10.1016/j.copsyc.2018.01.003.
Karlova, N., Fisher, K.E., 2013. A social diffusion model of misinformation and disinformation
for understanding human information behaviour. Inf. Res. 18.
Kata, A., 2012. Anti-vaccine activists, Web 2.0, and the postmodern paradigm – an
overview of tactics and tropes used online by the anti-vaccination movement.
Vaccine 30 (25), 3778–3789. https://doi.org/10.1016/j.vaccine.2011.11.112.
Kaufman, A.B., Kaufman, J.C., Barnett, P.J., 2018. Pseudoscience: the Conspiracy against
Science. MIT Press.
Kim, D., Johnson, T.J., 2009. A shift in media credibility: comparing internet and traditional
news sources in South Korea. Int. Commun. Gaz. 71 (4), 283–302.
https://doi.org/10.1177/1748048509102182.
Kirkpatrick, C., 1932. A tentative study in experimental social psychology. Am. J. Sociol.
38 (2), 194–206.
Krishna, A., 2017. Motivation with misinformation: conceptualizing lacuna individuals
and publics as knowledge-deficient, issue-negative activists. J. Public Relat. Res. 29
(4), 176–193. https://doi.org/10.1080/1062726X.2017.1363047.
Kumar, N., Pandey, A., Venkatraman, A., Garg, N., 2014. Are video sharing Web sites a
useful source of information on hypertension? J. Am. Soc. Hypertens. 8 (7), 481–490.
https://doi.org/10.1016/j.jash.2014.05.001.
Lavorgna, L., De Stefano, M., Sparaco, M., Moccia, M., Abbadessa, G., Montella, P.,
Bonavita, S., 2018. Fake news, influencers and health-related professional participation
on the Web: a pilot study on a social-network of people with Multiple
Sclerosis. Multiple Scler. Relat. Disorders 25, 175–178. https://doi.org/10.1016/j.
msard.2018.07.046.
Lazer, D.M.J., Baum, M.A., Benkler, Y., Berinsky, A.J., Greenhill, K.M., Menczer, F.,
Zittrain, J.L., 2018. The science of fake news. Science 359 (6380), 1094–1096.
https://doi.org/10.1126/science.aao2998.
Leong, A.Y., Sanghera, R., Jhajj, J., Desai, N., Jammu, B.S., Makowsky, M.J., 2017. Is
YouTube useful as a source of health information for adults with type 2 diabetes? A
South Asian perspective. Can. J. Diabetes 0 (0). https://doi.org/10.1016/j.jcjd.2017.
10.056.
Lewandowsky, S., Ecker, U.K.H., Seifert, C.M., Schwarz, N., Cook, J., 2012.
Misinformation and its correction: continued influence and successful debiasing.
Psychol. Sci. Public Interest 13 (3), 106–131. https://doi.org/10.1177/
1529100612451018.
Lewandowsky, S., Gignac, G.E., Oberauer, K., 2013. The role of conspiracist ideation and
worldviews in predicting rejection of science. PLoS One 8 (10), e75637. https://doi.
org/10.1371/journal.pone.0075637.
Li, H., Sakamoto, Y., 2015. Computing the veracity of information through crowds: a
method for reducing the spread of false messages on social media. In: 2015 48th
Hawaii International Conference on System Sciences, pp. 2003–2012. https://doi.
org/10.1109/HICSS.2015.240.
Li, A., Huang, X., Jiao, D., O'Dea, B., Zhu, T., Christensen, H., 2018. An analysis of stigma
and suicide literacy in responses to suicides broadcast on social media. Asia Pac.
Psychiatr.: Off. J. Pac. Rim Coll. Psychiatr. 10 (1). https://doi.org/10.1111/appy.12314.
Li, Y., Zhang, X., Wang, S., 2017. Fake vs. Real health information in social media in
China. Proceedings of the Association for Information Science and Technology 54 (1),
742–743. https://doi.org/10.1002/pra2.2017.14505401139.
Lupia, A., 2013. Communicating science in politicized environments. Proc. Natl. Acad.
Sci. 110 (Suppl. 3), 14048–14054. https://doi.org/10.1073/pnas.1212726110.
Y. Wang, et al. Social Science & Medicine 240 (2019) 112552
11
Mahoney, L.M., Tang, T., Ji, K., Ulrich-Schad, J., 2015. The Digital Distribution of Public
Health News Surrounding the Human Papillomavirus Vaccination: A Longitudinal
Infodemiology Study. JMIR Public Health and Surveillance 1 (1), e2. https://doi.org/
10.2196/publichealth.3310.
McKee, M., Diethelm, P., 2010. How the growth of denialism undermines public health.
BMJ 341, c6950. https://doi.org/10.1136/bmj.c6950.
McKernon, E., 1925, October. Fake News and the Public. Harper's Magazine. Retrieved
from. https://harpers.org/archive/1925/10/fake-news-and-the-public/.
Mertz, A., Allukian, M., 2014. Community water fluoridation on the Internet and social
media. Journal of the Massachusetts Dental Society 63 (2), 32–36.
Metzger, M.J., 2007. Making sense of credibility on the Web: models for evaluating online
information and recommendations for future research. J. Am. Soc. Inf. Sci. Technol.
58 (13), 2078–2091. https://doi.org/10.1002/asi.20672.
Metzger, M.J., Flanagin, A.J., Eyal, K., Lemus, D.R., Mccann, R.M., 2003. Credibility for
the 21st century: integrating perspectives on source, message, and media credibility
in the contemporary media environment. Ann. Int. Commun. Assoc. 27 (1), 293–335.
https://doi.org/10.1080/23808985.2003.11679029.
Metzger, M.J., Flanagin, A.J., Medders, R.B., 2010. Social and heuristic approaches to
credibility evaluation online. J. Commun. 60 (3), 413–439. https://doi.org/10.1111/
j.1460-2466.2010.01488.x.
Moher, D., Liberati, A., Tetzlaff, J., Altman, D.G., The PRISMA Group, 2009. Preferred reporting
items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 6
(7), e1000097. https://doi.org/10.1371/journal.pmed.1000097.
Murray, E., Khadjesari, Z., White, I.R., Kalaitzaki, E., Godfrey, C., McCambridge, J.,
Wallace, P., 2009. Methodological challenges in online trials. J. Med. Internet Res. 11
(2). https://doi.org/10.2196/jmir.1052.
Nicholson, M.S., Leask, J., 2012. Lessons from an online debate about measles–mumps–rubella
(MMR) immunization. Vaccine 30 (25), 3806–3812. https://doi.org/
10.1016/j.vaccine.2011.10.072.
Noack, A., 2009. Modularity clustering is force-directed layout. Phys. Rev. E 79 (2),
026102. https://doi.org/10.1103/PhysRevE.79.026102.
Nyhan, B., Reifler, J., 2010. When corrections fail: the persistence of political misperceptions.
Political Behav. 32 (2), 303–330. https://doi.org/10.1007/s11109-010-
9112-2.
Okuhara, T., Ishikawa, H., Okada, M., Kato, M., Kiuchi, T., 2017. Assertions of Japanese
websites for and against cancer screening: a text mining analysis. Asian Pac. J. Cancer
Prev. APJCP 18 (4), 1069–1075. https://doi.org/
10.22034/APJCP.2017.18.4.1069.
Ozturk, P., Li, H., Sakamoto, Y., 2015. Combating rumor spread on Social Media: the
effectiveness of refutation and warning. In: 2015 48th Hawaii International
Conference on System Sciences, pp. 2406–2414. https://doi.org/10.1109/HICSS.
2015.288.
Panatto, D., Amicizia, D., Arata, L., Lai, P.L., Gasparini, R., 2018. A comprehensive
analysis of Italian web pages mentioning squalene-based influenza vaccine adjuvants
reveals a high prevalence of misinformation. Hum. Vaccines Immunother. 14 (4),
969–977. https://doi.org/10.1080/21645515.2017.1407483.
Pathak, R., Poudel, D.R., Karmacharya, P., Pathak, A., Aryal, M.R., Mahmood, M.,
Donato, A.A., 2015. YouTube as a source of information on Ebola virus disease. N.
Am. J. Med. Sci. 7 (7), 306–309. https://doi.org/10.4103/1947-2714.161244.
Pennycook, G., Rand, D.G., 2018. Lazy, not biased: susceptibility to partisan fake news is
better explained by lack of reasoning than by motivated reasoning. Cognition.
https://doi.org/10.1016/j.cognition.2018.06.011.
Porat, T., Garaizar, P., Ferrero, M., Jones, H., Ashworth, M., Vadillo, M.A., 2018. Content
and source analysis of popular tweets following a recent case of diphtheria in Spain.
Eur. J. Public Health. https://doi.org/10.1093/eurpub/cky144.
Primack, B.A., Rice, K.R., Shensa, A., Carroll, M.V., DePenna, E.J., Nakkash, R., Barnett,
T.E., 2012. U.S. hookah tobacco smoking establishments advertised on the internet.
American Journal of Preventive Medicine 42 (2), 150–156. https://doi.org/10.1016/
j.amepre.2011.10.013.
Qi, J., Trang, T., Doong, J., Kang, S., Chien, A.L., 2016. Misinformation is prevalent in
psoriasis-related YouTube videos. Dermatol. Online J. 22 (11).
Radzikowski, J., Stefanidis, A., Jacobsen, K.H., Croitoru, A., Crooks, A., Delamater, P.L.,
2016. The measles vaccination narrative in twitter: a quantitative analysis. JMIR
Publ. Health Surv. 2 (1), e1. https://doi.org/10.2196/publichealth.5059.
Reyna, V.F., 2012. Risk perception and communication in vaccination decisions: a fuzzy-
trace theory approach. Vaccine 30 (25), 3790–3797. https://doi.org/10.1016/j.
vaccine.2011.11.070.
Ritterband, L.M., Tate, D.F., 2009. The science of internet interventions. Introduction.
Ann. Behav. Med. Publ. Soc. Behav. Med. 38 (1), 1–3. https://doi.org/10.1007/
s12160-009-9132-5.
Rosnow, R.L., 1991. Inside rumor: a personal journey. Am. Psychol. 46 (5), 484–496.
https://doi.org/10.1037/0003-066X.46.5.484.
Salathé, M., Khandelwal, S., 2011. Assessing vaccination sentiments with online social
media: implications for infectious disease dynamics and control. PLoS Comput. Biol.
7 (10), e1002199. https://doi.org/10.1371/journal.pcbi.1002199.
Scanfeld, D., Scanfeld, V., Larson, E.L., 2010. Dissemination of health information
through social networks: twitter and antibiotics. Am. J. Infect. Contr. 38 (3),
182–188. https://doi.org/10.1016/j.ajic.2009.11.004.
Schmidt, A.L., Zollo, F., Scala, A., Betsch, C., Quattrociocchi, W., 2018. Polarization of the
vaccination debate on Facebook. Vaccine 36 (25), 3606–3612. https://doi.org/10.
1016/j.vaccine.2018.05.040.
Seltzer, E.K., Horst-Martz, E., Lu, M., Merchant, R.M., 2017. Public sentiment and discourse
about Zika virus on Instagram. Public Health 150, 170–175. https://doi.org/
10.1016/j.puhe.2017.07.015.
Serrano, E., Iglesias, C.Á., Garijo, M., 2015. A novel agent-based rumor spreading model
in twitter. In: Proceedings of the 24th International Conference on World Wide Web,
pp. 811–814. https://doi.org/10.1145/2740908.2742466.
Seymour, B., Getman, R., Saraf, A., Zhang, L.H., Kalenderian, E., 2015. When advocacy
obscures accuracy online: digital pandemics of public health misinformation through
an antifluoride case study. Am. J. Public Health 105 (3), 517–523. https://doi.org/
10.2105/AJPH.2014.302437.
Shao, C., Hui, P.-M., Wang, L., Jiang, X., Flammini, A., Menczer, F., Ciampaglia, G.L.,
2018. Anatomy of an online misinformation network. PLoS One 13 (4), e0196087.
https://doi.org/10.1371/journal.pone.0196087.
Sharma, M., Yadav, K., Yadav, N., Ferdinand, K.C., 2017. Zika virus pandemic – analysis of
Facebook as a social media health information platform. Am. J. Infect. Contr. 45 (3),
301–302. https://doi.org/10.1016/j.ajic.2016.08.022.
Shin, D.-H., Lee, S., Hwang, Y., 2017. How do credibility and utility play in the user
experience of health informatics services? Comput. Hum. Behav. 67 (C), 292–302.
https://doi.org/10.1016/j.chb.2016.11.007.
Sicilia, R., Giudice, S.L., Pei, Y., Pechenizkiy, M., Soda, P., 2017. Health-related rumour
detection on Twitter. In: 2017 IEEE International Conference on Bioinformatics and
Biomedicine. BIBM, pp. 1599–1606. https://doi.org/10.1109/BIBM.2017.8217899.
Signorini, A., Segre, A.M., Polgreen, P.M., 2011. The use of twitter to track levels of
disease activity and public concern in the U.S. during the influenza A H1N1 pandemic.
PLoS One 6 (5), e19467. https://doi.org/10.1371/journal.pone.0019467.
Sommariva, S., Vamos, C., Mantzarlis, A., Đào, L.U.-L., Tyson, D.M., 2018. Spreading the
(fake) news: exploring health messages on social media and the implications for
health professionals using a case study. Am. J. Health Educ. 49 (4), 246–255. https://
doi.org/10.1080/19325037.2018.1473178.
Song, J., Song, T.M., Seo, D.-C., Jin, D.-L., Kim, J.S., 2017. Social Big Data Analysis of
Information Spread and Perceived Infection Risk During the 2015 Middle East
Respiratory Syndrome Outbreak in South Korea. Cyberpsychology, Behavior and
Social Networking 20 (1), 22–29. https://doi.org/10.1089/cyber.2016.0126.
Steinhubl, S.R., Muse, E.D., Topol, E.J., 2015. The emerging field of mobile health. Sci.
Transl. Med. 7 (283) 283rv3-283rv3. https://doi.org/10.1126/scitranslmed.
aaa3487.
Stone, C.B., Wang, Q., 2018. From conversations to digital communication: the mnemonic
consequences of consuming and producing information via social media. Topics
Cognit. Sci. 0 (0). https://doi.org/10.1111/tops.12369.
Strychowsky, J.E., Nayan, S., Farrokhyar, F., MacLean, J., 2013. YouTube: a good source
of information on pediatric tonsillectomy? Int. J. Pediatr. Otorhinolaryngol. 77 (6),
972–975. https://doi.org/10.1016/j.ijporl.2013.03.023.
Swire, B., Berinsky, A.J., Lewandowsky, S., Ecker, U.K.H., 2017.
Processing political misinformation: comprehending the Trump phenomenon. R. Soc.
Open Sci. 4 (3), 160802. https://doi.org/10.1098/rsos.160802.
Syed-Abdul, S., Fernandez-Luque, L., Jian, W.-S., Li, Y.-C., Crain, S., Hsu, M.-H., Liou, D.-
M., 2013. Misleading health-related information promoted through video-based so-
cial media: anorexia on YouTube. J. Med. Internet Res. 15 (2), e30. https://doi.org/
10.2196/jmir.2237.
Taber, C.S., Lodge, M., 2006. Motivated skepticism in the evaluation of political beliefs.
Am. J. Pol. Sci. 50 (3), 755–769. https://doi.org/10.1111/j.1540-5907.2006.
00214.x.
Tustin, J.L., Crowcroft, N.S., Gesink, D., Johnson, I., Keelan, J., Lachapelle, B., 2018.
User-driven comments on a Facebook advertisement recruiting Canadian parents in a
study on immunization: content analysis. JMIR Publ. Health Surv. 4 (3), e10090.
https://doi.org/10.2196/10090.
van Eck, N.J., Waltman, L., 2010. Software survey: VOSviewer, a computer program for
bibliometric mapping. Scientometrics 84 (2), 523–538. https://doi.org/10.1007/
s11192-009-0146-3.
Verelst, F., Willem, L., Beutels, P., 2016. Behavioural change models for infectious disease
transmission: a systematic review (2010–2015). J. R. Soc. Interface 13 (125),
20160820. https://doi.org/10.1098/rsif.2016.0820.
Vosoughi, S., Roy, D., Aral, S., 2018. The spread of true and false news online. Science
359 (6380), 1146–1151. https://doi.org/10.1126/science.aap9559.
Vraga, E.K., Bode, L., 2017. Using Expert Sources to Correct Health Misinformation in
Social Media. Science Communication. https://doi.org/10.1177/1075547017731776.
Wakamiya, S., Kawai, Y., Aramaki, E., 2016. After the Boom No One Tweets: Microblog-Based
Influenza Detection Incorporating Indirect Information. pp. 17–25. https://doi.org/10.1145/3007818.3007822 October 17.
Wakefield, A.J., Murch, S.H., Anthony, A., Linnell, J., Casson, D.M., Malik, M., Walker-Smith,
J.A., 1998. RETRACTED: ileal-lymphoid-nodular hyperplasia, non-specific
colitis, and pervasive developmental disorder in children. The Lancet 351 (9103),
637–641. https://doi.org/10.1016/S0140-6736(97)11096-0.
Wardle, C., 2017, February 16. Fake news. It's complicated. Retrieved June 9, 2018, from
First Draft News website. https://firstdraftnews.org:443/fake-news-complicated/.
Wardle, C., Derakhshan, H., 2017, October 31. Information disorder: an interdisciplinary
framework. Retrieved January 19, 2019, from First Draft News website.
https://firstdraftnews.org:443/coe-report/.
Waszak, P.M., Kasprzycka-Waszak, W., Kubanek, A., 2018. The spread of medical fake
news in social media – the pilot quantitative study. Health Policy Technol. 7 (2),
115–118. https://doi.org/10.1016/j.hlpt.2018.03.002.
Wood, M.J., 2018. Propagating and debunking conspiracy theories on twitter during the
2015–2016 Zika virus outbreak. Cyberpsychol., Behav. Soc. Netw. 21 (8), 485–490.
https://doi.org/10.1089/cyber.2017.0669.
World Economic Forum, 2013. World Economic Forum Global Risks 2013 – Eighth
Edition. Retrieved June 9, 2018, from Global Risks 2013 Website. http://wef.ch/
GJKqei.
Xu, Z., Guo, H., 2018. Using text mining to compare online pro- and anti-vaccine head-
lines: word usage, sentiments, and online popularity. Commun. Stud. 69 (1),
103–122. https://doi.org/10.1080/10510974.2017.1414068.
... 24 A systematic review of health-related misinformation on social media found that it is mostly created by individuals with no official or institutional affiliations, although this was not specific to cancer. 30 Anecdotal reports, such as individual patient narratives, are deemphasized in evidence-based medicine but may be prominently featured on social media, regardless of whether they contain components that mischaracterize cancer-related issues. 31 Even if they are authentic, the experience of one individual may not be suitable for other patients who are seeking help. ...
... 32 Accurate scientific information, which may be boring or challenging to understand, may be less visible or interesting to the general public and patients with cancer than sensationalized news. 30 Even trusted sources, such as oncology providers, may have conflicts of interest (e.g., receipt of payments from a pharmaceutical company), which may or may not be readily apparent in postings on social networks. 33 Many studies have directly examined the accuracy of content about cancer-related topics on social networks and/or types of misinformation that are being shared. ...
Article
Social media is widely used globally by patients, families of patients, health professionals, scientists, and other stakeholders who seek and share information related to cancer. Despite many benefits of social media for cancer care and research, there is also a substantial risk of exposure to misinformation, or inaccurate information about cancer. Types of misinformation vary from inaccurate information about cancer risk factors or unproven treatment options to conspiracy theories and public relations articles or advertisements appearing as reliable medical content. Many characteristics of social media networks—such as their extensive use and the relative ease it allows to share information quickly—facilitate the spread of misinformation. Research shows that inaccurate and misleading health‐related posts on social media often get more views and engagement (e.g., likes, shares) from users compared with accurate information. Exposure to misinformation can have downstream implications for health‐related attitudes and behaviors. However, combatting misinformation is a complex process that requires engagement from media platforms, scientific and health experts, governmental organizations, and the general public. Cancer experts, for example, should actively combat misinformation in real time and should disseminate evidence‐based content on social media. Health professionals should give information prescriptions to patients and families and support health literacy. Patients and families should vet the quality of cancer information before acting upon it (e.g., by using publicly available checklists) and seek recommended resources from health care providers and trusted organizations. Future multidisciplinary research is needed to identify optimal ways of building resilience and combating misinformation across social media.
... To map and characterize studies addressing audience-related phenomena with a focus on migrant people and groups, this study, besides incorporating the PRISMA guidelines (Page et al., 2021) applicable to this proposal, also draws on theoretical and methodological contributions from similar reviews conducted in different subfields of Communication Sciences (see Nightingale, 2013; Jacks et al., 2014, 2017; Wang et al., 2019; Loecherbach et al., 2020; Santos & Miranda, 2022). Building on these models, this SLR offers an independent assessment of a set of studies gathered through a rigorous search strategy in order to objectively answer the following questions: RQ1 – What are the characteristics, methodological approaches, theoretical frameworks, and research procedures of empirical studies investigating audiences with a focus on international migrants? ...
Article
Media research, and audience studies in particular, began to take an interest in ethnic and racial issues especially from the 1990s onwards. Since then, a vast conceptual framework, informed mainly by diaspora studies, has been incorporated into the field's research protocols, becoming fundamental to understanding various aspects shaped by transnational processes. Aiming to map studies that, over recent decades, have proposed an empirical examination of audiences with a focus on migrant people and groups, this article seeks to understand how these works were developed and what their main characteristics and interests are. The text reports the results of a systematic literature review whose retrieved records comprise articles published in peer-reviewed journals indexed in the Scopus, Web of Science, and SciELO databases up to January 2023. Based on the information extracted from these databases and on a content analysis of the full texts, this review identifies the main characteristics of the included studies, such as publication dates, authors, journals, methodological choices, objects, foci and approaches to reception, regional aspects, and the main references and phenomena analyzed.
... Most of the former studies [54][55][56][57] demonstrated the prevalence of disease- and health-related misinformation on different social media networks. Previous findings showed that, as it spreads, misinformation tends to bypass established and traditional media such as television and newspapers. ...
Article
Myths, misinformation, and fact-like posts spread via social media during the COVID-19 pandemic had an enormous effect on psychological health. This study aimed to investigate social media-based COVID-19 posts and the psychological health status of participants. A cross-sectional, online survey-based study was conducted between April and October 2021 using a structured and semi-structured questionnaire, involving 1200 active social network users in Bangladesh. Depression, anxiety, and stress were assessed using the Depression, Anxiety, and Stress Scale (DASS-21), while the Insomnia Severity Index (ISI) measured insomnia severity for selected participants. Internal reliabilities were calculated with Cronbach's alpha coefficients (cut-off point 0.70). Unrelated multivariate logistic regression explored correlations among outcome errors, with the model assessing the impact of selected independent variables on mental health. The findings demonstrated that 27.8% of individuals spread facts whereas 7.4% spread myths and misinformation about COVID-19 on social networks. Furthermore, 28.1% and 36.7% shared obstinate and concerning posts, respectively. The prevalence of depression, anxiety, and stress symptoms, ranging from mild to extremely severe, was 43.9%, 30.9%, and 23.8%, respectively. However, 2.8% had a severe level of insomnia. Facts, myths, tour attendance, and no-mask group photos were significantly associated with anxiety, and with a lower likelihood of experiencing anxiety. Interestingly, circulating such activities on social networks had no significant association with depression, stress, or insomnia. The spread of misinformation on social media undermines any effort to contain COVID-19 infection. The findings strongly recommend using fact-checking facilities and adapting to pandemic conditions to maintain a lower prevalence of depression, anxiety, stress, and insomnia.
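The DASS-21 scoring used in the study above follows a standard procedure: each of the three subscales (depression, anxiety, stress) has seven items rated 0–3, and each subscale sum is doubled so it is comparable with the full DASS-42. A minimal sketch follows; the item-to-subscale mapping and severity cut-offs shown are the commonly published ones and should be verified against the scale manual.

```python
# Minimal DASS-21 scoring sketch (illustrative; verify groupings/cut-offs
# against the official DASS manual before any real use).
DEPRESSION = [3, 5, 10, 13, 16, 17, 21]
ANXIETY = [2, 4, 7, 9, 15, 19, 20]
STRESS = [1, 6, 8, 11, 12, 14, 18]

def subscale_score(responses, items):
    """responses: dict item_number -> rating in 0..3; doubled to DASS-42 scale."""
    return 2 * sum(responses[i] for i in items)

def depression_severity(score):
    # Commonly used DASS depression cut-offs.
    for label, upper in [("normal", 9), ("mild", 13), ("moderate", 20), ("severe", 27)]:
        if score <= upper:
            return label
    return "extremely severe"

responses = {i: 1 for i in range(1, 22)}  # hypothetical: every item rated 1
d = subscale_score(responses, DEPRESSION)
print(d, depression_severity(d))  # 14 moderate
```

With every item rated 1, each subscale sums to 7 and doubles to 14, which falls in the "moderate" band for depression under the usual cut-offs.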
... However, a majority of agent-based misinformation infection models rely on infection probabilities that are static for each user and for each topic of misinformation that is explored. In reality, the likelihood of information spread between social media users has a complex relationship to user preferences, user community, and the topic being discussed [30,31]. The lack of such dynamism in static infection models limits investigation of how countermeasure effectiveness varies in response to these variables. ...
Article
We develop a simulation framework for studying misinformation spread within online social networks that blends agent-based modeling and natural language processing techniques. While many other agent-based simulations exist in this space, questions over their fidelity and generalization to existing networks in part hinder their ability to drive policy-relevant decision making. To partially address these concerns, we create a ’digital clone’ of a known misinformation sharing network by downloading social media histories for over ten thousand of its users. We parse these histories to both extract the structure of the network and model the nuanced ways in which information is shared and spread among its members. Unlike many other agent-based methods in this space, information sharing between users in our framework is sensitive to topic of discussion, user preferences, and online community dynamics. To evaluate the fidelity of our method, we seed our cloned network with a set of posts recorded in the base network and compare propagation dynamics between the two, observing reasonable agreement across the twin networks over a variety of metrics. Lastly, we explore how the cloned network may serve as a flexible, low-cost testbed for misinformation countermeasure evaluation and red teaming analysis. We hope the tools explored here augment existing efforts in the space and unlock new opportunities for misinformation countermeasure evaluation, a field that may become increasingly important to consider with the anticipated rise of misinformation campaigns fueled by generative artificial intelligence.
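The core idea described above — an agent-based network model in which the chance of misinformation spreading between users depends on the topic and on each user's preferences rather than a single static infection probability — can be sketched in a toy form. Everything below (the network, topics, and the affinity-based probability function) is invented for illustration; it is not the authors' implementation.

```python
import random

random.seed(7)

# Toy agent-based misinformation spread: the infection probability is a
# function of the receiving user's affinity for the post's topic, rather
# than a single static constant (illustrative sketch only).
class User:
    def __init__(self, uid, preferences):
        self.uid = uid
        self.preferences = preferences  # topic -> affinity in [0, 1]
        self.infected = False

def infection_probability(user, topic, base_rate=0.2):
    # Spread likelihood rises with the receiver's affinity for the topic.
    return min(1.0, base_rate * (0.5 + user.preferences.get(topic, 0.0)))

def simulate(users, edges, seed_ids, topic, steps=10):
    for uid in seed_ids:
        users[uid].infected = True
    for _ in range(steps):
        newly = []
        for src, dst in edges:
            if users[src].infected and not users[dst].infected:
                if random.random() < infection_probability(users[dst], topic):
                    newly.append(dst)
        for uid in newly:
            users[uid].infected = True
    return sum(u.infected for u in users.values())

# Small ring-plus-shortcuts network of 50 users with random topic affinities.
topics = ["vaccines", "zika"]
users = {i: User(i, {t: random.random() for t in topics}) for i in range(50)}
edges = [(i, (i + 1) % 50) for i in range(50)] + [(i, (i + 7) % 50) for i in range(50)]
reached = simulate(users, edges, seed_ids=[0], topic="vaccines")
print(f"{reached} of {len(users)} users reached")
```

Running the same network with a different `topic` changes which users are likely to pass the message on, which is exactly the dynamism the abstract contrasts with static infection models.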
... Indeed, online information about health can cover various aspects, including the formation of public opinion, effects on public discourse and agenda setting, interactions between doctors and patients, as well as influences on health behaviours in the short, medium, or long term. 28 Further research is required to identify vulnerable populations and gain a better understanding of sociodemographic and ideological factors influencing users' behaviour. Additionally, cultural differences in information consumption and behaviours must also be considered to develop targeted and effective interventions and mitigate the influence of health misinformation. ...
Article
Key messages:
1) Monitoring social media is important to understand public perceptions, biases, and false beliefs.
2) Drawing conclusions on how social media affects health behaviour is difficult because measures are unstandardised, sources are limited, and data are incomplete and biased.
3) Rigorous research is needed from varied settings and demographics to improve understanding of the effect of social media on health behaviour.
Chapter
The massive development of digital technology has accelerated the pace of disruption in almost every industry, creating immense ambiguity and continuing uncertainty in the business environment. The need for organizations to adapt to rapid digital transformation has driven businesses to remain competitive and relevant in their industries. The COVID-19 pandemic created further challenges for organizations seeking to improve their digital maturity. This study focuses on digital transformation and its impact on organizational performance, narrowing the research to the telecommunication industry in the context of Brunei Darussalam. The contribution of this study is expected to fill the gap in the literature regarding antecedents of a successful digital transformation.
Article
Online video sharing platforms like YouTube (Google LLC, San Bruno, CA, USA) have become a substantial source of health information. We sought to conduct a systematic review of studies assessing the overall quality of perioperative anesthesia videos on YouTube. We searched Embase, MEDLINE, and Ovid for articles published from database inception to 1 May 2023. We included primary studies evaluating YouTube videos as a source of information regarding perioperative anesthesia. We excluded studies not published in English and studies assessing acute or chronic pain. Studies were screened and data were extracted in duplicate by two reviewers. We appraised the quality of studies according to the social media framework published in the literature. We used descriptive statistics to report the results using mean, standard deviation, range, and n/total N (%). Among 8,908 citations, we identified 14 studies that examined 796 videos with 59.7 hr of content and 47.5 million views. Among the 14 studies that evaluated the video content quality, 17 different quality assessment tools were used, only three of which were externally validated (Global Quality Score, modified DISCERN score, and JAMA score). Per global assessment rating of video quality, 11/13 (85%) studies concluded the overall video quality as poor. Overall, the educational content quality of YouTube videos evaluated in the literature accessible as an educational resource regarding perioperative anesthesia was poor. While these videos are in demand, their impact on patient and trainee education remains unclear. A standardized methodology for evaluating online videos is merited to improve future reporting. A peer-reviewed approach to online open-access videos is needed to support patient and trainee education in anesthesia. Open Science Framework (https://osf.io/ajse9); first posted, 1 May 2023.
Article
The text considers several critical issues related to the role of false information in the COVID-19 pandemic. It mainly focuses on social media, which often resemble echo chambers responsible for disseminating disinformation. In these echo chambers, users close themselves off from arguments and justifications different from their own, often with a strong tendency towards polarization of views and attitudes. A particular case of echo chambers is the conspiracy mentality propagated in social media, promoting conspiratorial beliefs about COVID-19, which, besides offering an alternative understanding of reality, deepens distrust towards epistemic authorities and methods of producing scientific knowledge. This indicates an epistemic crisis as a consequence of the pandemic, which must be addressed in order to rebuild and protect epistemic trust. The authors conclude that the consequence of this crisis is a regression of cognitive abilities, which may, in a feedback loop, exacerbate the epistemic crisis.
Article
The World Economic Forum listed massive digital misinformation as one of the main threats to our society. The spreading of unsubstantiated rumors may have serious consequences for public opinion, as in the case of rumors about Ebola causing disruption to health-care workers. In this work we target Facebook to characterize the information consumption patterns of 1.2 M Italian users with respect to verified (science news) and unverified (conspiracy news) content. Through a thorough quantitative analysis we provide important insights into the anatomy of the system across which misinformation might spread. In particular, we show that users’ engagement with verified (or unverified) content correlates with the number of friends having similar consumption patterns (homophily). Finally, we measure how this social system responded to the injection of 4,709 items of false information. We find that frequent (and selective) exposure to specific kinds of content (polarization) is a good proxy for the detection of homophilous clusters where certain kinds of rumors are more likely to spread.
Article
Background: Over the last two decades, the incidence and mortality rates of gynecologic cancers have increased at a constant rate in China. Gynecologic cancers have become one of the most serious threats to women's health in China. With the widespread use of social media, an increasing number of individuals have employed social media to produce, seek, and share cancer-related information. However, health information on social media is not always accurate. Health-related, and especially cancer-related, misinformation has been widely spread on social media, which can affect individuals' attitudinal and behavioral responses to cancer. Objective: The aim of this study was to examine the nature and diffusion of gynecologic cancer-related misinformation on Weibo, the Chinese equivalent of Twitter. Methods: A total of 2691 tweets related to two gynecologic cancers (breast cancer and cervical cancer) posted on Weibo from June 2015 to June 2016 were extracted using a Python web crawler. Two medical school graduate students with expertise in gynecologic diseases were recruited to code the tweets, differentiating true information from misinformation and identifying the types of falsehoods. The diffusion characteristics of gynecologic cancer-related misinformation were compared with those of the true information. Results: While most of the gynecologic cancer-related tweets provided medically accurate information, approximately 30% of them were found to contain misinformation. Furthermore, tweets about cancer treatment contained a higher percentage of misinformation than prevention-related tweets. Nevertheless, the prevention-related misinformation diffused significantly more broadly and deeply than true information on social media. Conclusions: The findings of this study suggest the need to control and reduce cancer-related misinformation on social media, with efforts from both service providers and medical professionals. More specifically, it is important to correct falsehoods related to the prevention of gynecologic cancers on social media and to increase individuals' capacity to assess the veracity of Web-based information, thereby curbing the spread and minimizing the consequences of cancer-related misinformation.
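When two coders independently label tweets as true information or misinformation, as in the Weibo study above, intercoder reliability is conventionally quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. A minimal sketch (the labels below are hypothetical illustrations, not the study's data):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two coders over the same items."""
    n = len(labels_a)
    # Observed agreement: fraction of items both coders labeled identically
    p_observed = sum(x == y for x, y in zip(labels_a, labels_b)) / n
    # Expected agreement: product of each coder's marginal label frequencies
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical codes: T = true information, F = misinformation
coder1 = ["T", "T", "F", "F", "T", "F"]
coder2 = ["T", "F", "F", "F", "T", "F"]
kappa = cohens_kappa(coder1, coder2)
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; published content analyses typically report the statistic alongside the raw agreement rate.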
Article
The present study investigates the characteristics of discussion of conspiracy theories about the Zika virus outbreak of 2015-16 on Twitter. Content and social network analysis of a dataset of 25,162 original Tweets about Zika virus conspiracy theories showed that relative to debunking messages, conspiracy theories spread through a more decentralized network, are more likely to invoke supposedly knowledgeable authorities in making arguments, and ask more rhetorical questions. These trends can be understood in the context of previous work on conspiracy theories, including the "just asking questions" style of rhetoric, the importance of sourcing and authority, and the tendency to simultaneously consider many different potential conspiracies that might underlie an important topic or event.
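The finding that conspiracy theories spread through a more decentralized network than debunking messages can be quantified with Freeman's degree centralization, which is 1.0 for a perfect star (one dominant hub) and 0.0 for a network where every account has the same degree. A minimal pure-Python sketch on toy retweet networks (the edge lists are illustrative, not the study's data):

```python
from collections import defaultdict

def degree_centralization(edges):
    """Freeman degree centralization of an undirected network:
    1.0 for a star (one hub connected to everyone), 0.0 for a regular graph."""
    degree = defaultdict(int)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    n = len(degree)
    # Normalized degree centrality of each node
    centrality = [d / (n - 1) for d in degree.values()]
    c_max = max(centrality)
    # Sum of gaps to the most central node, scaled by the star-graph maximum
    return sum(c_max - c for c in centrality) / (n - 2)

# Illustrative retweet networks: one dominant hub vs. a ring with no hub
hub_and_spoke = [("hub", f"user{i}") for i in range(1, 5)]
ring = [(f"user{i}", f"user{(i + 1) % 5}") for i in range(5)]
```

On these toy graphs the hub-and-spoke network scores 1.0 and the ring scores 0.0; a debunking campaign dominated by a few official accounts would sit near the first extreme, a diffuse conspiracy discussion nearer the second.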
Article
Background: More people are searching for immunization information online and potentially being exposed to misinformation and antivaccination sentiment in content and discussions on social media platforms. As vaccination coverage rates remain suboptimal in several developed countries, and outbreaks of vaccine-preventable diseases become more prevalent, it is important that we build on previous research by analyzing themes in online vaccination discussions, including those that individuals may see without actively searching for information on immunization. Objective: The study aimed to explore the sentiments and themes behind an unsolicited debate on immunization in order to better inform public health interventions countering antivaccination sentiment. Methods: We analyzed and quantified 117 user-driven open-ended comments on immunization posted in the Comments section of a Facebook advertisement that targeted Canadian parents for recruitment into a larger study on immunization. Then, 2 raters coded all comments using content analysis. Results: Of 117 comments, 85 were posted by unique commentators, with most being female (65/85, 77%). The largest proportion of the immunization comments were positive (51/117, 43.6%), followed by negative (41/117, 35.0%), ambiguous (20/117, 17.1%), and hesitant (5/117, 4.3%). Inaccurate knowledge (27/130, 20.8%) and misperceptions of risk (23/130, 17.7%) were most prevalent in the 130 nonpositive comments. Other claims included distrust of pharmaceutical companies or government agencies (18/130, 13.8%), distrust of the health care system or providers (15/130, 11.5%), past negative experiences with vaccination or beliefs (10/130, 7.7%), and attitudes about health and prevention (10/130, 7.7%). Almost 40% (29/74, 39%) of the positive comments communicated the risks of not vaccinating, followed by judgments on the knowledge level of nonvaccinators (13/74, 18%). 
A total of 10 positive comments (10/74, 14%) specifically refuted the link between autism and vaccination. Conclusions: The presence of more than 100 unsolicited user-driven comments on a platform not intended for discussion, nor providing any information on immunization, illustrates the strong sentiments associated with immunization and the arbitrariness of the online platforms used for immunization debates. Health authorities should be more proactive in finding mechanisms to refute misinformation and misperceptions that are propagating uncontested online. Online debates and communications on immunization need to be identified by continuous monitoring in order for health authorities to understand the current themes and trends, and to engage in the discussion.
Article
Social media has become one of the most powerful and ubiquitous means by which individuals curate, share, and communicate information with their friends, family, and the world at large. Indeed, 90% of American adolescents are active social media users, as are 65% of American adults (Perrin, 2015; see also Duggan & Brenner, 2013). Despite this, psychologists are only beginning to understand the mnemonic consequences associated with social media use. In this article, we distill this nascent literature by focusing on two primary factors: the type of information (personal vs. public) and the role (producer vs. consumer) individuals play when engaging with social media. In particular, we highlight research examining induced forgetting for personal information as well as false memories and truthiness for public information. We end by providing some tentative conclusions and a discussion of areas in need of additional research that will provide a more holistic understanding of the mnemonic consequences associated with social media use.
Book
In a post-truth, fake news world, we are particularly susceptible to the claims of pseudoscience. When emotions and opinions are more widely disseminated than scientific findings, and self-proclaimed experts get their expertise from Google, how can the average person distinguish real science from fake? This book examines pseudoscience from a variety of perspectives, through case studies, analysis, and personal accounts that show how to recognize pseudoscience, why it is so widely accepted, and how to advocate for real science. Contributors examine the basics of pseudoscience, including issues of cognitive bias; the costs of pseudoscience, with accounts of naturopathy and logical fallacies in the anti-vaccination movement; perceptions of scientific soundness; the mainstream presence of "integrative medicine," hypnosis, and parapsychology; and the use of case studies and new media in science advocacy. © 2018 Massachusetts Institute of Technology. All rights reserved.
Article
Background: Internet videos, though popular sources of public health information, are often unverified and anecdotal. We critically evaluated YouTube videos about Zika virus available during the recent Zika pandemic. Methods: One hundred and one videos were retrieved from YouTube (search term: zika virus). Based upon content, they were classified as informative, misleading, or personal experience videos. The quality and reliability of these videos were evaluated using standardized tools. The viewer interaction metrics (e.g., number of views, shares, etc.), video characteristics (video length, etc.), and the sources of upload were also assessed, and their relationships with the type, quality, and reliability of the videos analyzed. Results: Overall, 70.3% of the videos were informative, while 23.8% and 5.9% were misleading and related to personal experiences, respectively. Although shorter in length (P < 0.01) and of superior quality (P < 0.01), informative videos were viewed (P = 0.054), liked (P < 0.01), and shared (P < 0.05) less often than their misleading counterparts. Videos from independent users were more likely to be misleading (adjusted OR = 6.48, 95% CI: 1.69 - 24.83) and of poorer (P < 0.05) quality and reliability than government/news agency videos. Conclusion: A considerable proportion of the videos were misleading. They were more popular than informative videos and could potentially spread misinformation. Videos from trustworthy sources like universities/health organizations were scarce. Curation/authentication of health information on online video platforms (like YouTube) is necessary. We discuss means to harness them as useful sources of information and highlight measures to curb the dissemination of misinformation during public health emergencies. (The full article may be accessed at the free eprint link: https://www.tandfonline.com/eprint/rwJx2g7i9Hq9mXuiI9nN/full)
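The adjusted odds ratio reported above (OR = 6.48, 95% CI: 1.69 - 24.83) comes from a regression model, but the underlying calculation can be illustrated with a crude odds ratio and Wald confidence interval from a 2 x 2 table. A minimal sketch (the counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    odds_ratio = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf method
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log)
    upper = math.exp(math.log(odds_ratio) + z * se_log)
    return odds_ratio, lower, upper

# Hypothetical counts: misleading vs. informative videos,
# uploaded by independent users vs. government/news agencies
or_, lo, hi = odds_ratio_ci(12, 28, 3, 58)
```

An interval that excludes 1 (as in the study's 1.69 - 24.83) indicates a statistically significant association; an adjusted OR additionally controls for covariates via logistic regression rather than this single-table computation.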
Article
Background: Despite major progress in global vaccination coverage, immunization rates are falling, resulting in outbreaks of vaccine-preventable diseases. This study analyses content and source of the most popular tweets related to a recent case in Spain where an unvaccinated child contracted and later died from diphtheria. Understanding the characteristics of these tweets in the context of vaccination could inform efforts by health promotion professionals to increase their reach and impact. Methods: We extracted tweets containing keywords related to the diphtheria case (from 1 May to 15 July 2015). We explored the prevalence of terms relating to policy and misinformation and manually coded the 194 most popular tweets (retweeted 100 or more times) with regard to source, topic, tone and sentiment. Results: A total of 722 974 tweets were collected. Prevalence of terms relating to policy and misinformation increased at the onset of the case and after the death of the child. Popular tweets (194) were either pro-vaccination (58%) or neutral, with none classified as anti-vaccination. Popular topics included criticism towards anti-vaccination groups (35%) and effectiveness of immunization (22%). Popular tweets were informative (47%) or opinions (53%), which mainly expressed frustration (24%) or humour/sarcasm (23%). Popular Twitter accounts were newspaper and TV channels (15%), as well as individual journalists and authors of popular science (13.4%). Conclusions: Healthcare organizations could collaborate with popular journalists or news outlets and employ authors of popular science to disseminate health information on social media, while addressing public concerns and misinformation in accessible ways.
Article
Background: Over the last few decades, patients have increasingly been searching for health information on the Internet. This aspect of information seeking is important, especially for people affected by chronic pathologies who require lifelong treatment and management. These people are usually very well informed about the disease but are nonetheless vulnerable to hopes of being cured or saved, often amplified by misinformation, myths, legends, and therapies that are not always scientifically proven. Many studies suggest that some individuals prefer to rely on the Internet as their main source of information, often hindering the patient-doctor relationship. A professional approach is imperative to maintain confidentiality, honesty, and trust in the medical profession. Objective: We aimed to examine, in a medically supervised Italian web community (SMsocialnetwork.com) dedicated to people with Multiple Sclerosis (pwMS), the posts shared by users, and to verify through an online questionnaire the reliability of the content of posts shared by users identified as Influencers. Methods: We grouped the posts published on SMsocialnetwork from April to June 2015 into those with medical content (scientifically correct or fake news) and those related to social interactions. Later, we gave the community a questionnaire asking them to identify the three users/Influencers providing the most reliable advice for everyday life with MS and the three users/Influencers providing the most useful information about MS treatments. Results: 308 posts reported scientific and relevant medical information, whereas 72 posts included pieces of fake news; 1420 posts were of general interest. Four of the 6 Influencers had written only posts with correct medical information (3 were pwMS, 1 was a Neurologist) and never any fake news. The remaining 2 appointed Influencers (2 pwMS) had written only posts about general interests. Conclusion: The identification of fake news and its authors showed that the latter were never appointed as Influencers. SMsocialnetwork.com acted as a "web safe environment" where the Influencers contributed by sharing only correct medical information and never fake news. We speculate that the presence of neurologists and psychologists supervising the information flow might have contributed to reducing the risk of fake news spreading and to preventing it from acquiring authoritative meaning.
Article
Why do people believe blatantly inaccurate news headlines ("fake news")? Do we use our reasoning abilities to convince ourselves that statements that align with our ideology are true, or does reasoning allow us to effectively differentiate fake from real regardless of political ideology? Here we test these competing accounts in two studies (total N = 3446 Mechanical Turk workers) by using the Cognitive Reflection Test (CRT) as a measure of the propensity to engage in analytical reasoning. We find that CRT performance is negatively correlated with the perceived accuracy of fake news, and positively correlated with the ability to discern fake news from real news - even for headlines that align with individuals' political ideology. Moreover, overall discernment was actually better for ideologically aligned headlines than for misaligned headlines. Finally, a headline-level analysis finds that CRT is negatively correlated with perceived accuracy of relatively implausible (primarily fake) headlines, and positively correlated with perceived accuracy of relatively plausible (primarily real) headlines. In contrast, the correlation between CRT and perceived accuracy is unrelated to how closely the headline aligns with the participant's ideology. Thus, we conclude that analytic thinking is used to assess the plausibility of headlines, regardless of whether the stories are consistent or inconsistent with one's political ideology. Our findings therefore suggest that susceptibility to fake news is driven more by lazy thinking than it is by partisan bias per se - a finding that opens potential avenues for fighting fake news.