Can’t We Just Disregard Fake News? The Consequences of Exposure to Inaccurate Information

David N. Rapp and Nikita A. Salovich
Northwestern University, Evanston, IL, USA

Corresponding author: David N. Rapp, Northwestern University, 2120 Campus Drive, Evanston, IL 60208, USA. Email: rapp@northwestern.edu

Abstract

People routinely encounter inaccurate information, from fake news designed to confuse audiences, to communications with inadvertent mistakes, to stories made up to entertain readers. The hope is that these inaccuracies can be easily ignored, exerting little influence on our thoughts and actions. Unfortunately, being exposed to inaccuracies leads to problematic consequences. After reading inaccurate statements, readers exhibit clear effects of those contents on their decisions and problem-solving. This occurs even when readers possess appropriate prior knowledge to evaluate and reject the inaccuracies. Exposure to inaccurate information leads to confusion about what is true, doubt about accurate understandings, and subsequent reliance on falsehoods. Interventions and technologies designed to address these effects by encouraging critical evaluation can support effective comprehension and learning.

Keywords: memory, comprehension, reading, fake news, misinformation
Policy Insights from the Behavioral and Brain Sciences, 2018, Vol. 5(2), 232–239
© The Author(s) 2018
DOI: 10.1177/2372732218785193
journals.sagepub.com/home/bbs
Article reuse guidelines: sagepub.com/journals-permissions
Tweet

Fake news and false claims can lead to confusion, doubt, and reliance on inaccurate content. What does this look like and how can we prevent it?
Key Points

- Readers are routinely exposed to inaccurate content, including fake news, whether they realize it or not.
- After reading false information, people often become confused and doubt whether their accurate knowledge is useful.
- People often rely on the inaccurate information they read to complete subsequent tasks.
- Studies of these effects inform interventions and technologies to support comprehension and learning.
- Both experts and laypeople must acknowledge the dangers of encountering inaccuracies, engaging in caution and evaluation when comprehending information.
Introduction

In times marked by unprecedented access to archived information, up-to-the-minute news reports, and rapid communication, we routinely encounter inaccurate information—whether we are aware of it or not. Flawed arguments, inconsistent claims, unsubstantiated rumors, and unsupported accusations can strategically or unintentionally distort audience understandings. Misstatements appear in formal and informal messages and, despite well-intended retractions, may lead people astray. Reports and narratives often change over time, rendering what was once accepted to be true as outdated and irrelevant. Even the fiction we read for pleasure contains inaccurate information involving made-up characters, settings, and situations. If we relied on everything we read without carefully contemplating the veracity of the source and content of communications, we would be vastly misinformed and ill-equipped to make effective decisions. But deciding what to evaluate and how to evaluate it proves challenging, particularly when accusations of fake news and bias muddy contemporary discourse (e.g., Del Vicario et al., 2016; Lazer et al., 2018; Rapp & Braasch, 2014; Vosoughi, Roy, & Aral, 2018).
One useful resource for dealing with the inaccurate information encountered in daily life is prior knowledge. Existing understandings and prior experiences, when appropriately accessed, benefit critical evaluation. By consulting valid understandings, people can interrogate incoming discourse to filter out misinformation and disinformation. This epistemic cognition involves knowledge acquisition, memory retrieval,
reasoning, problem-solving, and behavioral decisions (e.g., Greene & Yu, 2016). These processes operate effectively when people have (a) the appropriate and valid prior knowledge to engage in evaluation and (b) the motivation and practiced skills to support it. Many instructional activities aim to maximize the quantity and quality of prior knowledge, providing tools for engaging in skeptical inquiry. The conundrum is that even when people possess useful knowledge, and even when they hold well-intentioned, well-practiced epistemic mindsets, they can still be influenced by inaccurate information.
Consider that people use the inaccurate information they have recently read to make decisions and solve problems (e.g., Marsh, Meade, & Roediger, 2003; Rapp, 2016). For example, when participants are tasked with reading statements containing factual inaccuracies (e.g., “Someone, maybe an actor named Oswald, has killed Lincoln.”), they sometimes reproduce that information on subsequent tasks (e.g., to answer “Who assassinated Abraham Lincoln?”). This occurs when people are unaware that the information is wrong, which indicates they are learning information that is new to them. But reliance on inaccuracies also occurs when people should be aware the information is wrong, given their prior knowledge. In contrast to what population surveys and knowledge norms may suggest, people reproduce inaccurate information that they should know is patently incorrect. Even after being asked to provide evidence for their accurate understandings, people’s decisions are influenced by recently acquired inaccuracies (Fazio, Barber, Rajaram, Ornstein, & Marsh, 2013; Rapp, 2008).
People’s accurate knowledge should protect them from considering and using inaccuracies, but prior knowledge is not always successfully leveraged in the service of comprehension. Despite possessing useful knowledge and their best epistemic efforts, people are often influenced by inaccurate claims, data, and arguments. Exposure to inaccurate information creates at least three comprehension problems—confusion, doubt, and reliance—all of which are cause for concern given how frequently people encounter falsehoods, including fake news.
Confusion

Prior knowledge and experiences, when valid and relevant, are useful sources of information. A problem with exposure to inaccurate information is that it can create confusion about those resources, even when it should not. Consider the statement, “Ronald Reagan was assassinated by John Hinckley Jr.” Of course, Reagan survived the attempted assassination, as most people know, having learned it from news programs, textbooks, documentaries, or history courses. But when people read information suggesting that a well-known fact like this should be questioned, they can exhibit confusion, even though they should know that affirming the inaccurate statement would be unreasonable.
Evidence supporting this confusion comes from psychological science examining people’s moment-by-moment reading of texts. As people attempt to comprehend text, they spend more time reading and rereading sentences that might be difficult or incoherent, and less time on simple sentences aligning with their expectations. Reading times for sentences are therefore a useful measure for determining the potential difficulties that readers exhibit as texts unfold. In one well-replicated finding, readers slow down for information inconsistent with what they expect (the inconsistency effect; see Rapp & Mensink, 2011, for discussion). For example, if a character in a story is described as angry and prone to violence, readers slow down when reading subsequent sentences in which that character behaves in an unexpected, peaceful manner, as compared with that character behaving in an expected, irritated manner (Rapp, Gerrig, & Prentice, 2001).
In general, reading times are useful for measuring the confusions that readers should exhibit when they encounter information that is inaccurate and runs counter to what they already know to be reasonable. But reading times also reveal that, following exposure to inaccurate information, people begin to exhibit confusion about information that should be obvious and expected. In a series of experiments, participants read stories describing events that, based on prior norming, they should have known to be true (Rapp, 2008). This included well-known historical events such as “The Titanic was sunk by icebergs” and “The South lost the Civil War.” The stories sometimes included preceding contexts that set up the possibility that the well-known event might not actually have happened. Consider the following example:
George Washington was a famous figure after the Revolutionary War. He was a popular choice to lead the new country. Washington, however, wanted to retire after the war. The long years as a general had left him tired and frail. Washington was eventually asked to run for President of the United States. He wrote that he would be unable to accept the nomination. People hoped that John Adams might consider running for the position.
Participants should be aware that even in the face of the suspenseful description offered here, history played out in a way such that Washington would become President. They should have little difficulty reading a historically accurate conclusion to the story (i.e., “George Washington was elected first President of the United States.”) given that it aligns with well-known historical events. However, people’s reading times for outcome sentences describing historically accurate conclusions actually increased after reading story contexts that suggested the well-known event might not have happened (Jacovina, Hinze, & Rapp, 2014).
Recall that reading slowdowns are a useful indicator of the difficulty people exhibit integrating story sentences with what they already know. Participants here exhibited slowdowns for sentences containing ideas that should have been familiar based on norming, and that they therefore should have had little difficulty comprehending. The story contexts set up expectations that ran counter to what they knew happened, resulting in the slowdowns. These slowdowns are thus a useful marker of confusion, but a confusion that should not have emerged if readers had successfully leveraged their prior knowledge to comprehend the story outcomes, given what they already knew about the event. Consider the fact that a brief description could create confusion about information people should know well, that they tend to be very confident about knowing, and that has been repeated throughout their lives (and perhaps personally verified). In these experiments, participants’ understandings of text descriptions, for which they could and should rely on their prior knowledge, were overcome by contexts calling that information into question. The content they read led to confusion.
Another method for measuring reader confusion involves asking people questions about the information they have recently read. Responses that suggest they are contemplating multiple possibilities, despite only one being viable, are revealing markers of confusion. Using such responses to examine people’s confusion after receiving mixtures of accurate and inaccurate information, several experiments have looked at the causes people mention when describing events (e.g., H. M. Johnson & Seifert, 1998; Rich & Zaragoza, 2016; Wilkes & Reynolds, 1999). Consider the following scenario:

January 25, 8:58 p.m.: A serious fire was reported in the storage hall, already out of control and requiring instant response.

January 26, 4:30 a.m.: Message received from Police Investigator Lucas saying that they have reports that cans of oil paint and pressurized gas cylinders had been present in the closet before the fire.

January 26, 10:40 a.m.: A second message received from Police Investigator Lucas regarding the investigation into the fire. It stated that the closet reportedly containing cans of oil paint and gas cylinders had actually been empty before the fire.
Participants in these experiments read about the causes of various events like this warehouse fire (e.g., H. M. Johnson & Seifert, 1994; Wilkes & Leatherbarrow, 1988). Some of the participants read that an earlier described cause did not offer a viable, causal explanation for the event. These readers were presented with information that retracted the earlier causal possibility.

After reading the texts, participants answered questions and recalled what they had read. Many participants, when asked to indicate the cause of the fire, continued to mention both causes, even after the earlier one had been disconfirmed. This suggests that rather than being confident about the appropriate cause of an event, readers exhibit confusion about what happened, even when the text was clear as to how and why the event had occurred. Given their confusion, many participants opted to mention disconfirmed causes without careful review of what might be reasonable or expected given the text description. Some evidence even suggests that a lack of confidence about what happened in a given situation can increase endorsements of disconfirmed causes (Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012; Seifert, 2002). Confusion can lead to participant responses that make it unclear exactly what they believe to be accurate and inaccurate. In these studies, people read relatively straightforward accounts; more convoluted descriptions, containing mixtures of accuracies and inaccuracies, would likely engender even more confusion.
Doubt

Contemplating whether to reconsider and potentially revise prior knowledge is useful when people are uncertain about ideas or discover that what they know is wrong. An unfortunate consequence of reading inaccurate information is that people begin to express uncertainty about the validity of ideas they should be confident are obviously true. Ignoring, discounting, or revising useful knowledge leads to suboptimal problem-solving and decision-making that can affect an individual’s social, mental, and physical well-being.

Consider the following assertions: Mental illness is not contagious. Taking a foreign language broadens your mind. Aerobic exercise strengthens your heart and lungs. Each of these statements is true, supported by considerable empirical evidence, and associated with policy guidelines that encourage related useful behaviors. When asked to judge whether an assertion or its counter is true (e.g., Wearing a seatbelt can increase your chances of living through an accident vs. Wearing a seatbelt can decrease your chances of living through an accident), people show clear preferences for the appropriate claim (i.e., seatbelts save lives). But after encountering information suggesting those assertions are incorrect, participants indicate doubt as to what is preferable (Prentice, Gerrig, & Bailis, 1997; Wheeler, Green, & Brock, 1999). Here is an example:
Americans brush their teeth too much—in the long-run it’s going to do us more harm than good . . . Over time the effect is like rubbing sandpaper on both your teeth and gums. That’s why so many people are having problems with their gums . . . tooth brushing frequently leads to gum disease.
The assertion made in this excerpt is inaccurate; based on the preponderance of evidence, toothbrushing is good for your dental health. Although we might be able to imagine scenarios in which toothbrushing could be harmful (e.g., brushing too frequently with improper equipment), all things being equal, the practice has clear benefits. Nevertheless, when participants were asked to read texts calling well-regarded assertions into question, they exhibited subsequent difficulty making decisions about whether those assertions were true. After reading the above excerpt, participants took longer to reject the false claim “Brushing your teeth can lead to gum disease” than if they had read an excerpt supporting the benefits of brushing. Rather than making a quick decision about the veracity of the assertions, their responses suggested they now doubted what they had known to be true for most if not all of their lives (Gerrig & Prentice, 1991).
Doubt-ridden responses are not limited to the speed with which people consider the truth of a claim. Participants’ actual decisions about whether statements are true are also influenced by whether previously read text supports an accurate or inaccurate claim. Participants are more likely to judge the inaccurate statement “Brushing your teeth can lead to gum disease” as true after reading an assertion highlighting potential problems with toothbrushing, as compared to after reading an assertion extolling its health benefits (Rapp, Hinze, Kohlhepp, & Ryskin, 2014). A decision that normally would be quite certain was now colored by doubt after reading misinformation.
Doubt can also be illustrated in people’s confidence in their responses after exposure to inaccurate information. In many situations, people exhibit effective metacognitive monitoring and control, being able to determine what information to integrate into existing knowledge, and what knowledge to retrieve that might be useful for completing tasks. Some evidence indicates that despite any predilection to reproduce falsehoods, people report higher confidence when providing accurate as compared with inaccurate information (e.g., Bulevich & Thomas, 2012; Weinstein, McDermott, & Chan, 2010). These decisions, though, are influenced by the oft-observed pattern that people tend to be generally overconfident about what they know (Dunning, Johnson, Ehrlinger, & Kruger, 2003). Any tendency to be overconfident would suggest that people should prefer their own knowledge to the information presented by other sources, particularly inaccurate ones. The problem is that reading inaccurate information can reduce people’s confidence in the accuracy of their knowledge, including when it is correct and despite any general propensity toward overconfidence (Appel & Richter, 2007). And even when people report decreased confidence in their use of previously presented inaccurate information, this does not seem to actually reduce their use of that information on subsequent tasks (Donovan, Hinze, & Rapp, 2017). This indicates an important separation between confidence and actual use of inaccurate information.
Confidence can even predict people’s use of false information, but not in a direction one might hope to see. People show robust overconfidence in their ability to detect and ignore inaccurate information, but the individuals most likely to reproduce inaccurate ideas are also the most overconfident (Salovich & Rapp, 2018). This exemplifies the disconnect between people’s metacognitive evaluations of their susceptibility to inaccuracies and their actual use of the content. Individuals who hold low confidence in their prior knowledge and abilities may rely on falsehoods because they are uncertain as to what is true. In contrast, individuals with high confidence may use inaccurate information because they believe they are not susceptible to the adverse effects of reading false content, and thus do not engage in evaluation. Both cases highlight a crucial role for doubt, either applied too broadly or failing to be applied during comprehension.
Reliance

Perhaps the most problematic result of exposure to inaccurate information involves relying on it to complete subsequent goals. Unfortunately, readers often reproduce inaccuracies they have previously read on later activities. Even falsehoods integrated into stories, positioned as the contents of discussions between characters or as offered by a narrator, have this effect. Consider this excerpt from a story about two friends setting out on an ocean trip (Marsh, 2004):

Just when I decided this great big white and shiny one called The Palace must be his boat, he stopped at the dinkiest, dirtiest, little tub. “Here she is! I wanted to name her the Pompeii, after the mythical island that sunk into the sea, but instead she’s the Mayflower, unfortunately named after the pilgrim’s boat.”
This description includes both accurate information (the name of the vessel the pilgrims used to sail to the new world) and inaccurate information (the name of a mythical sunken kingdom). Experiments using these materials present information in an accurate (e.g., Atlantis; Mayflower), inaccurate (e.g., Pompeii; Godspeed), or ambiguous form (e.g., “I wanted to name her after the mythical island”; “but instead she’s unfortunately named after the pilgrim’s boat.”). After reading stories containing mixtures of these contents, participants received a trivia quiz with some questions related to statements from the stories. For example, participants might be asked questions such as “What is the name of the ship that carried the pilgrims to America in 1620?” and “What is the name of the island-city believed since antiquity to have sunk in the ocean?”
The trivia questions offer a method for measuring whether participants’ answers reference the earlier presented inaccuracies (and not just a random, unmentioned inaccuracy). Participants are more likely to provide inaccurate responses after reading inaccurate content than after reading accurate or neutral content (e.g., Hinze, Slaten, Horton, Jenkins, & Rapp, 2014; Marsh & Fazio, 2006; Marsh et al., 2003). This indicates that people are providing incorrect responses specifically because they acquired inaccurate information from the stories. Participants are more likely to reproduce those inaccuracies not only when the critical ideas are difficult and unfamiliar but even when the content is easy and familiar. Exposure to incorrect information can therefore have clear, problematic consequences.
In these studies, the story materials were narrative fiction rather than resources that people might traditionally expect to study with, such as textbooks or research articles. Moreover, the inaccurate information was not designed to be integral to the overarching story plots, but rather was included to flesh out interactions between characters and background events. Readers relied on these materials even though they were not crucial to understanding the story themes or plot. This exemplifies the robust and problematic allure of inaccurate content during even mundane reading experiences.
This pattern of reliance is not limited just to reading. People tend to reproduce incorrect information provided by their conversational partners, suggesting any lack of evaluation may be a general outcome of memory and comprehension processes (Marsh, Cantor, & Brashier, 2016; Rapp & Donovan, 2017; Rapp, Jacovina, & Andrews, 2014). In one series of experiments, participants memorized sets of words associated with particular topics (Roediger, Meade, & Bergman, 2001). This included, for example, a list of body parts including foot, ear, toe, and so on. Next, participants were teamed with a partner who had studied the same list, and the pair was asked to recall the items, taking turns to respond with one item from the list at a time. Unbeknownst to participants, the partner was a confederate who intentionally mentioned items from the list category that had not appeared in the list (e.g., leg). After working together, each participant was also asked to individually recall everything that had originally been presented in the lists.
Participants in this task exhibited clear use of the incorrect mentions provided by their partner, a phenomenon termed social contagion. When subsequently tasked with recalling on their own, participants often relied on their partner’s inaccurate recalls and reported them despite never having seen them. Participants are less likely to show social contagion if they believe their partner is not particularly credible (Andrews & Rapp, 2014). But when participants possess little insight into the credibility of their partner, they use their partner’s inaccurate contributions as much as if they had evidence of that partner’s credibility. People’s reliance on their conversational partner’s productions is of clear concern when that partner gets important things wrong.
These experiments required participants to memorize lists of words, which does not necessarily align with everyday activities. Perhaps if the inaccurate ideas in the studies described here had been critical for making real-world decisions, participants might have been more careful. But even when inaccurate statements are presented as single sentences, one after the other, for readers to explicitly ponder (akin to a Twitter feed), the same effects emerge (Fazio, Dolan, & Marsh, 2015; Pennycook, Cannon, & Rand, in press). Both subtle and direct mentions of inaccurate ideas can pollute people’s memory, understandings, and decisions.
Combating Inaccurate Influences

The three concerns described here—confusion, doubt, and reliance—likely interact as people are exposed to inaccurate information. Describing the problems that emerge from such exposure helps inform the design, development, and implementation of effective activities and technologies intended to support more successful comprehension. An overarching goal for these supports involves explicitly promoting the consideration of relevant and accurate prior knowledge.

One modestly beneficial task involves motivating individuals to engage in strategic evaluation. Recall that content representing inaccurate positions adversely influences participants’ judgments of obviously incorrect assertions. Requiring people to consider the veracity of the same materials during reading reduces those adverse consequences (Rapp, Hinze, et al., 2014). When participants were tasked with making direct changes and annotations to texts during reading, they exhibited much less difficulty judging the truth of related assertions. This was specific to the items they corrected; inaccurate information they missed or neglected to revise continued to exert an influence on their post-reading judgments. The observed benefits have at least two potential explanations. First, requiring readers to retrieve their accurate prior knowledge during the task likely prevented them from encoding a new memory for the inaccurate information, rendering it unavailable for later consideration. Second, by correcting the texts, participants may have established evaluative mindsets they applied during their reading and judging. These two possibilities are not mutually exclusive, and both may be responsible for the observed benefits.
These findings advocate for providing people with practice consulting their knowledge to scrutinize information. They also suggest that particular mindsets can support skeptical approaches to content. What the findings do not tell us, though, is how to encourage people to do so routinely. This can be a distinct challenge, given that readers are unlikely to have the resources or motivation to enact critical evaluation during every moment of their day. Such behavior may not even be desirable, given it would be both cognitively costly and a waste of time to engage in unnecessary evaluation. Media literacy and curriculum design initiatives wrestle with this in determining how to train readers to deal with false information, fake news, and ambiguous content. Technological interventions can potentially prove supportive, offering online tools and expedient access to the information necessary to interrogate ideas, perhaps even suggesting and highlighting information that should receive scrutiny.
Another beneficial way that individuals might protect themselves from inaccurate content involves identifying when and how particular information might be useful. The contents of fictional stories, for example, might not always be informative for everyday concerns. Stories with superheroes, wizards, monsters, and aliens play fast and loose with facts in ways that would be inappropriate for making decisions in the real world, including information about geographic locations, modes of travel, survival behaviors, and so on. Explanations of these ideas and others (e.g., science, math, and history concepts) are associated with expository presentations and materials. Fictional content, however, can prove informative with respect to thinking about human values and interactions, such as how we might expect to treat or be treated by others (Mar, in press).
The degree to which people encode what they are learning into general knowledge, or keep that information separate and specific to a particular set of circumstances (e.g., talking animals are commonplace in fairy tales but not in the real world), influences the likelihood they will use that information on subsequent tasks. Examinations of these processes—termed integration and compartmentalization, respectively (Gerrig & Prentice, 1991)—suggest that people more often than not fail to separate what they know into bins useful for different purposes. Addressing difficulties with compartmentalization involves helping people determine what makes some information credible and trustworthy, and how and why to be wary of other kinds of information. Activities that encourage source monitoring, in which people carefully contemplate who developed and disseminated information, when the information was obtained, and the evidence marshaled by those sources to make claims, can support considerations of what might be trusted or discounted (M. K. Johnson, Hashtroudi, & Lindsay, 1993). Curricular implementations, many involving games, potentially support sourcing practices (e.g., Graesser et al., 2010). Tutorials on the practices that scientists, journalists, and other information disseminators use to establish accurate accounts could further inform people’s expectations for what constitutes a valid claim. Using cues about the source of claims to compartmentalize information should also help mitigate confusion, doubt, and reliance.
Finally, several recent models focus on the cognitive pro-
cesses underlying people’s noticing of inaccuracies (Braasch
& Bråten, 2017; Kendeou & O’Brien, 2014). This work
highlights the need for people to recognize discrepancies
between information they read or hear and what they already
know, as well as between different sources. Detecting dis-
crepancies increases the likelihood people will establish
accurate understandings (Otero & Kintsch, 1992). Noticing
can be explicitly encouraged when texts offer clear refuta-
tions that detail an inaccurate view, provide the accurate
explanation, and supply a justification as to why the accurate
notion is a better explanatory account (e.g., Donovan, Zhan,
& Rapp, 2018; Kendeou & van den Broek, 2007). This, in
turn, should reduce reader confusion and increase confidence
in the accurate explanation. Instructional approaches and
online tools can explicitly mark when incongruities appear in
discourse materials. Similarly, beginning readers need prac-
tice identifying discrepancies, including how to resolve them
or set them aside until further information is available.
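The three-part refutation structure described above (state the misconception, tag it as inaccurate and provide the accurate explanation, then justify why that account is better) can be sketched as a simple text template. This is a minimal illustration with hypothetical wording; the refutation texts studied in the cited work were richer passages.

```python
# Hypothetical template for a three-part refutation text: name the
# misconception, tag it as inaccurate and give the accurate explanation,
# then justify why the accurate account is better. Wording is invented.
def refutation_text(misconception: str, accurate: str, justification: str) -> str:
    return (
        f"Many people believe that {misconception}. "
        f"However, this belief is inaccurate. In fact, {accurate}. "
        f"This account is better supported because {justification}."
    )


# Example on a topic from Donovan, Zhan, & Rapp (2018), simplified here.
passage = refutation_text(
    misconception="Rosa Parks was sitting in the whites-only section of the bus",
    accurate=(
        "she was seated in the section designated for Black passengers "
        "and refused an order to give up her seat"
    ),
    justification="it matches the historical record of the 1955 arrest",
)
```

The template enforces the ordering that makes refutations effective: the discrepancy is made explicit before the correct explanation is offered, and the justification tells readers why to prefer it.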
Conclusion
Routine processes of cognition support the encoding and
retrieval of information in the service of learning. This is a
good thing when information is correct, but problematic
when information is inaccurate. The availability of informa-
tion from online and interpersonal sources, colored by con-
temporary sociocultural and sociopolitical dissemination
practices, has led to concerns about the prevalence of inac-
curate information. Exposure to inaccuracies can have clear
consequences, with people becoming confused about what is
accurate, doubtful as to whether their accurate knowledge is
correct, and reliant on inaccurate ideas. Instructional activi-
ties, technological tools, and literacy practices must support
more routinized interrogation of incoming information to
contend with inaccurate content exemplified by, but not lim-
ited to, fake news.
The implications of this work are relevant to stakeholders including students, instructors, journalists, and everyday citizens: essentially anyone who consumes or disseminates information. The findings call for revising traditional instruction to address the influence of inaccurate information across disciplines, not just in media literacy courses. This requires
vigilant awareness and discussion of the effects of exposure
to inaccuracies, as well as an understanding of how modern
information sources and contexts operate and are interpreted
by audiences. The well-replicated empirical findings
described previously also highlight a need to train the next
generation of reporters on how falsehoods and retractions
influence readers, and how to effectively refute inaccurate
claims. Credentialed experts need to take responsibility for
serving as public evaluators of information, which should
include making transparent the methods and techniques they
use to evaluate evidence and derive conclusions. And people need to take seriously their roles as consumers of information, evaluating claims, monitoring sources, and seeking corroborating evidence for the information they encounter.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect
to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, author-
ship, and/or publication of this article.
ORCID iD
David N. Rapp https://orcid.org/0000-0003-4515-5295
References
Andrews, J. J., & Rapp, D. N. (2014). Partner characteristics and
social contagion: Does group composition matter? Applied
Cognitive Psychology, 28, 505-517.
Appel, M., & Richter, T. (2007). Persuasive effects of fictional nar-
ratives increase over time. Media Psychology, 10, 113-134.
Braasch, J. L. G., & Bråten, I. (2017). The Discrepancy-Induced
Source Comprehension (D-ISC) model: Basic assumptions and
preliminary evidence. Educational Psychologist, 52, 167-181.
Bulevich, J. B., & Thomas, A. K. (2012). Retrieval effort improves
memory and metamemory in the face of misinformation.
Journal of Memory and Language, 67, 45-58.
Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A.,
Caldarelli, G., . . . Quattrociocchi, W. (2016). The spreading of
misinformation online. Proceedings of the National Academy
of Sciences of the United States of America, 113, 554-559.
Donovan, A. M., Hinze, S. R., & Rapp, D. N. (2017, April). People
utilize inaccurate information even when allowed to withhold
responses. Poster presented at the 89th Annual Meeting of the
Midwestern Psychological Association, Chicago, IL.
Donovan, A. M., Zhan, J., & Rapp, D. N. (2018). Supporting his-
torical understandings with refutation texts. Contemporary
Educational Psychology, 54, 1-11.
Dunning, D., Johnson, K., Ehrlinger, J., & Kruger, J. (2003). Why
people fail to recognize their own incompetence. Current
Directions in Psychological Science, 12, 83-87.
Fazio, L. K., Barber, S. J., Rajaram, S., Ornstein, P. A., & Marsh,
E. J. (2013). Creating illusions of knowledge: Learning errors
that contradict prior knowledge. Journal of Experimental
Psychology: General, 142, 1-5.
Fazio, L. K., Dolan, P. O., & Marsh, E. J. (2015). Learning misin-
formation from fictional sources: Understanding the contribu-
tions of transportation and item-specific processing. Memory,
23, 167-177.
Gerrig, R. J., & Prentice, D. A. (1991). The representation of fic-
tional information. Psychological Science, 2, 336-340.
Graesser, A. C., Britt, M. A., Millis, K. K., Wallace, P., Halpern,
D. F., Cai, Z., . . . Forsyth, C. (2010). Critiquing media reports
with flawed scientific findings: Operation ARIES! A game with
animated agents and natural language trialogues. In V. Aleven,
J. Kay, & J. Mostow (Eds.), Intelligent Tutoring Systems:
ITS 2010: Lecture notes in computer science (Vol. 6095,
pp. 327-329). Berlin, Germany: Springer.
Greene, J. A., & Yu, S. B. (2016). Educating critical thinkers: The
role of epistemic cognition. Policy Insights from the Behavioral
and Brain Sciences, 3, 45-53.
Hinze, S. R., Slaten, D. G., Horton, W. S., Jenkins, R. J., & Rapp, D. N. (2014). Pilgrims sailing the Titanic: Plausibility effects on memory for misinformation. Memory & Cognition, 42, 305-324.
Jacovina, M. E., Hinze, S. R., & Rapp, D. N. (2014). Fool me twice:
The consequences of reading (and rereading) inaccurate infor-
mation. Applied Cognitive Psychology, 28, 558-568.
Johnson, H. M., & Seifert, C. M. (1994). Sources of the continued
influence effect: When misinformation in memory affects later
inferences. Journal of Experimental Psychology: Learning,
Memory, and Cognition, 20, 1420-1436.
Johnson, H. M., & Seifert, C. M. (1998). Updating accounts follow-
ing a correction of misinformation. Journal of Experimental
Psychology: Learning, Memory, and Cognition, 24, 1483-1494.
Johnson, M. K., Hashtroudi, S., & Lindsay, D. S. (1993). Source monitoring. Psychological Bulletin, 114, 3-28.
Kendeou, P., & O’Brien, E. J. (2014). The Knowledge Revision
Components (KReC) framework: Processes and mechanisms.
In D. N. Rapp & J. L. G. Braasch (Eds.), Processing inaccu-
rate information: Theoretical and applied perspectives from
cognitive science and the educational sciences (pp. 353-377).
Cambridge, MA: MIT Press.
Kendeou, P., & van den Broek, P. (2007). The effects of prior knowl-
edge and text structure on comprehension processes during read-
ing of scientific texts. Memory & Cognition, 35, 1567-1577.
Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J.,
Greenhill, K. M., Menczer, F., . . . Zittrain, J. L. (2018). The
science of fake news. Science, 359, 1094-1096.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., &
Cook, J. (2012). Misinformation and its correction: Continued
influence and successful debiasing. Psychological Science in
the Public Interest, 13, 106-131.
Mar, R. A. (in press). Evaluating whether stories can promote
social cognition: Introducing the Social Processes and Content
Entrained by Narrative (SPaCEN) framework. Discourse
Processes.
Marsh, E. J. (2004). Stimuli for creating false beliefs about the
world. Behavior Research Methods, Instruments, & Computers,
36, 650-655.
Marsh, E. J., Cantor, A. D., & Brashier, N. M. (2016). Believing
that humans swallow spiders in their sleep: False beliefs as side
effects of the processes that support accurate knowledge. In B.
Ross (Ed.), The psychology of learning and motivation (Vol.
64, pp. 93-132). Cambridge, MA: Academic Press.
Marsh, E. J., & Fazio, L. K. (2006). Learning errors from fiction:
Difficulties in reducing reliance on fictional stories. Memory &
Cognition, 34, 1140-1149.
Marsh, E. J., Meade, M. L., & Roediger, H. L. (2003). Learning facts
from fiction. Journal of Memory and Language, 49, 519-536.
Otero, J., & Kintsch, W. (1992). Failures to detect contradic-
tions in a text: What readers believe versus what they read.
Psychological Science, 3, 229-236.
Pennycook, G., Cannon, T. D., & Rand, D. G. (in press). Prior
exposure increases perceived accuracy of fake news. Journal
of Experimental Psychology: General.
Prentice, D. A., Gerrig, R. J., & Bailis, D. S. (1997). What readers
bring to the processing of fictional texts. Psychonomic Bulletin
& Review, 4, 416-420.
Rapp, D. N. (2008). How do readers handle incorrect information
during reading? Memory & Cognition, 36, 688-701.
Rapp, D. N. (2016). The consequences of reading inaccurate informa-
tion. Current Directions in Psychological Science, 25, 281-285.
Rapp, D. N., & Braasch, J. L. G. (2014). Accurate and inaccurate
knowledge acquisition. In D. N. Rapp & J. L. G. Braasch
(Eds.), Processing inaccurate information: Theoretical and
applied perspectives from cognitive science and the educa-
tional sciences (pp. 1-9). Cambridge, MA: MIT Press.
Rapp, D. N., & Donovan, A. M. (2017). Routine processes of cogni-
tion result in routine influences of inaccurate content. Journal
of Applied Research in Memory and Cognition, 6, 409-413.
Rapp, D. N., Gerrig, R. J., & Prentice, D. A. (2001). Readers’
trait-based models of characters in narrative comprehension.
Journal of Memory and Language, 45, 737-750.
Rapp, D. N., Hinze, S. R., Kohlhepp, K., & Ryskin, R. A. (2014).
Reducing reliance on inaccurate information. Memory &
Cognition, 42, 11-26.
Rapp, D. N., Jacovina, M. E., & Andrews, J. J. (2014). Mechanisms
of problematic knowledge acquisition. In D. N. Rapp & J.
L. G. Braasch (Eds.), Processing inaccurate information:
Theoretical and applied perspectives from cognitive science
and the educational sciences (pp. 181-202). Cambridge, MA:
MIT Press.
Rapp, D. N., & Mensink, M. C. (2011). Focusing effects from online
and offline reading tasks. In M. T. McCrudden, J. P. Magliano,
& G. Schraw (Eds.), Text relevance and learning from text (pp.
141-164). Charlotte, NC: Information Age Publishing.
Rich, P. R., & Zaragoza, M. S. (2016). The continued influence of
implied and explicitly stated misinformation in news reports.
Journal of Experimental Psychology: Learning, Memory, and
Cognition, 42, 62-74.
Roediger, H. L., Meade, M. L., & Bergman, E. T. (2001). Social con-
tagion of memory. Psychonomic Bulletin & Review, 8, 365-371.
Salovich, N. A., & Rapp, D. N. (2018, July). Readers’ perceived
resistance to misinformation is inversely related to their use of
inaccurate content. Paper to be presented at the 28th Annual
Meeting of the Society for Text & Discourse, Brighton, UK.
Seifert, C. M. (2002). The continued influence of misinformation
in memory: What makes a correction effective? Psychology of
Learning and Motivation, 41, 265-292.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and
false news online. Science, 359, 1146-1151.
Weinstein, Y., McDermott, K. B., & Chan, J. C. K. (2010). True and false memories in the DRM paradigm on a forced choice test. Memory, 18, 375-384.
Wheeler, S. C., Green, M. C., & Brock, T. C. (1999). Fictional nar-
ratives change beliefs: Replications of Prentice, Gerrig, and
Bailis (1997) with mixed corroboration. Psychonomic Bulletin
& Review, 6, 136-141.
Wilkes, A. L., & Leatherbarrow, M. (1988). Editing episodic mem-
ory following the identification of error. The Quarterly Journal
of Experimental Psychology A, 40, 361-387.
Wilkes, A. L., & Reynolds, D. J. (1999). On certain limitations
accompanying readers’ interpretations of corrections in epi-
sodic text. The Quarterly Journal of Experimental Psychology
A, 52, 165-183.