Trust in Science and the Science of Trust
Friederike Hendriks, Dorothe Kienhues, and Rainer Bromme
Abstract When risky technologies are debated in the media or when cases of
scientific misconduct are made public, inevitable discussions arise about public loss
of trust in science. However, trust in science reaches far beyond such incidents:
trust is of much more fundamental importance for science. Clearly, trust is pivotal
in doing science, since researchers in their everyday practice rely on the knowledge
produced by other experts with different specialization and expertise. In the same
way, trust is fundamental for the public understanding of science. Laypeople
depend on the knowledge of scientific experts when developing a personal stance
on science-based issues and arriving at decisions about them. Laypeople only
possess a bounded understanding of science, but nowadays they are able to rapidly
access all kinds of scientific knowledge online. To deal with scientific information,
laypeople have to trust in scientists and their findings. We will at first describe the
role of trust in doing and understanding science. Then a summary of international
survey results on the general public's trust in science is presented. Starting from
these results and questions that arise from them, we extend and revise past concep-
tualizations of trust, arriving at a conceptualization of epistemic trust. Epistemic
trust rests not only on the assumption that one is dependent on the knowledge of
others who are more knowledgeable; it also entails a vigilance toward the risk of being
misinformed. Drawing on empirical findings, we argue that the critical character-
istics that determine the epistemic trustworthiness of a source of science-based
information (for example, a scientist or a scientific institution) are the source's
expertise, integrity and benevolence. These characteristics have already been
described in the model of trust provided by Mayer et al. (1995), but when it
comes to trust in the context of science, they must be redefined. Furthermore, trust
judgments are not based solely on these characteristics, but depend on further
constraints, which will be discussed in this chapter.
Keywords Epistemic trust • Trust in science • Trust • Public understanding of
science • Science communication • Division of cognitive labor
F. Hendriks (*) • D. Kienhues • R. Bromme
University of Münster, Münster, Germany
e-mail: f.hendriks@uni-muenster.de; kienhues@uni-muenster.de; bromme@uni-muenster.de
© Springer International Publishing Switzerland 2016
B. Blöbaum (ed.), Trust and Communication in a Digitized World, Progress in IS,
DOI 10.1007/978-3-319-28059-2_8
1 Introduction
Public loss of trust in science is a topic of much discussion, especially when
scientific misconduct becomes evident. For example, in 2011, Diederik Stapel was
suspended from Tilburg University after it became known that he had manipulated
data and faked the results of various experiments. Investigations brought to light
that more than 30 publications were based on fraudulent data, a misdemeanor he has
since admitted (Stapel 2012). His legal case received considerable attention, lead-
ing to a discussion about the effects that fraud in science might have on public trust
in science. Confronted with news about plagiarism or manipulation of data, it
becomes obvious to the public that the everyday practice of scientific work and
science communication is based on trust (Vetenskap and Allmänhet 2015).
But trust is not only an issue when researchers abandon the rules of scientific
conduct (for example, by deliberately faking scientific results). Such cases are
actually exceptional. In reality, trust is inevitable and essential for scientists for
doing science (trust within science) as well as for the general public dealing with
science-related topics in their everyday life. In other words, trust is critical for
"insiders" as well as "outsiders" (Feinstein 2011).
In the following chapter, we will first elaborate on why trust is essential in the
context of science and why it is therefore an important topic for empirical research
about science and its public understanding. Second, we will provide an overview
of the state of public trust in science as it has been depicted in surveys on public
attitudes toward science. Referring to these findings, we will then discuss the
specificities of "trust" in the context of public understanding of science. In doing
so, we will start from the same general understanding of trust as it applies to trust
research in other domains and, hence, to most of the other chapters in this
volume. We will then specify and extend this research toward epistemic trust as a
conceptual framework and point to the special importance of epistemic trust in a
digitized knowledge society.
2 Why Trust is Essential in the Context of Science
"Trust" is most typically described as a kind of assumption about others. Whenever
people are dependent on agents (persons, organizations) and whenever they are
willing to accept the risks that come along with this dependency, they put trust in
these agents (the trustees) (see Blöbaum 2016). This general notion of trust implies
that there are degrees of freedom for the trustor (for example he/she does not have
to purchase products offered by the trustee) and that there is some risk that he/she
cannot control (for example, the trustor cannot be sure if the product works as
promised, and if the product fails, this would be detrimental according to the goals
of the trustor). This has been described as the willingness to be vulnerable to
another person (e.g., Mayer et al. 1995). However, this core idea of "trust" has to
be refined and specified when it comes to knowledge. In this case, the good that the
trustee provides to the trustor is knowledge, and the risk to the trustor is his/her
vulnerability to a lack of truth or validity of that knowledge.
The Paradox of Trust in Science  Trust in Science¹ addresses a paradox, as Science
has evolved as a means to question and readdress established facts. As such, the
very idea of modern Science is to know the truth instead of just trusting what you
are told. This is based on the Enlightenment idea that everybody should be able to
overcome the vulnerability of "not being told the truth" by empowering their own
capabilities to think and to know. The emergence of modern Science in the
sixteenth century was based on the idea that the truth of knowledge could be
established by epistemic acts (by seeing, hearing, or forming correct impressions
about nature) and by rational conclusions based on such impressions. Thus, Science
is no longer based on faith in assertions that have been put forward by
authorities. As mentioned by Sperber et al., "Historically, this individualistic stance
could be seen as a reaction against the pervasive role in Scholasticism of arguments
from authority" (2010, p. 361). Understanding and doing Science is a way of
controlling the risk of not getting the truth.
Referring back to the core notion of trust described above, this might at first
glance imply that trust conflicts with Science. The legacy of modern Science, as a
way of knowing and learning about the world that does not rely on trust, might be
the reason why many seminal accounts on the philosophy of Science do not even
mention the concept of trust at all. Only a small group of philosophers and
historians of Science have emphasized the role of trust in Science, mostly on how
it relates to doing Science and on communication among scientists (to a lesser
degree on Science communication with the public). For example, in his seminal
paper, Hardwig (1991) points to some proofs in modern mathematics that cannot be
checked personally by most mathematicians. Thus, they must trust the claims of
those who have worked out the proofs. Obviously, interdisciplinary work is deeply
reliant on cooperation and trust between experts of different knowledge areas
(Origgi 2010; Whyte and Crease 2010; Wilholt 2013). Even within the same
research team, trust in the knowledge of others is essential for everyday scientific
practice. “Whenever the relevant evidence becomes too extensive or too complex
for any one person to gather it all” (Hardwig 1991, p. 698), it is more advantageous
to rely on the testimony of others than on one's own empirical evidence (Chinn
et al. 2011).
¹ "Science" is a notion for a cognitive as well as a concrete social endeavor. The notion of Science
is used to refer to distinguished bodies of knowledge (Science in a cognitive sense) as well as to
abstractly refer to the institutions and people who are producing and maintaining these bodies of
knowledge (Science in a social sense, see also Longino 2002). In order to emphasize that both
meanings are covered, when not specified otherwise, we use a capital S.

The Division of Cognitive Labor  The perpetual construction of new knowledge and
the discovery of new scientific phenomena leads not only to individuals gaining
ever more specialized expertise, but it also transforms scientific disciplines as new
sub-disciplines evolve. The need for trust within as well as in Science is an
immediate result of the specialization of knowledge (areas) (Barber 1987; Rolin
2002). In this sense, the constitutive role of trust for doing, using and understanding
Science follows immediately from the division of cognitive labor (Bromme
et al. 2010; Bromme and Thomm 2015; Keil 2010; Keil et al. 2008). Theoretical
approaches to trust in Science point out that trust is a constituent of our contem-
porary society in which specialization and complexity are ubiquitous (Barber 1987;
Rolin 2002). However, the body of scientific knowledge is not the only thing that is
continuously growing: so is the public's need for information about scientific
topics. For example, in the realm of citizenship the public is continuously chal-
lenged to form opinions about Science-related issues (e.g., "do I oppose nuclear
energy?") or even act upon those opinions (e.g., "should I then invest in solar power
for my personal home?").
The Complexity of Scientific Knowledge As mentioned above, trust is not only
essential for scientists (trust within Science), but also and even more so for the
general public, the "outsiders" (trust in Science). The emergence of Science in
modern history has also been a history of separating a scientific understanding
from an everyday understanding of the world (Wolpert 1992). Scientific knowledge
has become more and more abstract, tied to cognitive artifacts (like mathematical
models) and technological tools (like microscopes, MRI devices) for the produc-
tion of data (Daston and Galison 2007). The division of cognitive labor between
scientists and the general public raises boundaries for the Public Understanding of
Science (Bromme and Goldman 2014). Most scientific phenomena (e.g., oxygen,
electrons, genes) defy firsthand experience, as they cannot be observed without
means that are accessible only to scientists. Furthermore, fully comprehending
information about such phenomena requires specialized knowledge possessed
by only a few experts (e.g., understanding why oxygen is essential in some
concentrations but toxic in others). Also, many topics of everyday interest to the
public ("socio-scientific" topics) cannot be fully understood without deep scientific
knowledge. Consider discussions about nuclear energy, climate change or stem
cells, to name just a few. Most people possess only a bounded understanding of the
underlying Science of such topics (while political or social consequences may be
more easily accessible to them). Thus, we can conclude from the division of
cognitive labor, going hand in hand with the complexity of our knowledge society,
that a full Public Understanding of Science is unfeasible. Instead, a public trust in
Science is essential.
Easy Access to Science-Related Information in Digitized Societies The need for
trust in Science has increased because of digitalization (especially the Internet).
Nowadays, people in many countries (except those in which economic, educa-
tional or political conditions prevent it) have easy access to scientific information
via the Internet. Young people especially search online to find out about scientific
issues and research information (Anderson et al. 2010). In the U.S. in 2012 (Besley
2014), around 42 % of survey respondents mentioned that the Internet is their primary
source of information about Science and technology, replacing TV, which was
named by only 32 % of respondents. Of all the respondents who named the Internet as
their number one source of science-based information, 63 % said that they actually
read online newspapers, while less than 10 % mentioned blogs. When searching for
information about specific scientific issues, 63 % of Americans make use of online
sources (Besley 2014). The Wellcome Trust Monitor found that in the United
Kingdom, 63 % of adults and 67 % of young people choose to search the Internet
when they were actively looking for scientific information (Wellcome Trust 2013). In
all of Europe, similar results are found (European Commission 2013). This concurs
with a disintermediation through digital technologies (Eysenbach 2008): compared
with traditional news media, there are far fewer gatekeepers online, such as
journalists or editors. This is due to the very low costs of publishing and the lack of
quality control compared to traditional publications, which employ peer review
(Eysenbach 2008). Hence, recipients must take it upon themselves to gather and evaluate infor-
mation about a (scientific) topic. One can imagine that finding reliable information is
difficult when considering thousands of potentially relevant websites that are
published by an equally great number of sources of varying trustworthiness. Conse-
quently, for laypeople in a digitized world who are confronted with an overwhelming
amount of information, it is essential that they are able to judge whom to believe.
That is, judgments about who is a trustworthy source of information and who may
provide relevant information about an issue are crucial (Bromme et al. 2010; Bromme
et al. 2015; Hendriks et al. 2015a).
3 How Much Does the Public Trust Science?
As argued above, a full understanding even of those segments of scientific
knowledge that are relevant for our lives is unfeasible. In consequence, it is crucial
that the public trusts Science. In fact, various representative research attempts that
claim to study the general Public’s Understanding of Science focus instead on
attitudes, behaviors and activities of the general public—these issues are more
closely related to people's trust in, rather than their understanding of, Science.
Therefore, in the following section we will review such recent representative
survey studies, as they give various hints about how much the public trusts Science.
We took the liberty of subsuming all items (printed in italics) that point to public trust
in Science, and subdivided these into items that focus on the public's general
appreciation of Science, general trust in Science, and trust in Science in the context
of specific topics. We chose surveys from the U.S. and from several European
countries (see appendix for detailed survey information). Data on the U.S. public's
views on Science are presented in the National Science Board's report on attitudes
about Science and technology in its Science and Engineering Indicators (Besley
2014) and in the Pew Research Center's survey on Public and Scientists' Views on
Science and Society (Pew Research Center 2015). Data from representative samples
of all 27 European Union member states are provided by a Special Eurobarometer:
Responsible Research and Innovation (RRI), Science and Technology (European
Commission 2013). Furthermore, we include three European national surveys, the
Ipsos Mori Public Attitudes to Science survey from the United Kingdom (Castell
et al. 2014), the German Wissenschaftsbarometer (Wissenschaft im Dialog [WiD]
2014) and the Swedish VA Barometer (Vetenskap and Allmanhet 2015).
The General Appreciation of Science In all surveys, when asked about the out-
comes of Science, the public holds a rather positive and optimistic view about
Science in general. Respondents of all surveys mostly agree with the statement
science makes life easier: namely, 79 % of American (Pew Research Center 2015),
81 % of British (Castell et al. 2014), 66 % of European (asked does science make
life easier, more comfortable and healthier) (European Commission 2013) and 74 %
of Swedish (asked if scientific developments in the last 10–20 years have made life
easier for ordinary people) respondents agree (Vetenskap and Allmänhet 2015).
Moreover, most Americans (90 %) have a great deal or some confidence in the
leaders of the scientific community; only the military is trusted more (Besley 2014).
In addition, 70 % of Americans (Besley 2014) and 55 % of British (Castell
et al. 2014) respondents agree that the benefits of science outweigh the harmful
effects. The German Wissenschaftsbarometer (Wissenschaft im Dialog [WiD]
2014) asked if science is more harmful than beneficial, and 68 % of respondents
reject the statement. Furthermore, 77 % of Europeans agree that science has a
positive influence on society, and among them 17 % regard the influence as very
positive. In contrast, only 10 % of respondents to this question think that science has
a negative impact on society (European Commission 2013). In the United King-
dom, 90 % of respondents agree (among them, 46 % strongly agree) that scientists
make a valuable contribution to society (Castell et al. 2014).
Also, regarding scientists and their actions within the realm of Science, it seems
that the public mainly perceives scientists to have good intentions. In the U.S.,
86 % of respondents think that scientists work for the good of humanity and the
same proportion of respondents think that scientists work on things that will make life
better for the average person (Besley 2014). In Europe, 82 % of respondents think
that scientists working at a university behave responsibly toward society by paying
attention to the impact of their science or technology related activities. Only 66 %
of respondents agreed with this statement when it concerned scientists working in
private company laboratories (European Commission 2013). In the United King-
dom, scientists' intentions are judged with a bit more reservation. Of respondents,
83 % agree (among them, 27 % strongly agree) that scientists want to make life
better for the average person. However, 27 % of respondents agree with the
statement that science benefits the rich more than the poor. Still, 48 % of respon-
dents disagree with this statement (Castell et al. 2014).
General Trust in Science In current surveys, only a few items focus directly on the
trust people put in scientific knowledge claims. For example, 52 % of British
respondents agree that the information they hear about science is generally true,
elaborating that they had no reason to doubt it (40 %) or that they believed other
scientists had checked it (15 %). Regarding scientists' assumed competence, in the
Eurobarometer survey, 66 % of respondents agree that university scientists are
qualified to give explanations about the impact of scientific and technological
developments on society, outdoing all other groups. In contrast, scientists working
in private company laboratories are regarded as qualified in this respect by only
35 % of respondents (European Commission 2013). Also, there are some data
suggesting that respondents not only regard scientists as able to inform and
advise society, but also believe that they have the integrity to make truthful claims.
Regarding the attitudes of the public about the honesty, ethicality and integrity of
scientists, only data from Europe are available. In the Eurobarometer, 54 % of
respondents are concerned that the application of science and technology can
threaten human rights and 61 % think that researchers should not be allowed to
violate fundamental rights and moral principles to make new discoveries; conversely,
29 % believe that researchers should be allowed to do this in some special cases. In
addition, 84 % of respondents believe that all researchers should receive manda-
tory training on scientific research ethics (like privacy, animal welfare, etc.) and
81 % agree that scientists should be obliged to declare possible conflicts of interest,
such as sources of funding, when advising public authorities (European Com-
mission 2013). In the United Kingdom, respondents are mostly convinced that they
can trust university scientists and researchers to follow the rules
and regulations of their profession (90 % agreement); again, respondents agreed
with this the most for university scientists and researchers compared to all other
groups. For example, only 60 % of respondents agree that scientists working for
private companies follow the rules and regulations of their professions. Further-
more, scientists are regarded as honest by 71 % of British respondents. However,
when asked if scientists adjust their findings to get the answers they want, 35 % of
British respondents agree, while 34 % disagree (Castell et al. 2014).
By and large, the public seems to be very positive about the benefits that Science has
to offer society, and, related to these expectations, the public mostly trusts scientists
to produce reliable, unbiased knowledge of good quality, adhering to scientific
principles. But this is not blind trust, as answers also reflect a kind of suspicion
about vested interests when research is funded by private companies [these findings
are in line with earlier work by Critchley (2008)]. Furthermore, notable propor-
tions of the public take into account that scientists might not adhere to the standards
of objectivity.
Trust in Science in the Context of Specific Topics In contrast to its generally fairly
trustful view of Science, when specific topics are considered, the public varies
widely in its amount of trust in Science: The National Science Board's Science and
Engineering Indicators show that 25 % of respondents consider genetically mod-
ified foods to be very or extremely dangerous, and 57 % are in favor or strongly in
favor of nuclear energy. In the U.S., nanotechnology is not very controversial, as
only 11 % of respondents think the harms outweigh the benefits, while 43 %
hold no opinion (Besley 2014). The Pew Research Center's survey also shows that
U.S. adults are skeptical towards some scientific topics. For example, only 37 % of
respondents agree it is safe to eat genetically modified foods, and only 28 % think it
is safe to eat food grown with pesticides. Also in this survey, 50 % of U.S. adults
agree that climate change is due to human activity, 68 % favor the use of
bioengineered fuel and 45 % favor nuclear power plants. In spite of this, 79 % of
U.S. adults agree that science has a positive impact on the quality of health care,
62 % believe in science's positive effect on food (Pew Research Center 2015).
While the Eurobarometer holds no data on specific scientific topics, other data from
Europe can provide some insights. In the United Kingdom's Ipsos Mori survey,
again it is reported that for specific scientific topics, the public's trust is inconsistent.
Asked whether the benefits outweigh the risks, 84 % of respondents regarded the benefits
as dominant for vaccination, and 66 % said the same for renewable energy. Also,
57 % of the British adults questioned agree that the benefits outweigh the harms
regarding stem cell research, while only 38 % agree with this for
nanotechnology. For only a few topics, more than 10 % agree that the harms
actually outweigh the benefits. For example, regarding nuclear power, 28 %
agree that the harms outweigh the benefits, while 48 % agree that the benefits
outweigh the harms. For genetically modified crops, 28 % see the harms as
dominant, while 36 % believe the benefits outweigh the harms (Castell et al. 2014).
In the German Wissenschaftsbarometer, respondents were asked how much they
trust scientists' statements regarding specific scientific topics. For statements
regarding renewable energies, 44 % of participants have trustful attitudes, and
for statements regarding the genesis of the universe, 40 % have trustful attitudes.
Statements regarding climate change are trusted by 37 % of Germans, but for
genetically modified crops, only 16 % of participants regard such scientific state-
ments as trustworthy (Wissenschaft im Dialog [WiD] 2014).
Mingling Trust in Science and Personal Stances About Specific Topics The above-
mentioned survey questions exemplify that the public seems to trust Science less
when considering specific topics (nuclear energy, genetically modified food) than
when asked about Science in general. This might be due to the following reasons:
Firstly, many of the survey questions confound the personal stance about a
certain Science-based issue or development with the issue of trust in Science. For
example, an item like: “Do you think it is generally safe or unsafe to eat genetically
modified foods?” (Pew Research Center 2015, p. 92) is primarily an item on
personal beliefs and positions about this kind of food. It does not distinctly measure
the trust in the underlying Science, albeit this could also influence a participant's
response. To tease apart both aspects, it is necessary to reflect on the difference
between a personal position about a topic and personal trust in the Science that
produces knowledge about that topic. For example, the recent German
Wissenschaftsbarometer asked: “How much do you trust statements of scientists
regarding the topic renewable energies [authors' translation]” (Wissenschaft im
Dialog [WiD] 2014, p. 14). The only way to view participants' responses as a
statement about their trust or distrust in the scientific knowledge on this topic would
be if Science had provided a clear answer on this topic.
Secondly, surveys typically focus only on science- or technology-related topics
that are of public interest, and, as a result, these topics are controversially discussed
in the mass media. Participants' responses might then reflect their degree of
awareness about the very fact that a topic is controversial, and this might be
confounded with their personal stance on the topic and the underlying Science.
We have doubts that this kind of confounding can be prevented by using the
following type of statement: “From what you know or have heard about renewable
energy, which of these statements, if any, most closely reflects your own opinion?—
saying benefits outweigh the risks/saying risks outweigh the benefits” (Castell
et al. 2014, p. 35).
Both of the above reasons for why the public has a high general level of
"default" trust in Science yet displays much more varied trust when considering
specific topics (including clear distrust by some subsamples) not only point to
methodological challenges of survey research, they also imply that trust in Science
is inherently "confounded" with people's perspectives on the topic of interest: When
a science-related topic is of interest for segments of the public, then these
sub-populations develop personal stances related to this topic. These stances
thereby modify their "default" trust in Science. In other words, trust in Science
develops and changes in light of the public's views about specific scientific topics.
4 From Trust to Epistemic Trust
Science (and science-based technology) is essential for life in modern societies, and
trust is an essential component of how the public copes with Science; consequently,
many recent representative surveys have tackled this topic. However, we have also
described that there is some tension between trust and the core idea that Science is a
means for freeing people from only relying on authorities to understand the world.
At the beginning of this chapter, we already emphasized that, when it comes to
Science, the good that is provided by the trustee to the trustor is knowledge, and
the risk to the trustor is that he/she is vulnerable to a lack of truth or validity of that
knowledge. In the following section, we will aim for a theoretical elaboration of
what we will call epistemic trust, starting with the influential Integrative Model of
Organizational Trust (Mayer et al. 1995), which is also discussed in most of the
other chapters of this volume. From this, we will develop a definition of epistemic
trust, which draws on the work of Origgi (2004, 2014) and Sperber et al. (2010).
Trust: A Rough Approximation Trust is defined by a dependence of a trusting actor
on the trusted person or entity (Tseng and Fogg 1999) combined with a vulnera-
bility to risk (Mayer et al. 1995). In consequence, the question of what makes a
person (the trustee) trustworthy to an interlocutor (the trustor) arises. Aristotle
defined the following three major character properties a person should possess to be
persuasive: “(1) practical intelligence [...], (2) a virtuous character, and (3) good
will” (Rapp 2010). Later, Mayer et al. (1995) summarized the literature on constit-
uents of interpersonal trust and the extensive work on the credibility of sources
(e.g., Hovland et al. 1953) and also arrived at three components that are believed to
make up the trustworthiness of a trustee: A trustee should possess (1) ability, the
domain-specific skills and competencies that enable the trustee to have influence
within the same domain, (2) benevolence, which describes her acting independently
of an egocentric profit motive and in the interest of the trustor, and
(3) integrity, i.e., she should act according to a set of rules or principles acceptable
to the trustor. According to the seminal model (Mayer et al. 1995), the three
dimensions are related but separable. Furthermore, the trustee isn't the only one
who must possess certain characteristics; in order to give trust, the trustor must hold
an attitude of a general willingness to trust others, a propensity to trust.
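To make the structure of this model concrete, the sketch below encodes the three perceived trustee characteristics and the trustor's propensity to trust as simple Python values. It is a schematic illustration only: the 0-1 scale, the averaging, and the weights are assumptions made for the example, not part of Mayer et al.'s (1995) model.

```python
from dataclasses import dataclass

@dataclass
class PerceivedTrustworthiness:
    """Trustee characteristics as perceived by the trustor (0 = very low, 1 = very high)."""
    ability: float      # domain-specific skills and competencies
    benevolence: float  # acting in the trustor's interest rather than out of egocentric profit motives
    integrity: float    # adherence to principles the trustor finds acceptable

def willingness_to_trust(perceived: PerceivedTrustworthiness, propensity: float) -> float:
    """Illustrative combination: the mean of the three (related but separable)
    dimensions, shifted by the trustor's general propensity to trust.
    The additive form and the weights are assumptions of this sketch."""
    trustworthiness = (perceived.ability + perceived.benevolence + perceived.integrity) / 3
    return max(0.0, min(1.0, 0.8 * trustworthiness + 0.2 * propensity))

# Example: a highly competent but seemingly self-interested source,
# judged by a rather trusting person.
judgment = willingness_to_trust(
    PerceivedTrustworthiness(ability=0.9, benevolence=0.3, integrity=0.6),
    propensity=0.7,
)
print(f"Willingness to trust: {judgment:.2f}")
```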
Epistemic Trust In the following section, we will use the term "epistemic trust" to
describe the trust in knowledge that has been produced or provided by scientists.
Such epistemic trust is unavoidably needed to gain knowledge (Resnik 2011;
Sperber et al. 2010). If someone doesn't have the chance to sense or learn some-
thing first-hand, she must defer to the testimony of first-hand sources (Hardwig
1991; Harris 2012; Schwab 2008).
Along these lines, researchers in developmental psychology have proposed that
children are very good at identifying whom to trust for gathering knowledge (Harris
2012; Keil et al. 2008; Mills 2013). We have the best evidence for the characteristic
of knowledgeability (or expertise) being a main constituent of young children's
trust in sources of knowledge. For example, when learning new object names,
3-year-olds prefer informants who have previously displayed accuracy in naming
objects (Koenig and Harris 2008). By the age of four, children remember previously
accurate informants and prefer to trust them over inaccurate, but familiar, infor-
mants (Corriveau and Harris 2009). Children are also sensitive to a source's
benevolence: At the age of four, children can infer the intent of an informant either
from the informant's behavior or from being told about their moral character (Mascaro
and Sperber 2009), and they place selective trust in the most benevolent sources.
Thus, young children use informant expertise as well as helpfulness to make
trustworthiness judgments (Shafto et al. 2012), and sometimes the benevolence of
an informant even supersedes her expertise (Landrum et al. 2013). Furthermore,
when deciding whom to trust, kindergarteners take into consideration an
informant's self-interest (Mills and Keil 2005). In addition, recent studies show that
children as young as four take informant honesty (referring to a source's integrity)
into consideration when deciding to trust an information source (Lane et al. 2013;
Vanderbilt et al. 2012).
In the same vein, when it comes to adults and their trust in Science, placing
epistemic trust in someone means trusting her as a provider of information (Wilholt
2013). To minimize the risk of receiving wrong information, epistemic trust relies
on evidence that an interlocutor is trustworthy (Resnik 2011) and on some vigilance
to avoid the risk of being misinformed or cheated (Sperber et al. 2010). Hence, the
notion of epistemic trust is not built on receivers who uncritically accept the
authority of experts. In her work on the relations between epistemology and trust,
Origgi (2004, 2012) argued that epistemic trust entails (1) a default trust, meaning
that people are generally trustful of others as a predisposition for communication
and cooperation, laying the groundwork for people to defer to the knowledge of
others, and (2) vigilant trust, which includes cognitive mechanisms that allow
people to make rather fine-grained ascriptions of trustworthiness before accepting
what others say. From Mayer et al. (1995) as well as from research on children's trust in
sources of knowledge, we can identify which features of a trustee (the source of
science-related information) might be processed within such cognitive mecha-
nisms, leading to the ascription of trustworthiness. But the Mayer et al. (1995)
model only roughly specifies these features. Thus, for a conceptual and an empirical
analysis of the emergence of trust in Science and its communication, we must
reconsider and specify these features. As has been argued before, laypeople might
take into account an expert's expertise, benevolence and integrity when deciding
whether to believe his/her statements on a science-related issue.
Thus, these components describe the features of experts that determine whether
recipients will depend on and defer to them when the recipients' own resources are
limited: First, a layperson should trust someone who is an expert because she is
knowledgeable (Lane et al. 2014); she possesses expertise. Expertise refers to
someone's amount of knowledge and skill, but more than just the sheer quantity
of knowledge and skill is important: the person must also have the relevant
expertise. In other words, the dimension of expertise also encompasses the aspect
of pertinence (Bromme and Thomm 2015). Second, an expert should be trusted
when a layperson believes her to have a reliable belief-forming process (Schwab
2008; Wilholt 2013) and to follow the rules of her profession (Barber 1987;
Cummings 2014). These factors make up her perceived integrity. Third, an expert
is considered trustworthy if she offers advice or positive applications for the trustor
or (more generally) for the good of society (Resnik 2011; Whyte and Crease 2010);
that is, she must act with benevolence. Furthermore, when a layperson considers
trusting an expert, the layperson's own propensity to trust also comes into play. One may
assume that people who display a high trust in Science are more prone to rely on
experts when finding out about a science-related issue (Anderson et al. 2011).
It is quite important for recipients not only to be able to identify speakers who
actually possess relevant expertise, but also to critically judge the intentions of such
speakers. In other words, recipients must be able to vigilantly identify sources
whose intentions might lead to a loss of benevolence or of integrity. For example,
due to vested interests, scientific evidence might be distorted by pseudo-evidence
produced by industry or policy stakeholders (e.g., evidence about smoking and
climate change, Lewandowsky et al. 2012). From this, we can conclude that trust in
scientists is not only based on features that are indicative of the epistemic quality of
their work (in a narrow sense with regard to the use of reliable processes of
knowledge acquisition), but also on their moral integrity (Barber 1987; Hardwig
1991) as well as the usefulness of their work for the benefit of society (Resnik
2011).
Some empirical evidence supports that these three dimensions (expertise, integrity,
and benevolence) also come into play for adults' trust in knowledge that has
been produced or provided by scientists. In three studies, we have shown with
factor-analysis (exploratory and confirmatory) that when laypeople judge the epi-
stemic trustworthiness of scientists who provide science-based information,
they indeed assess the scientists' expertise, integrity, and benevolence (Hendriks
et al. 2015a). Furthermore, in an experimental study where we varied (fictitious)
scientists' characteristics relating to those three dimensions, we showed that when
making these epistemic trustworthiness judgments, laypeople consider all three of
these dimensions in a differentiated way, again indicating that the dimensions are,
albeit interrelated, clearly distinct from each other (Hendriks et al. 2015a).
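To illustrate how such dimension scores can be derived from rating data, the sketch below averages item ratings into separate expertise, integrity and benevolence scores, in the spirit of an inventory such as the METI (Hendriks et al. 2015a). The adjectives, the 7-point scale and the grouping shown here are assumptions made for the example, not the published instrument.

```python
from statistics import mean

# Hypothetical 1-7 ratings of one source (7 = most trustworthy pole);
# the adjectives and their grouping are illustrative, not the published METI items.
ratings = {
    "competent": 6, "intelligent": 6, "well_educated": 7,     # expertise
    "honest": 5, "sincere": 4, "fair": 5,                     # integrity
    "considerate": 3, "responsible": 4, "public_minded": 3,   # benevolence
}

dimensions = {
    "expertise": ["competent", "intelligent", "well_educated"],
    "integrity": ["honest", "sincere", "fair"],
    "benevolence": ["considerate", "responsible", "public_minded"],
}

# Average the items belonging to each dimension to obtain three separable scores.
scores = {dim: mean(ratings[item] for item in items) for dim, items in dimensions.items()}

for dim, score in scores.items():
    print(f"{dim}: {score:.2f}")
```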
Also, qualitative data show that when laypeople are asked to make trust evalu-
ations about a scientific expert, they spontaneously report the scientist's expertise,
objectivity or work ethic, and potential interests that stand in conflict with the
public (Cummings 2014). Furthermore, Peters et al. (1997) showed that when risks
are communicated to the public, laypeople again consider an organization's exper-
tise, integrity and benevolence. They also found that laypeople's trustworthiness
judgments about the industry (which is believed to care only about profits, but not
about public welfare) improve the most when the industry gives off an impression
of concern and care about society; the same is true for citizen organizations that
give off an impression of competence and knowledgeability. Thus, laypeople seem
to be especially vigilant when the trustee defies the trustor's expectations (in this
special case, the negative stereotypes). This study shows that giving epistemic trust
is not only based on the characteristics of the trustee (the source of the science-
related information), but that these features are also weighed against the more
general expectations a trustor has about a specific trustee. This is only one example
of further conditions that constrain how source characteristics (expertise, integrity,
and benevolence) affect judgments of epistemic trust. Expectations about a trustee's
intentions are also highly relevant in Science communication.
There is some evidence that the way in which scientific results are communi-
cated may matter for laypeople's assessment of a scientist's communicative inten-
tions. In an experimental study, we investigated whether trustworthiness perceptions were
affected (a) if a flaw (in this case, an overestimation of a study's generalizability)
was disclosed, and (b) who disclosed it (Hendriks et al. 2015b). We found that on
the one hand, the participants downgraded a scientist's expertise if a flaw was
mentioned in a comment by an unaffiliated scientist (in contrast to no mention of
the flaw). But on the other hand, the scientist's integrity and benevolence were rated
higher when the scientist himself disclosed the flaw (in contrast to when it was
disclosed by the unaffiliated scientist). With these results, we showed that a
scientist's trustworthiness is judged in close relation to what evidence is known
that speaks to the characteristics of expertise, integrity and benevolence. By actively
putting out such evidence (e.g., disclosing possible flaws themselves), scientists can
improve the public's judgments of their trustworthiness. Related results have been
found when scientists themselves (in contrast to scientists not affiliated with the
research) seem to be responsible for the communication of caveats or uncertainties
of their results (Jensen 2008).
Because the above-mentioned characteristics apply to judgments about an indi-
vidual scientist as well as to whole scientific organizations (for example, research
institutes, universities or companies that do research; Peters et al. 1997), it may
well be assumed that the perception of expertise, integrity and benevolence influ-
ences trust in Science in general.
Interestingly, Science is a social as well as a cognitive entity. Science as a social
entity refers to the people who produce scientific insights (i.e., who do Science) and
to the organizations they work for, while Science as a cognitive entity refers to the
continuously developing body of knowledge that evolves from doing Science. As
an immediate consequence of this dual nature, it is inevitable that the assumed
trustworthiness of Science also depends on the public's appreciation of the knowl-
edge claims that are produced by Science. In other words, the assumed trustwor-
thiness of scientists depends on assumptions people already possess about what is
true knowledge and the new knowledge scientists provide. While discussing the
result from surveys that the public seems to be rather skeptical when it comes to
trust in Science about specific topics, we have already pointed to the inherent
confounding between people's personal stance on the topics which are
researched and people's trust in the Science which provides new scientific
insights about these topics.
This close entanglement between what is said (the content of Science) and who
has said it (the producers of Science) requires us to suspend the categorical
distinction between judgments about believability (of the knowledge claims pro-
vided by scientists) and trustworthiness (of the providing scientists). Of course,
when researching public trust in Science it is possible to scrutinize what laypeople
think about scientists, as well as what they think about the content of science-based
assertions. However, it is very likely that the provided answers will mostly mingle
both aspects, being determined by both. Given that modern sociology of Science
also conceives the truth of scientific knowledge as being dependent on its underly-
ing evidence as well as on the regulated discourse about this evidence (Longino
2002), the public might be on the right track by considering both what is said and who
said it when placing trust in Science.
Appendix
Table 1  Details about the surveys used for data on the public's trust in science and scientists

Region | Title of report | Sponsoring organization | Years of data collection | Respondents (n) | Representative
USA | Science and Technology: Public Understanding of Science (in: Science and Engineering Indicators 2014; Besley 2014) | National Science Board | Data from numerous surveys; unless otherwise stated, we use data from the General Social Survey's Science and Technology Module in 2012 | 1864–2256 | Yes
USA | Public and Scientists' Views on Science and Society | Pew Research Center | 2014 | 2002 | Yes
Europe | Special Eurobarometer 401: Responsible Research and Innovation (RRI), Science and Technology | European Commission | 2013 | 27,563 from the 27 member states of the European Union (EU) | Yes
UK | Public Attitudes to Science | Ipsos MORI | 2013 | 1749 adults and 315 young adults | Yes, weighted
UK | Wellcome Trust Monitor, Engaging with science | Ipsos MORI | 2012 | 1396 adults (aged 18+) and 460 young people (aged 14–18) | Yes
Germany | Wissenschaftsbarometer 2014 | Wissenschaft im Dialog (WiD) | 2014 | 1004 | Yes, weighted
Sweden | VA Barometer, VA Report 2014:4 | Vetenskap and Allmänhet | 2015 | 1000 | Yes, weighted
References
Anderson, A. A., Brossard, D., & Scheufele, D. A. (2010). The changing information environment
for nanotechnology: Online audiences and content. Journal of Nanoparticle Research, 12(4),
1083–1094. doi:10.1007/s11051-010-9860-2.
Anderson, A. A., Scheufele, D. A., Brossard, D., & Corley, E. A. (2011). The role of media and
deference to scientific authority in cultivating trust in sources of information about emerging
technologies. International Journal of Public Opinion Research, 24(2), 225–237. doi:10.1093/
ijpor/edr032.
Barber, B. (1987). Trust in science. Minerva, 25(1–2), 123–134. doi:10.1007/s11999-014-3488-y.
Besley, J. (2014). Science and technology: Public attitudes and understanding. In National Science
Board (Ed.), Science and engineering indicators 2014 (pp. 1–53). Arlington, VA: National
Science Foundation (NSB 14–01).
Blöbaum, B. (2016). Key factors in the process of trust. On the analysis of trust under digital
conditions. In B. Blöbaum (Ed.), Trust and communication in a digitized world. Models and
concepts of trust research. Berlin: Springer.
Bromme, R., & Goldman, S. R. (2014). The public's bounded understanding of science. Educa-
tional Psychologist, 49(2), 59–69. doi:10.1080/00461520.2014.921572.
Bromme, R., Kienhues, D., & Porsch, T. (2010). Who knows what and who can we believe?
Epistemological beliefs are beliefs about knowledge (mostly) to be attained from others. In
L. D. Bendixen & F. C. Feucht (Eds.), Personal epistemology in the classroom: Theory,
research, and implications for practice (pp. 163–193). Cambridge: Cambridge University
Press.
Bromme, R., & Thomm, E. (2015). Knowing who knows: Laypersons' capabilities to judge
experts' pertinence for science topics. Cognitive Science, 1–12. doi:10.1111/cogs.12252.
Bromme, R., Thomm, E., & Wolf, V. (2015). From understanding to deference: Laypersons' and
medical students' views on conflicts within medicine. International Journal of Science Edu-
cation, Part B: Communication and Public Engagement. doi:10.1080/21548455.2013.849017.
Castell, S., Charlton, A., Clemence, M., Pettigrew, N., Pope, S., Quigley, A., et al. (2014). Public
attitudes to science 2014. Ipsos Mori. London. Retrieved from https://www.ipsos-mori.com/Assets/Docs/Polls/pas-2014-main-report.pdf
Chinn, C. A., Buckland, L. A., & Samarapungavan, A. (2011). Expanding the dimensions of
epistemic cognition: Arguments from philosophy and psychology. Educational Psychologist,
46(3), 141–167. doi:10.1080/00461520.2011.587722.
Corriveau, K., & Harris, P. L. (2009). Choosing your informant: Weighing familiarity and recent
accuracy. Developmental Science, 12(3), 426–37. doi:10.1111/j.1467-7687.2008.00792.x.
Critchley, C. R. (2008). Public opinion and trust in scientists: The role of the research context, and
the perceived motivation of stem cell researchers. Public Understanding of Science, 17(3),
309–327. doi:10.1177/0963662506070162.
Cummings, L. (2014). The “trust” heuristic: Arguments from authority in public health. Health
Communication, 34(1), 1–14. doi:10.1080/10410236.2013.831685.
Daston, L., & Galison, P. (2007). Objectivity. New York: Zone Books.
European Commission. (2013). Eurobarometer. Brussels. doi:10.4232/1.11873.
Eysenbach, G. (2008). Credibility of health information and digital media: New perspectives and
implications for youth. In M. J. Metzger & A. J. Flanagin (Eds.), Digital media, youth, and
credibility (The John D. and Catherine T. MacArthur Foundation Series on Digital Media and
Learning, pp. 123–154). Cambridge, MA: The MIT Press.
Feinstein, N. (2011). Salvaging science literacy. Science Education, 95(1), 168–185. doi:10.1002/
sce.20414.
Hardwig, J. (1991). The role of trust in knowledge. The Journal of Philosophy, 88(12), 693–708.
Harris, P. L. (2012). Trusting what you're told. Cambridge, MA: Belknap Press of Harvard University Press.
Hendriks, F., Kienhues, D., & Bromme, R. (2015a). Measuring laypeople's trust in experts in a
digital age: The Muenster Epistemic Trustworthiness Inventory (METI). PLoS ONE, 10(10),
e0139309. doi:10.1371/journal.pone.0139309.
Hendriks, F., Kienhues, D., & Bromme, R. (2015b). Disclose your flaws! Admission enhances
perceptions of trustworthiness of an expert blogger. Manuscript submitted for publication.
Hovland, C. I., Janis, I. L., & Kelley, H. H. (1953). Communication and persuasion. Psychological
issues of opinion change. New Haven: Yale University Press.
Jensen, J. D. (2008). Scientific uncertainty in news coverage of cancer research: Effects of hedging
on scientists' and journalists' credibility. Human Communication Research, 34(3), 347–369.
doi:10.1111/j.1468-2958.2008.00324.x.
Keil, F. C. (2010). The feasibility of folk science. Cognitive Science, 34(5), 826–862. doi:10.1111/
j.1551-6709.2010.01108.x.
Keil, F. C., Stein, C., Webb, L., Billings, V. D., & Rozenblit, L. (2008). Discerning
the division of cognitive labor: An emerging understanding of how knowledge is clustered in
other minds. Cognitive Science, 32(2), 259–300. doi:10.1080/03640210701863339.
Koenig, M. A., & Harris, P. L. (2008). The basis of epistemic trust: Reliable testimony or reliable
sources? Episteme, 4, 264–284. doi:10.3366/E1742360008000087.
Landrum, A. R., Mills, C. M., & Johnston, A. M. (2013). When do children trust the expert?
Benevolence information influences children's trust more than expertise. Developmental
Science, 16(4), 622–638. doi:10.1111/desc.12059.
Lane, J. D., Harris, P. L., Gelman, S. A., & Wellman, H. M. (2014). More than meets the eye:
Young children's trust in claims that defy their perceptions. Developmental Psychology, 50(3),
865–871. doi:10.1037/a0034291.
Lane, J. D., Wellman, H. M., & Gelman, S. A. (2013). Informants' traits weigh heavily in young
children's trust in testimony and in their epistemic inferences. Child Development, 84(4),
1253–68. doi:10.1111/cdev.12029.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation
and its correction: Continued influence and successful debiasing. Psychological Science in the
Public Interest, 13(3), 106–131. doi:10.1177/1529100612451018.
Longino, H. E. (2002). The fate of knowledge. Princeton, NJ: Princeton University Press.
Mascaro, O., & Sperber, D. (2009). The moral, epistemic, and mindreading components of
children's vigilance towards deception. Cognition, 112, 367–380.
Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational
trust. The Academy of Management Review, 20(3), 709–734.
Mills, C. M. (2013). Knowing when to doubt: Developing a critical stance when learning from
others. Developmental Psychology, 49(3), 1–26. doi:10.1037/a0029500.
Mills, C. M., & Keil, F. C. (2005). The development of cynicism. Psychological Science, 16(5),
385–390. doi:10.1111/j.0956-7976.2005.01545.x.
Origgi, G. (2004). Is trust an epistemological notion? Episteme, 1(1), 61–72. doi:10.3366/epi.
2004.1.1.61.
Origgi, G. (2010). Epistemic vigilance and epistemic responsibility in the liquid world of scientific
publications. Social Epistemology, 24(3), 149–159.
Origgi, G. (2012). Epistemic injustice and epistemic trust. Social Epistemology, 26(2), 221–235.
Origgi, G. (2014). Epistemic trust. In P. Capet & T. Delavallade (Eds.), Information evaluation
(1st ed., pp. 35–54). London: Wiley-ISTE.
Peters, R. G., Covello, V. T., & McCallum, D. B. (1997). The determinants of trust and credibility
in environmental risk communication: An empirical study. Risk Analysis, 17(1), 43–54. doi:10.
1111/j.1539-6924.1997.tb00842.x.
Pew Research Center. (2015). Public and scientistsviews on science and society.
Rapp, C. (2010). Aristotle's rhetoric. Retrieved from http://plato.stanford.edu/entries/aristotle-
rhetoric/
Resnik, D. B. (2011). Scientific research and the public trust. Science and Engineering Ethics, 17
(3), 399–409. doi:10.1007/s11948-010-9210-x.
Rolin, K. (2002). Gender and trust in science. Hypatia, 17(4).
Schwab, A. P. (2008). Epistemic trust, epistemic responsibility, and medical practice. The Journal
of Medicine and Philosophy, 33(4), 302–20. doi:10.1093/jmp/jhn013.
Shafto, P., Eaves, B., Navarro, D. J., & Perfors, A. (2012). Epistemic trust: Modeling children's
reasoning about others' knowledge and intent. Developmental Science, 15(3), 436–47. doi:10.
1111/j.1467-7687.2012.01135.x.
Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., et al. (2010). Epistemic
vigilance. Mind & Language, 25(4), 359–393. doi:10.1111/j.1468-0017.2010.01394.x.
Stapel, D. (2012). Ontsporing. Amsterdam: Prometheus.
Tseng, S., & Fogg, B. (1999). Credibility and computing technology. Communications of the
ACM, 42(5), 39–44.
Vanderbilt, K. E., Liu, D., & Heyman, G. D. (2012). The development of distrust. Child Devel-
opment, 82(5), 1372–1380. doi:10.1111/j.1467-8624.2011.01629.x.
Vetenskap and Allmänhet. (2015). VA Barometer 2014/15. Stockholm.
Wellcome Trust. (2013). Engaging with science. The Wellcome Trust Monitor. Retrieved from
http://www.wellcome.ac.uk/stellent/groups/corporatesite/@msh_grants/documents/web_document/wtp052590.pdf
Whyte, K. P., & Crease, R. P. (2010). Trust, expertise, and the philosophy of science. Synthese,
177(3), 411–425. doi:10.1007/s11229-010-9786-3.
Wilholt, T. (2013). Epistemic trust in science. The British Journal for the Philosophy of Science,
64, 233–253. doi:10.1093/bjps/axs007.
Wissenschaft im Dialog. (WiD). (2014). Wissenschaftsbarometer 2014. Berlin.
Wolpert, L. (1992). The unnatural nature of science. Cambridge, MA: Harvard University Press.
... Trust in scientists and their findings is crucial for modern societies (Hendriks et al., 2016) and key to effective science communication that can tackle global challenges such as pandemics (Algan et al., 2021) and the climate crisis (Cologna & Siegrist, 2020). Even though scientists, in general, enjoy a relatively high level of trust (Hoogeveen et al., 2022;B. ...
Article
Full-text available
Trust in science is polarized along political lines—but why? We show across a series of highly controlled studies (total N = 2,859) and a large-scale Twitter analysis ( N = 3,977,868) that people across the political spectrum hold stereotypes about scientists’ political orientation (e.g., “scientists are liberal”) and that these stereotypes decisively affect the link between their own political orientation and their trust in scientists. Critically, this effect shaped participants’ perceptions of the value of science, protective behavior intentions during a pandemic, policy support, and information-seeking behavior. Therefore, these insights have important implications for effective science communication.
Article
Innovation research fighting for public attention and counteracting science-skeptical views raise the need for insights into why individuals are motivated to engage with scientific knowledge. Guided by the Planned Risk Information Seeking Model (PRISM), additionally considering mistrust in science and innovativeness, the study aimed to explain individuals’ intention to seek information about medical innovations. Findings of an online survey among German residents ( N = 5,322) supported the utility of the extended PRISM to predict seeking intent. Most of the postulates of the PRISM were supported; mistrust served as a barrier to engagement with scientific knowledge, whereas innovativeness was of minor relevance.
Preprint
Full-text available
Abstract: Hydrologic modeling is an essential tool for analyzing the environmental effects of wildfires. Simulations of watershed behavior are uniquely suited to emergency assessments in which data are limited and time is scarce, such as those performed under the Burned Area Emergency Response (BAER) Program used by Federal Land Management Agencies in the United States. In these situations—when the values at risk (VARS) include lives and property—it is critical to remember: “All models are wrong, but some are useful” (Box and Draper, 1987). However, all too often, neither reports nor results rigorously reflect this imperative. With the wildfire crisis worsening each year, improving the state of the practice can be a strategic force multiplier for agencies, NGOs, and researchers alike. Herein, the twin questions of how wrong and how useful are used as the foundation for an overview of meaningful modeling within the context of postfire hydrologic assessments. Therefore, this paper focuses on how to: (1) think about watershed modeling, (2) select a modeling strategy, and (3) present the simulations in a meaningful way. The beginning and the end—the bread of a modeling sandwich. Nearly a third of the content is about science communication. While the focus is on burnt watersheds, BAER, and the US, the basic principles of modeling, grappling with uncertainty, and science communication are universal—and often not taught in many academic programs. [This provisional version has not undergone use testing or formal review by theUS Forest Service and will continue to evolve until the agency officially releases it. However, it was included as chapter 9 of Wheelock S.J. (2024) Marscapes to Terrestrial Moonscapes: A Variety of Water Problems."
Article
Il rapporto di fiducia tra scienza e società è da tempo oggetto di analisi. In questo lavoro si utilizza la chiave della fiducia nella scienza per esplorare le opinioni della comunità studentesca delle scuole secondarie italiane sull'Europa, il suo sistema di valori ? percepiti e desiderati ? e sul sentimento identitario. L'indagine ? Futuri per l'Educazione e l'Europeità ? è stata rea-lizzata nel 2021 dal CNR in collaborazione con il Ministero dell'Istruzione e del Merito e ha coinvolto le Consulte Provinciali degli Studenti italiane. I ri-sultati non solo evidenziano livelli elevati di fiducia nella scienza ma anche una relazione tra fiducia nella scienza e visioni valoriali rispetto all'Europa: un più forte sentimento europeista, una chiara apertura al mondo, una mag-giore attitudine alla partecipazione e alla solidarietà. Considerato infine che la fiducia nella scienza risulta più elevata nei licei che nei tecnici e professio-nali – e che i primi rispecchiano condizioni socio-economiche più favorevoli – emerge l'importanza della lotta alle disuguaglianze sia nel determinare la fiducia nella scienza che nella costruzione di una visione di Europa aperta, solidale, partecipata.
Article
Full-text available
Given their lack of background knowledge, laypeople require expert help when dealing with scientific information. To decide whose help is dependable, laypeople must judge an expert's epistemic trustworthiness in terms of competence, adherence to scientific standards, and good intentions. Online, this may be difficult due to the often limited and sometimes unreliable source information available. To measure laypeople's evaluations of experts (encountered online), we constructed an inventory to assess epistemic trustworthiness on the dimensions expertise, integrity, and benevolence. Exploratory (n = 237) and confirmatory factor analyses (n = 345) showed that the Muenster Epistemic Trustworthiness Inventory (METI) is composed of these three factors. A subsequent experimental study (n = 137) showed that all three dimensions of the METI are sensitive to variation in source characteristics. We propose using this inventory to measure assignments of epistemic trustworthiness, that is, all judgments laypeople make when deciding whether to place epistemic trust in-and defer to-an expert in order to solve a scientific informational problem that is beyond their understanding.
Chapter
In academic debate, trust is modeled either as a state or as a relation between trustee and trustor. This chapter systematizes a number of key features that influence the process of trust in order to make visible the tasks of future research. It discusses the differentiation of objects of trust (system, organization, role holders, and performance or product). The various factors pertaining to the trustee that influence the trustee’s trustworthiness are presented. With reference to knowledge and experiences, personality features, situational features, and the differentiation of various communication situations, the chapter describes which elements pertaining to the trustor influence trust. Trust only becomes risky when it manifests itself in the form of an action. Only a person who acts risks something and makes himself or herself vulnerable and dependent on the trustee. The chapter explains what effects digitalization has on the development of trust. On the one hand, digital possibilities for developing equivalents of trust are opening up; on the other hand, trustees have to find new forms of presenting their trustworthiness.
Article
This chapter shows how information evaluation relates to the general epistemological question of the role of trust in knowledge. It presents a sketch of the social epistemology research program, before focusing on the epistemological dimension of trust. Epistemological externalism approach centered on causality aims at understanding which are the "reliable" processes-that is, those which favor truth-in the formation of beliefs. Approaches which are critical of conventional epistemology can be concerned by truth and realism. According to the philosopher Paul Grice, conversational exchange requires at least a form of trust in the other person, that is, an assumption of the willingness of the interlocutor to cooperate for the success of the communication.
Article
This paper offers a new interpretation of the first chapter of Aristotle's Rhetoric and of Aristotle's understanding of rhetoric throughout the treatise. I defend the view that, for Aristotle, rhetoric was a skill in offering the listener 'proofs' (pisteis), that is, proper grounds for conviction. His arguments in the opening chapters of the treatise state and defend this controversial, epistemically normative view against the rival views of Gorgias, Thrasymachus and the rhetorical handbook writers, on the one hand, and against those of Plato, on the other. Aristotle defends his view on the basis that rhetoric is a skill in discharging an important role in the state - the role of helping citizens to good publicly-deliberated judgements.
Article
Because modern societies are built on elaborate divisions of cognitive labor, individuals remain laypersons in most knowledge domains. Hence, they have to rely on others' expertise when deciding on many science-related issues in private and public life. Even children already locate and discern expertise in the minds of others (e.g., Danovitch & Keil, 2004). This study examines how far university students accurately judge experts' pertinence for science topics even when they lack proficient knowledge of the domain. Participants judged the pertinence of experts from diverse disciplines based on the experts' assumed contributions to texts adapted from original articles from Science and Nature. Subjective pertinence judgments were calibrated by comparing them with bibliometrics of the original articles. Furthermore, participants' general science knowledge was controlled. Results showed that participants made well-calibrated pertinence judgments regardless of their level of general science knowledge. © 2015 Cognitive Science Society, Inc.