Psychological Science
21(12) 1765–1769
© The Author(s) 2010
Reprints and permission: sagepub.com/journalsPermissions.nav
DOI: 10.1177/0956797610388812
http://pss.sagepub.com

Research Report

"You Talkin' to Me?": Self-Relevant Auditory Signals Influence Perception of Gaze Direction

Raliza S. Stoyanova, Michael P. Ewbank, and Andrew J. Calder
MRC Cognition and Brain Sciences Unit, Cambridge, England

Abstract

In humans, direct gaze typically signals a deliberate attempt to communicate with an observer. An auditory signal with similar signal value is calling someone's name. We investigated whether the presence of this personally relevant signal in the auditory modality would influence perception of another individual's gaze. Participants viewed neutral faces displaying different gaze deviations while hearing someone call their own name or the name of another person. Results were consistent with our predictions, as participants judged faces with a wider range of gaze deviations as looking directly at them when they simultaneously heard their own name. The influence of this personally relevant signal was present only at ambiguous gaze deviations; thus, an overall response bias to categorize gaze as direct when hearing one's own name cannot account for the results. This study provides the first evidence that communicative intent signaled via the auditory modality influences the perception of another individual's gaze.

Keywords

face perception, facial features, auditory perception, social cognition

Received 1/26/10; Revision accepted 6/18/10

Corresponding Author:
Raliza S. Stoyanova, MRC Cognition and Brain Sciences Unit, 15 Chaucer Rd., Cambridge CB2 7EF, United Kingdom
E-mail: raliza.stoyanova@mrc-cbu.cam.ac.uk

Eye gaze not only carries information regarding a person's direction of attention, but also provides clues to his or her future intentions and actions (Baron-Cohen, 1995). The ability to discriminate direct from averted gaze is present from birth (Farroni, Csibra, Simion, & Johnson, 2002). In adults, direct gaze potentiates detection of faces (Senju & Hasegawa, 2005), as well as discrimination of their gender (Macrae, Hood, Milne, Rowe, & Mason, 2002) and identity (Hood, Macrae, Cole-Davies, & Dias, 2003). Direct gaze may facilitate perception because it is a deliberate, or ostensive, signal (Sperber & Wilson, 1995) that indicates to the observer that something of importance is about to be communicated (Frith, 2008). However, gaze direction is only one component of a much richer set of social signals from visual and auditory modalities, including facial expression, body posture, and verbal and nonverbal vocal information.

In the auditory modality, a signal with value similar to that of direct gaze is hearing one's own name (Moray, 1959). Although sensitivity to one's own name emerges over the first year of life (Mandel, Jusczyk, & Pisoni, 1995), sensitivity to another auditory ostensive cue, being spoken to in a slow, deliberate manner (motherese), is present in the first month after birth (Cooper & Aslin, 1990). In a study indicative of the common ostensive value of these auditory and gaze cues, infants followed an adult's gaze to objects in their environment only if the gaze shift was preceded by an ostensive signal, such as a period of direct eye contact or infant-directed speech (Senju & Csibra, 2008). Note that these two signals not only have a similar function but also often co-occur. For example, a mother would typically speak in motherese as she engages her infant in mutual gaze. Similarly, in order to attract the attention of an adult, people often call the person's name as they make direct eye contact with him or her. An important, but as yet unaddressed, question arising from the natural co-occurrence and shared value of these two signals is whether hearing one's own name would enhance one's perception of direct gaze in the visual modality.

Previous work has shown that visual context may affect where one thinks others are looking. For example, Lobmaier, Fischer, and Schwaninger (2006) showed that an object near the line of sight causes the perceived gaze direction to gravitate toward that object, presumably because one expects people to be attending to objects rather than to an empty space.
Of more relevance to the current study is research showing that the range of gaze deviations that participants perceive to be directed at themselves is affected by facial expression, with angry expressions increasing this range relative to fearful or neutral expressions (Ewbank, Jennings, & Calder, 2009; Lobmaier, Tiddeman, & Perrett, 2008). To the best of our knowledge, no prior research has addressed whether any form of auditory cue exerts an influence on gaze discrimination. However, there is good reason to think that this may be the case, in view of the principle that social stimuli with the same signal value (Adams & Kleck, 2003, 2005) influence the perception of one another. Moreover, recent work suggests that the prefrontal region involved in inferring the mental states of other people is engaged both in response to hearing one's own name (vs. another name) and in response to viewing direct gaze (vs. averted gaze; Kampe, Frith, & Frith, 2003).

Kampe et al. (2003) suggested that visual and auditory ostensive cues activate theory-of-mind computations in order to facilitate the interpretation of subsequent communication. However, Conty, N'Diaye, Tijus, and George (2007) have reported that the prefrontal response to direct gaze temporally precedes the response in perceptual regions, such as the superior temporal sulcus, that are thought to be involved in coding the direction (Calder et al., 2007; Nummenmaa & Calder, 2009) or intentionality (e.g., Pelphrey, Viola, & McCarthy, 2004) conveyed by gaze. This suggests that the analysis of gaze direction may be amenable to top-down influences from prefrontal regions that are sensitive to direct gaze and the relevance of one's name. On the basis of these findings and the observation that social stimuli with shared signal value can influence the perception of one another (Adams & Kleck, 2003, 2005), we hypothesized that hearing one's own name, relative to another name, would increase the range over which one feels that another individual's gaze is directed at oneself.

To test this hypothesis, we had participants categorize the gaze direction of neutral faces whose gaze varied from left to right in small, incremental steps. Faces were presented individually and were accompanied by auditory presentation of the participant's own name or another person's name. Psychometric functions, fitted to the proportion of "left," "right," and "direct" responses, provided an objective dependent measure of the range of gaze deviations perceived as direct when participants heard their own name and when they heard another person's name.
Method

Participants

Eighteen volunteers with normal or corrected-to-normal vision (6 male, 12 female; mean age = 24.7 years, SD = 4.92 years) were recruited from the MRC Cognition and Brain Sciences Unit Volunteer Panel. They provided written informed consent and were paid for participating. One participant did not complete the entire experiment and was excluded from the analysis, leaving a final sample of 17 (6 male, 11 female; mean age = 24.8 years, SD = 5.03 years).

Stimuli

The face stimuli consisted of gray-scale photographs of four males posing neutral expressions. The photographs were selected from the NimStim Face Stimulus Set (Tottenham et al., 2009) and the Karolinska Directed Emotional Faces database (Lundqvist, Flykt, & Öhman, 1998). Nonfacial areas and hair were masked, leaving the central face area visible. The facial images subtended a visual angle of 12° by 8°. Following previous research (Adams & Kleck, 2005; Ewbank et al., 2009), we manipulated gaze by altering the position of the iris of each eye in incremental steps of 1 pixel per image using Adobe Photoshop. This alteration is equivalent to a shift of 0.03 cm, or a visual angle of approximately 1/12°. We used a total of seven gaze deviations: true direct gaze plus shifts of 4, 7, and 10 pixels for left and right gaze (see Fig. 1). These deviations were chosen on the basis of a previous study showing that true direct gaze is largely perceived as direct, gaze that is 10 or more pixels from direct is mostly perceived as averted, and gaze that is 4 pixels from direct is perceived as direct and averted equally often (Ewbank et al., 2009).

Auditory stimuli consisted of recordings of four male native English speakers calling participants' first names (e.g., "Sarah!"), as well as a set of control names. Recordings were made in a soundproof booth. For each participant, three gender- and syllable-matched names served as controls. There was no significant difference (p > .3) in the duration of participants' own names (M = 571.65 ms, SE = 8.9 ms) and the duration of control names (M = 562.88 ms, SE = 4.75 ms). All sound files were normalized in amplitude, using Adobe Audition 2 (http://www.adobe.com).

Fig. 1. Examples of the stimuli. Each facial identity displayed a neutral expression at seven gaze deviations: 10 pixels to the left, 7 pixels to the left, 4 pixels to the left, direct gaze, 4 pixels to the right, 7 pixels to the right, and 10 pixels to the right.
Procedure

Participants were seated 50 cm in front of a computer monitor. A chin rest was used to maintain head position and distance from the screen. Each trial began with a central fixation cross, which remained on-screen for between 500 and 1,750 ms (M = 1,125 ms). A variable duration was used in order to reduce expectancy effects. The fixation cross was followed by a centrally presented face for 600 ms. The face was displayed on a gray background and was presented together with an auditory name delivered through Sennheiser (Wedemark, Germany) HD465 stereo headphones. Following the offset of the face and name, there was a 2,000-ms interval during which participants were required to press one of three buttons according to whether they perceived the face to be looking to their left, to their right, or directly at them. Participants were instructed that auditory stimuli were irrelevant to the task and that they should concentrate on categorizing the direction of the gaze as accurately as possible; speed of response was not emphasized. There were a total of 256 randomly presented trials: Sixty-four trials presented direct gaze, and 64 presented each of the deviated gaze magnitudes (4, 7, and 10 pixels). The trials presenting deviated gaze were equally divided between left-oriented and right-oriented gaze. At each gaze deviation, the participant's own name and the three control names each appeared on one quarter of the trials.
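As a concrete illustration, the balanced trial structure described above could be generated as in the following sketch. This is not the authors' experiment code; the name labels ("own", "control1", etc.) and the dictionary layout are hypothetical conveniences.

```python
import random

# 64 direct-gaze trials and 64 trials at each deviation magnitude (pixels),
# the latter split equally between leftward (negative) and rightward gaze.
gaze_levels = {0: 64, 4: 64, 7: 64, 10: 64}
names = ["own", "control1", "control2", "control3"]  # hypothetical labels

trials = []
for magnitude, n in gaze_levels.items():
    if magnitude == 0:
        deviations = [0] * n
    else:
        deviations = [-magnitude] * (n // 2) + [magnitude] * (n // 2)
    # Each of the four names appears on one quarter of the trials
    # at each gaze deviation (cycling assigns 16 per name per level).
    for i, dev in enumerate(deviations):
        trials.append({"deviation": dev, "name": names[i % 4]})

random.shuffle(trials)  # trials were presented in random order
print(len(trials))      # 256
```

Shuffling the full list at once, rather than blocking by condition, matches the fully randomized presentation order reported in the Procedure.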
Results

For each participant, separate logistic functions were fitted to the proportion of "left" and "right" responses as a function of gaze deviation. A function for "direct" responses was calculated by subtracting the sum of "left" and "right" responses from 1 at each gaze deviation. From these data, we calculated the crossover point between the fitted "direct" and "left" functions and between the fitted "direct" and "right" functions (see Fig. 2). The sum of these two absolute values provided an objective measure of the range of gaze deviations that each participant perceived to be direct for each of two conditions: the own-name condition and the other-name condition. Following Gamer and Hecht (2007), who likened this range to a cone, we refer to these as cone-of-gaze values.
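The fitting procedure just described can be sketched numerically. The code below is a minimal illustration, not the authors' analysis script: the response proportions are hypothetical, and the two-parameter logistic form is an assumption (the paper does not specify the parameterization).

```python
import numpy as np
from scipy.optimize import brentq, curve_fit

def logistic(x, x0, k):
    """Two-parameter logistic function of gaze deviation."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical per-participant response proportions at the seven deviations.
deviations = np.array([-10, -7, -4, 0, 4, 7, 10], dtype=float)
p_left  = np.array([0.95, 0.80, 0.45, 0.05, 0.02, 0.01, 0.00])
p_right = np.array([0.00, 0.01, 0.02, 0.05, 0.45, 0.80, 0.95])

# "Left" responses increase with leftward (negative) deviation, so fit on -x.
(l_x0, l_k), _ = curve_fit(logistic, -deviations, p_left,  p0=(5.0, 1.0))
(r_x0, r_k), _ = curve_fit(logistic,  deviations, p_right, p0=(5.0, 1.0))

left_fit   = lambda x: logistic(-x, l_x0, l_k)
right_fit  = lambda x: logistic(x, r_x0, r_k)
direct_fit = lambda x: 1.0 - left_fit(x) - right_fit(x)  # "direct" = 1 - left - right

# Crossover points: where the "direct" function intersects "left" and "right".
left_cross  = brentq(lambda x: direct_fit(x) - left_fit(x),  -10, 0)
right_cross = brentq(lambda x: direct_fit(x) - right_fit(x),  0, 10)

# Cone of gaze: sum of the absolute crossover values, in pixels.
cone_of_gaze = abs(left_cross) + abs(right_cross)
print(round(cone_of_gaze, 2))
```

Computing one such cone-of-gaze value per participant and condition yields the dependent measure entered into the ANOVAs below.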
A repeated measures analysis of variance (ANOVA) revealed a significant main effect of condition, F(1, 16) = 4.61, p < .05, ηp² = .22, reflecting a wider cone of gaze for the own-name condition (M = 7.5 pixels, SE = 0.47) than for the other-name condition (M = 6.8 pixels, SE = 0.41). Although the effect of condition appeared to be slightly larger for rightward than for leftward gaze deviations, the difference between the crossover points for the own- and other-name conditions did not differ significantly between rightward and leftward deviations, t(16) = −1.03, p = .32, two-tailed. Hence, the following analyses collapsed across leftward and rightward gaze deviations.

Fig. 2. Plot showing the mean fitted logistic functions for "left," "right," and "direct" responses in the own-name condition and the other-name condition. The dashed lines indicate the crossover points, and the arrows indicate the width of the cone of gaze.
To exclude the possibility that the effects reflected an overall bias for participants to label gaze as direct when they heard their own name, we entered the proportion of "direct" responses into a 2 × 4 ANOVA with condition (own-name or other-name) and gaze deviation (0, 4, 7, or 10 pixels) as repeated measures factors. This analysis revealed the expected main effect of gaze deviation, F(3, 48) = 408.86, p < .001, ηp² = .96, and a trend for an effect of condition, F(1, 16) = 4.21, p = .06, ηp² = .21. Critically, there was also a significant interaction between the two factors, F(3, 48) = 7.94, p < .001, ηp² = .33. Simple-effects analyses, at each gaze deviation, showed an effect of condition only at the 4-pixel deviation, F(1, 16) = 9.61, p < .01, ηp² = .38, for which the perceived direction of gaze was somewhat ambiguous. There was no effect of condition for faces that showed true direct gaze (0 pixels, F < 1), nor for faces with gaze deviations of 7 or 10 pixels (Fs < 2.13, ps > .16). Thus, there was no evidence that hearing one's own name resulted in an overall bias to respond "direct."
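A simple effect of condition at a single gaze deviation, with one numerator degree of freedom, is equivalent to a paired comparison across participants (F = t² in that case). The sketch below illustrates this check on hypothetical per-participant proportions of "direct" responses at the ambiguous 4-pixel deviation; the numbers are invented for illustration and do not reproduce the reported statistics.

```python
from scipy import stats

# Hypothetical proportion of "direct" responses at the 4-pixel deviation,
# one value per participant, in each condition.
own_name   = [0.55, 0.60, 0.52, 0.58, 0.62, 0.50, 0.57, 0.54]
other_name = [0.48, 0.50, 0.47, 0.49, 0.55, 0.44, 0.51, 0.46]

# Paired (repeated measures) comparison of the two conditions.
t_stat, p_value = stats.ttest_rel(own_name, other_name)
print(t_stat > 0, p_value < 0.05)
```

Repeating such a comparison at each deviation, as the simple-effects analyses do, tests whether the own-name advantage is confined to ambiguous gaze rather than reflecting a global response bias.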
Discussion

This study provides the first evidence that an ostensive signal in the auditory modality (calling a person's name) has an influence on the perception of an ostensive signal with which it frequently co-occurs in the visual modality (direct gaze). As we predicted, participants reported direct gaze over a wider range of gaze deviations when they heard their own name than when they heard the name of another person. Further analyses showed that this auditory signal did not affect responses when gaze was clearly direct or clearly averted, providing no evidence of an overall bias to respond "direct" in response to hearing one's own name. Instead, the effect was maximal when gaze direction was intermediate, or ambiguous (i.e., the 4-pixel gaze deviation). This pattern accords with previous research showing that, within the visual modality, facial expression has an influence on the perception of gaze when gaze direction is relatively difficult to discriminate (Ewbank et al., 2009), and, similarly, that gaze has a greater reciprocal influence on the processing of expression when the expression is more difficult to discriminate (Graham & LaBar, 2007). Similar effects have also been found in the multisensory perception of emotional expression: Signals in the nontarget modality exert a larger influence on perception of signals in the target modality when the latter are degraded or ambiguous (e.g., Collignon et al., 2008; de Gelder & Vroomen, 2000). Future research should examine whether the bidirectional influence between visual and auditory emotional cues extends to ostensive cues. For example, would direct gaze also increase the probability of detecting ambiguous or degraded presentations of one's own name?

Although gaze direction is an important cue to the current and future intentions of individuals in one's environment (Baron-Cohen, 1995), it occurs in a rich visual and auditory context. It is important to note that some of the signals that co-occur with gaze convey the same behavioral intent. Within the visual modality, for example, perception of an angry or joyful facial expression is facilitated by direct gaze, as both signals are associated with approach-related behavior (Adams & Kleck, 2003; Graham & LaBar, 2007). By contrast, perception of fearful facial expressions may be facilitated when they display averted gaze, because both fearful expressions and averted gaze signal avoidance (Adams & Kleck, 2005; but see Bindemann, Burton, & Langton, 2008). Similarly, as already discussed, a reciprocal influence of angry expressions on gaze perception is also found; observers are more likely to perceive direct gaze in an angry facial expression than in a fearful or neutral facial expression (Ewbank et al., 2009; Lobmaier et al., 2008). The present study builds on this work by showing that, without any change in the visual characteristics of a face, a concurrent auditory signal that conveys the intent to communicate can influence the perception of direct gaze, a visual signal conveying the same behavioral intent.

An interesting empirical question is whether the effect of auditorily presented names on gaze discrimination is limited to participants' own names or whether it extends to other people's names. For example, might participants be more likely to perceive ambiguous gaze as averted toward a friend sitting beside them if the friend's name was called out? This would provide evidence that the range of gaze deviations perceived as direct can be both widened and narrowed, and would also discount the idea that familiarity of the name alone can explain the current results. It would also suggest that the reported effect is not limited to self-relevant cues, but is a more general property of cues that share intent and often co-occur.

Another direction for future research concerns the neural mechanisms through which visual and auditory ostensive signals may interact. As suggested by research showing that the prefrontal response to direct gaze is earlier than the superior temporal response (Conty et al., 2007), it is possible that the increased perception of direct gaze when one hears one's own name is a result of a top-down modulation of superior temporal regions involved in gaze perception. Although a number of researchers have suggested that prefrontal mentalizing regions might be a source of top-down signals in tasks that involve making inferences about other individuals' intentions (Frith & Frith, 2006; Nummenmaa & Calder, 2009; Teufel et al., 2009), this remains to be established empirically. Finally, given infants' early sensitivity to both visual and auditory ostensive signals (e.g., Senju & Csibra, 2008), further work, using behavioral and neuroimaging methods, could also examine at what point in development ostensive cues in one modality begin exerting their influence on perception of ostensive cues in another modality.

In summary, we have shown that gaze is more likely to be perceived as direct when accompanied by auditory presentation of one's own name, a signal that shares similar behavioral intent and frequently co-occurs with direct gaze. This suggests that perception of another person's gaze direction is affected not only by salient visual cues, such as facial expression (e.g., Ewbank et al., 2009; Lobmaier et al., 2008), but also by auditory cues with similar signal value.
Acknowledgments
R.S.S. holds a Gates Cambridge Scholarship and an Overseas
Research Studentship. We thank Simon Strangeways for preparing
the visual stimuli and Colin W. Clifford for helping with the logistic
function analysis.
Declaration of Conflicting Interests
The authors declared that they had no conflicts of interest with
respect to their authorship or the publication of this article.
Funding
This research was funded by the United Kingdom Medical Research
Council (U.1055.02.001.0001.01 to A.J.C.).
References

Adams, R.B., & Kleck, R.E. (2003). Perceived gaze direction and the processing of facial displays of emotion. Psychological Science, 14, 644–647.

Adams, R.B., & Kleck, R.E. (2005). Effects of direct and averted gaze on the perception of facially communicated emotion. Emotion, 5, 3–11.

Baron-Cohen, S. (1995). Mindblindness: An essay on autism and theory of mind. Cambridge, MA: MIT Press.

Bindemann, M., Burton, A., & Langton, S.R.H. (2008). How do eye gaze and facial expression interact? Visual Cognition, 16, 708–733.

Calder, A.J., Beaver, J.D., Winston, J.S., Dolan, R.J., Jenkins, R., Eger, E., et al. (2007). Separate coding of different gaze directions in the superior temporal sulcus and inferior parietal lobule. Current Biology, 17, 20–25.

Collignon, O., Girard, S., Gosselin, F., Roy, S., Saint-Amour, D., Lassonde, M., et al. (2008). Audio-visual integration of emotion expression. Brain Research, 1242, 126–135.

Conty, L., N'Diaye, K., Tijus, C., & George, N. (2007). When eye creates the contact! ERP evidence for early dissociation between direct and averted gaze motion processing. Neuropsychologia, 45, 3024–3037.

Cooper, R.P., & Aslin, R.N. (1990). Preference for infant-directed speech in the first month after birth. Child Development, 61, 1584–1595.

de Gelder, B., & Vroomen, J. (2000). The perception of emotions by ear and by eye. Cognition & Emotion, 14, 289–311.

Ewbank, M.P., Jennings, C., & Calder, A.J. (2009). Why are you angry with me? Facial expressions of threat influence perception of gaze direction. Journal of Vision, 9(12), Article 16. Retrieved November 29, 2009, from http://www.journalofvision.org/content/9/12/16.full

Farroni, T., Csibra, G., Simion, F., & Johnson, M.H. (2002). Eye contact detection in humans from birth. Proceedings of the National Academy of Sciences, USA, 99, 9602–9605.

Frith, C.D. (2008). Social cognition. Philosophical Transactions of the Royal Society B: Biological Sciences, 363, 2033–2039.

Frith, C.D., & Frith, U. (2006). How we predict what other people are going to do. Brain Research, 1079, 36–46.

Gamer, M., & Hecht, H. (2007). Are you looking at me? Measuring the cone of gaze. Journal of Experimental Psychology: Human Perception and Performance, 33, 705–715.

Graham, R., & LaBar, K.S. (2007). Garner interference reveals dependencies between emotional expression and gaze in face perception. Emotion, 7, 296–313.

Hood, B.M., Macrae, C.N., Cole-Davies, V., & Dias, M. (2003). Eye remember you: The effects of gaze direction on face recognition in children and adults. Developmental Science, 6, 67–71.

Kampe, K.K.W., Frith, C.D., & Frith, U. (2003). "Hey John": Signals conveying communicative intention toward the self activate brain regions associated with "mentalizing," regardless of modality. The Journal of Neuroscience, 23, 5258–5263.

Lobmaier, J.S., Fischer, M.H., & Schwaninger, A. (2006). Objects capture perceived gaze direction. Experimental Psychology, 53, 117–122.

Lobmaier, J.S., Tiddeman, B.P., & Perrett, D.I. (2008). Emotional expression modulates perceived gaze direction. Emotion, 8, 573–577.

Lundqvist, D., Flykt, A., & Öhman, A. (1998). The Karolinska Directed Emotional Faces—KDEF [CD-ROM]. Stockholm, Sweden: Karolinska Institute.

Macrae, C.N., Hood, B.M., Milne, A.B., Rowe, A.C., & Mason, M.F. (2002). Are you looking at me? Eye gaze and person perception. Psychological Science, 13, 460–464.

Mandel, D.R., Jusczyk, P.W., & Pisoni, D.B. (1995). Infants' recognition of the sound patterns of their own names. Psychological Science, 6, 314–317.

Moray, N. (1959). Attention in dichotic listening: Affective cues and the influence of instructions. Quarterly Journal of Experimental Psychology, 11, 56–60.

Nummenmaa, L., & Calder, A.J. (2009). Neural mechanisms of social attention. Trends in Cognitive Sciences, 13, 135–143.

Pelphrey, K.A., Viola, R.J., & McCarthy, G. (2004). When strangers pass: Processing of mutual and averted social gaze in the superior temporal sulcus. Psychological Science, 15, 598–603.

Senju, A., & Csibra, G. (2008). Gaze following in human infants depends on communicative signals. Current Biology, 18, 668–671.

Senju, A., & Hasegawa, T. (2005). Direct gaze captures visuospatial attention. Visual Cognition, 12, 127–144.

Sperber, D., & Wilson, D. (1995). Relevance: Communication and cognition (2nd ed.). Oxford, England: Blackwell.

Teufel, C., Alexis, D.M., Todd, H., Lawrance-Owen, A.J., Clayton, N.S., & Davis, G. (2009). Social cognition modulates the sensory coding of observed gaze direction. Current Biology, 19, 1274–1277.

Tottenham, N., Tanaka, J.W., Leon, A.C., McCarry, T., Nurse, M., Hare, T.A., et al. (2009). The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Research, 168, 242–249.
... Furthermore, and highly relevant for the present study, these studies also showed that the selfrelevance of the contents of the speech had an effect on the perception of gaze direction. Stoyanova et al. (2010) presented a series of gaze deviation stimuli accompanied by speech stimuli in which someone was calling either the participant's name or another person's name. The results showed that participants judged faces with a wider range of gaze deviations as looking at them when they simultaneously heard their own name versus another's name. ...
... We adopted the gaze cone paradigm in which we presented a series of face stimuli (or arrows) with varying angular deviations accompanied by speech stimuli (short utterances) in which the speaker was expressing either a positive emotion ("I like") or a negative emotion ("I hate") and was addressing either the participant ("you") or somebody else ("him"/ "her"). Based on the self-referential positivity hypothesis and empirical findings by Perrett (2011) andLobmaier et al. (2008) and on the findings of the effects of self-referential speech on gaze perception (Stoyanova et al., 2010;Vida & Maurer, 2013), we anticipated that the cone of gaze would be wider in the context of positive versus negative emotional speech when the speech is self-relevant as compared to when it is other-relevant. When judging whether the self is a target or not, it is possible that contextual speech always influences the directional judgments, regardless of the stimulus type. ...
... The present findings successfully repeated previous research showing a wider cone of gaze in the context of self-directed versus other-directed speech (Stoyanova et al., 2010;Vida & Maurer, 2013). A more interesting and novel finding was that we found judgments of gaze perception to be modulated by emotional speech, but only when the speech was targeted to the participants themselves; in this situation the cone of gaze was wider in emotionally positive than negative context. ...
Article
Full-text available
The perception of another individual’s gaze direction is not a low-level, stimulus-driven visual process but a higher-level process that can be top-down modulated, for example, by emotion and theory of mind. The present study investigated the influence of directed (self vs. other) and emotional (positive vs. negative) speech on judging whether another individual’s gaze or an arrow is directed toward the self or not. Experiments 1 and 2 showed that participants perceived a wider range of gaze deviations as looking at them when the speech was directed to themselves versus others. Importantly, the emotion in speech also impacted gaze judgments, but only when the speech was related to the participants themselves: the gaze cone was greater for positive than for negative self-relevant speech. This pattern of results was observed regardless of whether the speech was task-relevant (Experiment 1) or task-irrelevant (Experiment 2). Additionally, the results from Experiment 3 showed that the directed and emotional information in the speech had no impact on the judgments of the direction of an arrow. These findings expand our knowledge of the interaction between the perception of emotions and gaze direction and emphasize the significance of self-relevance in modulating this interaction.
... Disruption to either of these processes could result in atypical gaze perception. For example, temporary or situation-dependent factors that add perceptual noise (e.g., due to distance or lighting intensity [17]) can reduce the precision of gaze perception, and those that evoke a self-related belief (e.g., when one's name is being called [18]) can increase the likelihood of perceiving selfdirected gaze. Additionally, in psychiatric disorders in which abnormal gaze perception is well-documented (e.g., schizophrenia, ASD, and social anxiety), a large body of evidence also suggests disruptions in either/both visual perception and/or self-referential processing (e.g., dysfunction in low-level visual processing in ASD [19]; dysregulated self-referential tendencies in social anxiety [20,21]; deficits in both low-level visual processing [22,23] and higher-level self-referential processes [24] in schizophrenia). ...
... Another issue with previous eye contact perception studies deals with sample size. Most previous studies have used small samples, with the majority consisting of fewer than 20 participants (e.g., [9,18,30,[36][37][38][39]) and a significant portion using samples of 10 or less (e.g., [17,26,31,33,35,40]). Besides limiting the generalizability of the findings, small sample sizes also preclude the investigation of important questions about eye contact perception, such as within-subject reliability over time, relationships to other social cognitive functions, sex differences, and age effect. ...
Article
Full-text available
Eye contact perception—the ability to accurately and efficiently discriminate others’ gaze directions—is critical to understanding others and functioning in a complex social world. Previous research shows that it is affected in multiple neuropsychiatric disorders accompanied by social dysfunction, and understanding the cognitive processes giving rise to eye contact perception would help advance mechanistic investigations of psychopathology. This study aims to validate an online, psychophysical eye contact detection task through which two constituent cognitive components of eye contact perception (perceptual precision and self-referential tendency) can be derived. Data collected from a large online sample showed excellent test-retest reliability for self-referential tendency and moderate reliability for perceptual precision. Convergence validity was supported by correlations with social cognitive measures tapping into different aspects of understanding others. Hierarchical regression analyses revealed that perceptual precision and self-referential tendency explained unique variance in social cognition, suggesting that they measure unique aspects of related constructs. Overall, this study provided support for the reliability and validity of the eye contact perception metrics derived using the online Eye Contact Detection Task. The value of the task for future psychopathology research was discussed.
... From a broader perspective, this article enriches and empirically supports the previous literature on human-robot interaction (Kim et al., 2013), showing that consumers attribute human mind-like capabilities to the objects with which they interact depending on the level of human-likeness (Krach et al., 2008; Rosenthal-von der Pütten & Krämer, 2014). Furthermore, and consistent with the HVL framework, our model predicts that physical and behavioral chatbot cues influence customers' willingness to disclose personal information and purchase intentions, so that customers expect to derive greater value from chatbots with greater humanness. Overall, this study contributes to the advancement of theory on chatbot-mediated service encounters through the lens of the HVL framework in several ways. First, this study contributes to the literature on nonverbal sociocommunicative dimensions (Sajjacholapunt & Ball, 2014; Senju & Johnson, 2009; Stoyanova et al., 2010; Strick et al., 2008; To & Patrick, 2021). Specifically, we clarify the effects of gaze direction on consumers' willingness to disclose and future intentions. ...
Article
Full-text available
The present research focuses on the interplay between two common features of the customer service chatbot experience: gaze direction and anthropomorphism. Although the dominant approach in marketing theory and practice is to make chatbots as human‐like as possible, the current study, built on the humanness‐value‐loyalty model, addresses the chain of effects through which chatbots' nonverbal behaviors affect customers' willingness to disclose personal information and purchase intentions. By means of two experiments that adopt a real chatbot in a simulated shopping environment (i.e., car rental and travel insurance), the present work allows us to understand how to reduce individuals' tendency to see conversational agents as less knowledgeable and empathetic compared with humans. The results show that warmth perceptions are affected by gaze direction, whereas competence perceptions are affected by anthropomorphism. Warmth and competence perceptions are found to be key drivers of consumers’ skepticism toward the chatbot, which, in turn, affects consumers’ trust toward the service provider hosting the chatbot, ultimately leading consumers to be more willing to disclose their personal information and to repatronize the e‐tailer in the future. Building on the Theory of Mind, our results show that perceiving competence from a chatbot makes individuals less skeptical as long as they feel they are good at detecting others’ ultimate intentions.
... One plausible explanation for this finding is that being ostracized increases self-referential thinking (Klauke et al., 2020; Twenge et al., 2003). Another study has shown that hearing one's own name simultaneously with the gaze cone task increases the perception of gaze as direct (Stoyanova et al., 2010). This result may simply reflect multisensory perception of gaze direction that integrates both auditory and visual information to make the judgement of the self-directedness of the gaze. ...
Article
Increased thinking about one’s self has been proposed to widen the gaze cone, that is, the range of gaze deviations that an observer judges as looking directly at them (eye contact). This study investigated the effects of a self-referential thinking manipulation and demographic factors on the gaze cone. In a preregistered experiment (N = 200), the self-referential thinking manipulation, as compared to a control manipulation, did not influence the gaze cone, or the use of first-person pronouns in a manipulation check measuring self-referential processing. This may indicate a failure of the manipulation and participants’ lack of effort. However, participants’ age was significantly correlated with both measures: older people had wider gaze cones and used more self-referring pronouns. A second experiment (N = 300) further examined the effect of the manipulation and demographic factors on self-referential processing, and the results were replicated. These findings may reflect age-related self-reference and positivity effects.
... But human beings, as soon as they perceive another "mind" (or social being) in their surroundings, engage in automatic mind-reading activities to understand the attentional focus or perspective and intentions of the other (Frischen et al., 2007; Klein et al., 2009; Redcay & Schilbach, 2019). From there on, social signals including "facial expression, body posture, and verbal and nonverbal vocal information" (Stoyanova et al., 2010, p. 1765) determine to what extent observers feel that the other is mutually aware and mindful of their presence, or might even explicitly turn attention towards them (e.g., indicated by directed sensory organs). Accordingly, being addressed and feeling addressed might not constitute a binary phenomenon, but represent a continuum. ...
Chapter
Full-text available
This chapter takes a close look at the conceptualization of parasocial interaction (PSI), i.e., users' illusionary experience, during media exposure, of being in a reciprocal social interaction with a media performer (while objectively this is not the case). The chapter discusses existing conceptual challenges and boundary conditions, and proposes future research avenues. A review of PSI theory reveals that a performer's anticipated user response, and implicit forms of address have been neglected in empirical research to date. The biggest conceptual challenge to the PSI concept, however, poses the "interactivity problem." Do user interactions with online performers (influencers, streamers, etc.) and other characters in (at least partially) interactive settings still qualify as PSI? The chapter proposes that the concept can still be applied under certain conditions. PSI can be germane to interactive modalities if an individual user (a) feels like being in a reciprocal interaction with the performer; (b) feels like being directly personally addressed by the performer; and feels as if the interaction is reciprocally intimate - while it can be demonstrated that these three qualities are objectively not true.
... To examine the effect of eye gaze on voice location perception, we computed the equivalent of the "cone of direct gaze": the range of gaze deviations a person perceives to be directed toward them (Jun et al., 2013; Mareschal et al., 2013; Stoyanova et al., 2010). For each participant, logistic functions were fitted to the proportion of "left" and "right" responses for the sound data associated with each location in the two conditions separately (direct gaze vs. eyes closed). ...
Article
Full-text available
The “ventriloquism effect” describes an illusory phenomenon where the perceived location of an auditory stimulus is pulled toward the location of a visual stimulus. Ventriloquists use this phenomenon to create an illusion where an inanimate puppet is perceived to speak. Ventriloquists use the expression and suppression of their own and the puppet’s mouth movements as well as the direction of their respective eye gaze to maximize the illusion. While the puppet’s often exaggerated mouth movements have been demonstrated to enhance the ventriloquism effect, the contribution of direct eye gaze remains unknown. In Experiment 1, participants viewed an image of a person’s face while hearing a temporally synchronous recording of a voice originating from different locations on the azimuthal plane. The eyes of the facial stimuli were either looking directly at participants or were closed. Participants were more likely to misperceive a range of voice locations as coming from a central position when the eye gaze of the facial stimuli was directed toward them. Thus, direct gaze enhances the ventriloquist effect by attracting participants’ perception of the voice locations toward the location of the face. In an exploratory analysis, we furthermore found no evidence for an other-race effect between White and Asian listeners. In Experiment 2, we replicated the effect of direct eye gaze on the ventriloquism effect, also showing that faces per se attract perceived sound locations compared with audio-only sound localization. Showing a modulation of the ventriloquism effect by socially salient eye gaze information thus adds to previous findings reporting top-down influences on this effect.
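The "cone of direct gaze" analysis described in the excerpt above (fitting a logistic function to response proportions and reading off the 50% point) can be sketched with a bare-bones logistic regression in plain Python. All numbers below are fabricated for illustration: 20 trials per gaze deviation, generated from a true psychometric function whose 50% point sits at 5 degrees.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Hypothetical data: at each gaze deviation (degrees), 20 trials; the
# number of "looking at me" responses follows a logistic psychometric
# function with a true 50% point at 5 degrees (illustration only).
deviations = list(range(11))
trials = 20
direct_counts = [round(trials * sigmoid(5 - x)) for x in deviations]

# Flatten into (centered deviation, binary response) pairs; centering
# the predictor keeps the gradient ascent below well-conditioned.
data = []
for x, k in zip(deviations, direct_counts):
    data += [(x - 5, 1)] * k + [(x - 5, 0)] * (trials - k)

# Fit p(direct) = sigmoid(a + b * x) by gradient ascent on the
# Bernoulli log-likelihood (a minimal logistic regression).
a = b = 0.0
for _ in range(3000):
    ga = gb = 0.0
    for x, y in data:
        err = y - sigmoid(a + b * x)
        ga += err
        gb += err * x
    a += 0.2 * ga / len(data)
    b += 0.2 * gb / len(data)

# The 50% crossover marks the edge of the cone of direct gaze.
cone_edge = 5 - a / b  # undo the centering
print(f"estimated cone half-width: {cone_edge:.1f} degrees")
```

Real analyses would use a dedicated psychometric-fitting package and estimate both the left and right edges of the cone; this sketch fits one side only, but recovers the 5-degree edge built into the fabricated data.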
... Some studies have demonstrated that the perception of a self-directed gaze is enhanced by facial expressions indicating anger (Ewbank et al., 2009) and happiness (Lobmaier & Perrett, 2011); these expressions signal the motivation of observers to approach when accompanied by a direct gaze (Adams & Kleck, 2003). A study using multisensory stimuli reported that participants accept a wide range of averted gazes as self-directed gazes when self-relevant auditory information (i.e., one's own name) is presented simultaneously (Stoyanova et al., 2010). ...
Article
Lay abstract: The detection of a self-directed gaze is often the starting point for social interactions, and a person who feels as if they are being watched can prepare to respond to others' actions irrespective of the real gaze direction, because the other person is likely motivated to approach. Although many studies have demonstrated that individuals with autism spectrum disorder have difficulty discriminating gaze direction, it remains unclear how the perception of self-directed gaze by individuals with autism spectrum disorder differs from that of age-, sex-, and IQ-matched typically developing individuals. Participants observed faces with various gaze directions and answered whether the person in the photograph was looking at them or not. Individuals with and without autism spectrum disorder were just as likely to perceive subtle averted gazes as self-directed gazes. The frequency of perceiving a self-directed gaze decreased as gaze aversion increased in both groups and, in general, individuals with autism spectrum disorder showed a comparable ability to perceive a self-directed gaze as that of typically developing individuals. Interestingly, considering face membership of photographs (ingroup or outgroup faces), typically developing individuals, but not individuals with autism spectrum disorder, were more likely to perceive self-directed gazes from ingroup faces than from outgroup faces. However, individuals with autism spectrum disorder had different affective experiences in response to ingroup and outgroup faces. These results suggest that individuals with autism spectrum disorder did not show an ingroup bias for the perception of a self-directed gaze, and raise a possibility that an atypical emotional experience contributes to the diminished ingroup bias.
... In sum, this PhD thesis was the first, to the best of my knowledge, to examine how gaze direction and head orientation affected gaze perception in both the horizontal and vertical periphery. Future research could use these same paradigms and examine a multitude of other factors that have been shown to affect gaze processing, including age (Slessor et al., 2008), gender (Goodman et al., 2012), culture and ethnicity (Akechi et al., 2013; Krämer et al., 2013), introversion/extraversion (Ponari et al., 2013), neuroticism (Helminen et al., 2011), social dominance (Fromme & Beam, 1974), empathy (Deladisma et al., 2007), stress (Rimmele & Lobmaier, 2012), trait anxiety (Wieser et al., 2009), mood state (Wyland & Forgas, 2010), alcohol consumption (Penton-Voak et al., 2011), oxytocin (Domes et al., 2007), female menstrual cycle (Wolohan et al., 2013), hearing one's own name while viewing faces (Stoyanova et al., 2010), attractiveness of facial stimuli (Kampe et al., 2001), familiarity of facial stimuli (Hoehl et al., 2012), emotional expression of facial stimuli (Bindemann et al., 2008), viewing real faces versus pictures of faces (Hietanen et al., 2008), or mental disorders, such as autism, social anxiety disorder (Schneier et al., 2011), schizophrenia (Tso et al., 2012), social phobia (Gamer et al., 2011), anorexia nervosa (Cipolli et al., 1989), and prosopagnosia (Campbell et al., 1990). Thus, future research could examine gaze perception in many different sample populations, which would give greater insight into how the human brain's cognitive mechanism functions to detect and discriminate eye gaze. ...
Article
Background: Patients with schizophrenia show abnormal gaze processing, which is associated with social dysfunction. These abnormalities are associated with aberrant connectivity among brain regions associated with visual processing, social cognition, and cognitive control. In this study, we investigated 1) how effective connectivity during gaze processing is disrupted in schizophrenia, and 2) how this might contribute to social dysfunction and clinical symptoms. Methods: Thirty-nine patients with schizophrenia/schizoaffective disorder (SZ) and 33 healthy controls (HC) completed an eye gaze processing task during fMRI. Participants viewed faces with different gaze angles and performed explicit and implicit gaze processing. Four brain regions, the secondary visual cortex (Vis), posterior superior temporal sulcus (pSTS), inferior parietal lobule (IPL), and posterior medial frontal cortex (pMFC), were identified as nodes for dynamic causal modeling analysis. Results: SZ and HC showed similar model structures for general gaze processing. Explicit gaze discrimination led to changes in effective connectivity, including stronger excitatory, bottom-up connections from Vis to pSTS and IPL and inhibitory, top-down connections from pMFC to Vis. Group differences in top-down modulation from pMFC to pSTS and IPL were noted, such that these inhibitory connections were attenuated in HC while further strengthened in SZ. Connectivity was associated with social dysfunction and symptom severity. Discussion: SZ showed notably stronger top-down inhibition during explicit gaze discrimination, which was associated with more social dysfunction but less severe symptoms among patients. Findings help pinpoint neural mechanisms of aberrant gaze processing and may serve as future targets for interventions that combine neuromodulation with social-cognitive training.
Article
Background Abnormal eye gaze perception is related to symptoms and social functioning in schizophrenia. However, little is known about the brain network mechanisms underlying these abnormalities. Here, we employed dynamic causal modeling (DCM) of fMRI data to discover aberrant effective connectivity within networks associated with eye gaze processing in schizophrenia. Methods Twenty-seven patients (schizophrenia/schizoaffective disorder, SZ) and 22 healthy controls (HC) completed an eye gaze processing task during fMRI. Participants viewed faces with different gaze angles and performed explicit gaze discrimination (Gaze: “Looking at you?” yes/no) or implicit gaze processing (Gender: “male or female?”). Four brain regions, the secondary visual cortex (Vis), posterior superior temporal sulcus (pSTS), inferior parietal lobule (IPL), and posterior medial frontal cortex (pMFC) were identified as nodes for subsequent DCM analysis. Results SZ and HC showed similar generative model structure, but SZ showed altered connectivity for specific self-connections, inter-regional connections during all gaze processing (reduced excitatory bottom-up and enhanced inhibitory top-down connections), and modulation by explicit gaze discrimination (increased frontal inhibition of visual cortex). Altered effective connectivity was significantly associated with poorer social cognition and functioning. Conclusions General gaze processing in SZ is associated with distributed cortical dysfunctions and bidirectional connectivity between regions, while explicit gaze discrimination involves predominantly top-down abnormalities in the visual system. These results suggest plausible neural mechanisms underpinning gaze processing deficits and may serve as biomarkers for intervention.
Article
Full-text available
Emotions are expressed in the voice as well as on the face. As a first step to explore the question of their integration, we used a bimodal perception situation modelled after the McGurk paradigm, in which varying degrees of discordance can be created between the affects expressed in a face and in a tone of voice. Experiment 1 showed that subjects can effectively combine information from the two sources, in that identification of the emotion in the face is biased in the direction of the simultaneously presented tone of voice. Experiment 2 showed that this effect occurs also under instructions to base the judgement exclusively on the face. Experiment 3 showed the reverse effect, a bias from the emotion in the face on judgement of the emotion in the voice. These results strongly suggest the existence of mandatory bidirectional links between affect detection structures in vision and audition.
Article
Full-text available
In shadowing one of two simultaneous messages presented dichotically, subjects are unable to report any of the content of the rejected message. Even if the rejected message consists of a short list of simple words repeated many times, a recognition test fails to reveal any trace of the list. If numbers are interpolated in prose passages presented for dichotic shadowing, no more are recalled from the rejected messages if the instructions are specifically to remember numbers than if the instructions are general: a specific set for numbers will not break through the attentional barrier set up in this task. The only stimulus so far found that will break through this barrier is the subject's own name. It is probably only material “important” to the subject that will break through the barrier.
Article
Full-text available
Previous research has demonstrated an interaction between eye gaze and selected facial emotional expressions, whereby the perception of anger and happiness is impaired when the eyes are horizontally averted within a face, but the perception of fear and sadness is enhanced under the same conditions. The current study reexamined these claims over six experiments. In the first three experiments, the categorization of happy and sad expressions (Experiments 1 and 2) and angry and fearful expressions (Experiment 3) was impaired when eye gaze was averted, in comparison to direct gaze conditions. Experiment 4 replicated these findings in a rating task, which combined all four expressions within the same design. Experiments 5 and 6 then showed that previous findings, that the perception of selected expressions is enhanced under averted gaze, are stimulus- and task-bound. The results are discussed in relation to research on facial expression processing and visual attention.
Article
The Karolinska Directed Emotional Faces (KDEF; Lundqvist, Flykt, & Öhman, 1998) is a database of pictorial emotional facial expressions for use in emotion research. The original KDEF database consists of a total of 490 JPEG pictures (72x72 dots per inch) showing 70 individuals (35 women and 35 men) displaying 7 different emotional expressions (Angry, Fearful, Disgusted, Sad, Happy, Surprised, and Neutral). Each expression is viewed from 5 different angles and was recorded twice (the A and B series). All the individuals were trained amateur actors between 20 and 30 years of age. For participation in the photo session, beards, moustaches, earrings, eyeglasses, and visible make-up were exclusion criteria. All the participants were instructed to try to evoke the emotion that was to be expressed and to make the expression strong and clear. In a validation study (Goeleven et al., 2008), a series of the KDEF images were used and participants rated emotion, intensity, and arousal on 9-point Likert scales. In that same study, a test-retest reliability analysis was performed by computing the percentage similarity of emotion type ratings and by calculating the correlations for the intensity and arousal measures over a one-week period. With regard to the intensity and arousal measures, a mean correlation across all pictures of .75 and .78 respectively was found. (APA PsycTests Database Record (c) 2019 APA, all rights reserved)
Article
Children and adults were tested on a forced-choice face recognition task in which the direction of eye gaze was manipulated over the course of the initial presentation and subsequent test phase of the experiment. To establish the effects of gaze direction on the encoding process, participants were presented with to-be-studied faces displaying either direct or deviated gaze (i.e., encoding manipulation). At test, all the faces depicted persons with their eyes closed. To investigate the effects of gaze direction on the efficiency of the retrieval process, a second condition (i.e., retrieval manipulation) was run in which target faces were presented initially with eyes closed and tested with either direct or deviated gaze. The results revealed that the encoding advantage enjoyed by faces with direct gaze was present for both children and adults. Faces with direct gaze were also recognized better than faces with deviated gaze at retrieval, although this effect was most pronounced for adults. Finally, the advantage for direct gaze over deviated gaze at encoding was greater than the advantage for direct gaze over deviated gaze at retrieval. We consider the theoretical implications of these findings.
Article
Among the earliest and most frequent words that infants hear are their names. Yet little is known about when infants begin to recognize their own names. Using a modified version of the head-turn preference procedure, we tested whether 4.5-month-olds preferred to listen to their own names over foils that were either matched or mismatched for stress pattern. Our findings provide the first evidence that even these young infants recognize the sound patterns of their own names. Infants demonstrated significant preferences for their own names compared with foils that shared the same stress patterns, as well as foils with opposite patterns. The results indicate when infants begin to recognize sound patterns of items frequently uttered in the infants' environments.
Article
This study investigated whether the direct gaze of others influences attentional disengagement from faces in an experimental situation. Participants were required to fixate on a centrally presented face with varying gaze directions and to detect the appearance of a peripheral target as quickly as possible. Results revealed that target detection was delayed when the preceding face was directly gazing at the subject (direct gaze), as compared with an averted gaze (averted gaze) or with closed eyes (closed eyes). This effect disappeared when a temporal gap was inserted between the offset of the centrally presented face and the onset of a peripheral target, suggesting that attentional disengagement contributed to the delayed response in the direct gaze condition. The response delay to direct gaze was not found when the contrast polarity of eyes in the facial stimuli was reversed, reinforcing the importance of gaze perception in delayed disengagement from direct gaze.
Book
In Mindblindness, Simon Baron-Cohen presents a model of the evolution and development of "mindreading." He argues that we mindread all the time, effortlessly, automatically, and mostly unconsciously. It is the natural way in which we interpret, predict, and participate in social behavior and communication. We ascribe mental states to people: states such as thoughts, desires, knowledge, and intentions. Building on many years of research, Baron-Cohen concludes that children with autism suffer from "mindblindness" as a result of a selective impairment in mindreading. For these children, the world is essentially devoid of mental things. Baron-Cohen develops a theory that draws on data from comparative psychology, developmental psychology, and neuropsychology. He argues that specific neurocognitive mechanisms have evolved that allow us to mindread, to make sense of actions, to interpret gazes as meaningful, and to decode "the language of the eyes." Bradford Books imprint
Article
2 experiments examined behavioral preferences for infant-directed (ID) speech over adult-directed (AD) speech in young infants. Using a modification of the visual-fixation-based auditory-preference procedure, Experiments 1 and 2 examined whether 12 1-month-old and 16 2-day-old infants looked longer at a visual stimulus when looking produced ID as opposed to AD speech. The results showed that both 1-month-olds and newborns preferred ID over AD speech. Although the absolute magnitude of the ID speech preference was significantly greater, with the older infants showing longer looking durations than the younger infants, subsequent analyses showed no significant difference in the relative magnitude of this effect. Differences in overall looking times between the 2 groups apparently reflect task variables rather than differences in speech processing. These results suggest that infants' preference for the exaggerated prosodic features of ID speech is present from birth and may not depend on any specific postnatal experience. However, the possible role of prenatal auditory experience with speech is considered.