Fig. 1. Neutral, angry, happy, and sad faces used in Experiments 1 (upper row) and 2 (rows 2 and 3). We would like to thank Dr Ekman for his permission to reproduce the images used in Experiment 2.


Source publication
Article
Full-text available
Detection of angry, happy and sad faces among neutral backgrounds was investigated in three single emotion tasks and an emotion comparison task using schematic (Experiment 1) and photographic faces (Experiment 2). Both experiments provided evidence for the preferential detection of anger displays over displays of other negative or positive emotions...

Similar publications

Article
Full-text available
Detection of emotional facial expressions has been shown to be more efficient than detection of neutral expressions. However, it remains unclear whether this effect is attributable to visual or emotional factors. To investigate this issue, we conducted two experiments using the visual search paradigm with photographic stimuli. We included a single...
Article
Full-text available
Recently, D.V. Becker, Anderson, Mortensen, Neufeld, and Neel (2011) proposed recommendations to avoid methodological confounds in visual search studies using emotional photographic faces. These confounds were argued to cause the frequently observed Anger Superiority Effect (ASE), the faster detection of angry than happy expressions, and conceal a...
Article
Full-text available
In a visual search task using photographs of real faces, a target emotional face was presented in an array of six neutral faces. Eye movements were monitored to assess attentional orienting and detection efficiency. Target faces with happy, surprised, and disgusted expressions were: (a) responded to more quickly and accurately, (b) localized and fi...

Citations

... The decreased attention to sad faces might indicate that excluded individuals tend to avoid signals associated with exclusion and depression (DeWall et al., 2009). Furthermore, given the potentially lesser evolutionary importance of sadness for the audience (Lipp et al., 2009), excluded individuals might allocate more cognitive resources to recognising happiness as an acceptance cue (Heerdink et al., 2015; Seidel et al., 2010) and anger as a potentially threatening signal (Heerdink et al., 2015), rather than to sad facial expressions. ...
Article
Social exclusion is an emotionally painful experience that leads to various alterations in socio-emotional processing. The perceptual and emotional consequences of social exclusion can vary depending on the paradigm used to manipulate it. Exclusion paradigms differ in the severity and duration of the exclusion experience they induce, classifying it as either a short-term or a long-term experience. The present study examined the impact of exclusion on socio-emotional processing using paradigms in which participants either experienced short-term exclusion or imagined long-term exclusion. Ambiguous facial emotions were used as socio-emotional cues. In Study 1, the Ostracism Online paradigm was used to manipulate short-term exclusion. In Study 2, a new sample of participants imagined long-term exclusion through the future life alone paradigm. Participants in both studies then completed a facial emotion recognition task consisting of morphed, ambiguous facial emotions. By means of Point of Subjective Equivalence analyses, our results indicate that the experience of short-term exclusion hinders recognition of happy facial expressions, whereas imagining long-term exclusion causes difficulties in recognising sad facial expressions. These findings extend the current literature, suggesting that not all social exclusion paradigms affect socio-emotional processing similarly.
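For readers unfamiliar with the analysis named in this abstract, the sketch below illustrates, under assumed data, how a Point of Subjective Equivalence can be estimated from morphed-expression categorization responses: a logistic psychometric function is fitted to the proportion of "happy" responses across morph levels, and the PSE is the morph level at which the two interpretations are equally likely. The morph levels, response proportions, and the `logistic` helper are hypothetical placeholders, not the authors' analysis code.

```python
# Minimal sketch (not the authors' code) of a Point of Subjective Equivalence
# (PSE) estimate from morphed-expression categorization data. All numbers and
# names (logistic, morph_level, p_happy) are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Psychometric function: probability of a 'happy' response at morph level x."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Hypothetical morph continuum: 0 = fully sad, 100 = fully happy.
morph_level = np.array([0, 20, 40, 50, 60, 80, 100], dtype=float)
p_happy = np.array([0.02, 0.10, 0.35, 0.55, 0.78, 0.95, 0.99])

# Fit the curve; the PSE is the morph level where p(happy) = 0.5.
(pse, slope), _ = curve_fit(logistic, morph_level, p_happy, p0=[50.0, 0.1])
print(f"Estimated PSE: {pse:.1f}% happy (slope {slope:.3f})")
```

In such an analysis, a PSE shifted toward the happy end of the continuum would mean that more happiness is needed before a face is judged happy, which is one way a hindered recognition of happy expressions could manifest.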
... This observation is in line with previous findings of high sensitivity in detecting and processing negatively valenced affective information. Similar to the anger superiority effect (Anderson, 2005; Lyyra et al., 2014), we tend to be more sensitive in detecting sad and fearful faces than neutral or happy faces (Lipp et al., 2009; Yang et al., 2007). Compared with happiness, these negative emotional cues, especially fear, are processed more quickly in the early visual pathway of the brain (Li et al., 2019). ...
Article
Full-text available
In contrast to prototypical facial expressions, we show less perceptual tolerance in perceiving vague expressions by demonstrating an interpretation bias, such as more frequent perception of anger or happiness when categorizing ambiguous expressions of angry and happy faces that are morphed in different proportions and displayed under high- or low-quality conditions. However, it remains unclear whether this interpretation bias is specific to emotion categories or reflects a general negativity versus positivity bias, and whether the degree of this bias is affected by the valence or category of the two morphed expressions. These questions were examined in two eye-tracking experiments by systematically manipulating expression ambiguity and image quality in fear- and sad-happiness faces (Experiment 1) and by directly comparing anger-, fear-, sadness-, and disgust-happiness expressions (Experiment 2). We found that increasing expression ambiguity and degrading image quality induced a general negativity versus positivity bias in expression categorization. The degree of negativity bias, the associated reaction times, and face-viewing gaze allocation were further modulated by the different expression combinations. Thus, although we show a viewing condition-dependent bias in interpreting vague facial expressions that display valence-contradicting expressive cues, the perception of these ambiguous expressions appears to be guided by a categorical process similar to that involved in perceiving prototypical expressions.
... Hence, our results indicate that the facial expression detection driven by attentional capture in our emotion comparison task relies on part-based rather than holistic face processing. This is consistent with previous evidence that face inversion overall slows, but does not alter, expression recognition or effects related to it, such as spatial attention (Williams et al., 2005; Lipp et al., 2009). According to the literature, the lack of a face inversion effect on emotion recognition stems from the detection of low-level facial features related to emotions. ...
Article
Full-text available
Recent findings on emotion comparison show a typical pattern of motor reactivity arising from attentional capture. When pairs of emotional faces are presented simultaneously, the more intense emotional face is recognized faster (the Emotional Semantic Congruency, ESC, effect). Furthermore, a global response speed advantage for emotional pairs with positive rather than negative average emotion intensity is observed (i.e., the emotional size effect), with the choice of the happiest face yielding a faster response than the choice of the angriest face within the pair (i.e., the happiness advantage). In two experiments, we asked whether these effects are orientation dependent, and thus linked to whether face processing is holistic or part-based. Participants were asked to choose the angriest/happiest face in emotional pairs displayed either in upright or inverted orientation and including (Experiment 1) or not including (Experiment 2) a neutral face. Beyond an overall facilitation for upright relative to inverted pairs, results showed orientation-independent ESC and emotional size effects. Furthermore, the happiness advantage was present in the emotional pairs of Experiment 2 but not in those of Experiment 1, independently of face orientation. Together, the results suggest that attentional capture in emotion comparison does not depend on the type of face processing, being orientation invariant.
... However, on non-target trials, search through clean-shaven backgrounds was also faster than through bearded backgrounds. Given the design of Study 1, we cannot rule out whether faster target detection was due to faster attention to the bearded targets, faster search through the clean-shaven backgrounds, or a combination of both [63, 64]. ...
Article
Full-text available
Human visual systems have evolved to extract ecologically relevant information from complex scenery. In some cases, the face-in-the-crowd visual search task demonstrates an anger superiority effect, where anger is allocated preferential attention. Across three studies (N = 419), we tested whether facial hair guides attention in visual search and influences the speed of detecting angry and happy facial expressions in large arrays of faces. In Study 1, participants were faster to search through clean-shaven crowds and detect bearded targets than to search through bearded crowds and detect clean-shaven targets. In Study 2, targets were angry and happy faces presented in neutral backgrounds. Facial hair of the target faces was also manipulated. An anger superiority effect emerged that was augmented by the presence of facial hair, which was due to the slower detection of happiness on bearded faces. In Study 3, targets were happy and angry faces presented in either bearded or clean-shaven backgrounds. Facial hair of the background faces was also systematically manipulated. A significant anger superiority effect was revealed, although this was not moderated by the target's facial hair. Rather, the anger superiority effect was larger in clean-shaven than bearded face backgrounds. Together, the results suggest that facial hair does influence the detection of emotional expressions in visual search; however, rather than facilitating an anger superiority effect as part of a potential threat-detection system, facial hair may reduce the detection of happy faces within the face-in-the-crowd paradigm.
... Few studies have investigated the process of recognizing facial expressions in cartoon faces, and the results have been inconsistent. Some results support a happiness advantage similar to that for real faces (Kirita and Endo, 1995; Leppänen and Hietanen, 2004), whereas others reveal an anger or threat advantage (Lundqvist and Öhman, 2005; Calvo et al., 2006). Lipp et al. (2009) conducted a face inversion (upside-down face) task with both schematic and real faces and found that response times did not differ significantly between upright and inverted faces, suggesting a similar feature-based process for non-real and real faces. In contrast, Rosset et al. (2008) found that a face inversion effect with cartoon faces existed for both typically developing children and children with autism, e.g., inverting cartoon faces decreased the ability of children with autism to identify their expressions, suggesting that cartoon faces are processed holistically. ...
Article
Full-text available
Cartoon faces are widely used in social media, animation production, and social robots because of their attractive ability to convey different emotional information. Despite their popular applications, the mechanisms of recognizing emotional expressions in cartoon faces are still unclear. Therefore, three experiments were conducted in this study to systematically explore a recognition process for emotional cartoon expressions (happy, sad, and neutral) and to examine the influence of key facial features (mouth, eyes, and eyebrows) on emotion recognition. Across the experiments, three presentation conditions were employed: (1) a full face; (2) individual feature only (with two other features concealed); and (3) one feature concealed with two other features presented. The cartoon face images used in this study were converted from a set of real faces acted by Chinese posers, and the observers were Chinese. The results show that happy cartoon expressions were recognized more accurately than neutral and sad expressions, which was consistent with the happiness recognition advantage revealed in real face studies. Compared with real facial expressions, sad cartoon expressions were perceived as sadder, and happy cartoon expressions were perceived as less happy, regardless of whether full-face or single facial features were viewed. For cartoon faces, the mouth was demonstrated to be a feature that is sufficient and necessary for the recognition of happiness, and the eyebrows were sufficient and necessary for the recognition of sadness. This study helps to clarify the perception mechanism underlying emotion recognition in cartoon faces and sheds some light on directions for future research on intelligent human-computer interactions.
... A widely-held view in the literature is that emotional faces capture attention in a bottom-up manner as they are more salient than neutral faces. Traditionally, it has been posited that threatening expressions hold special significance for attentional selection (Hansen & Hansen, 1988; Lipp, Price, & Tellegen, 2009a, 2009b). For instance, in Hansen and Hansen's (1988) classic face-in-the-crowd experiment, participants' reaction times (RTs) to detect an angry face did not increase with increasing set size, indicating that angry faces "popped out" of the array (however, see Purcell, Stewart, & Skov, 1996, for an alternative account of this result). ...
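As a concrete illustration of the set-size logic mentioned in this excerpt, the sketch below computes search slopes by regressing reaction time on set size; a slope near 0 ms/item is the conventional signature of pop-out, whereas a steeper slope indicates less efficient, capacity-limited search. The reaction times are invented for illustration and are not data from the cited studies.

```python
# Illustrative search-slope analysis (made-up RTs, not data from the cited work).
import numpy as np

set_sizes = np.array([4, 8, 12, 16], dtype=float)   # faces per search array
rt_angry = np.array([520.0, 525.0, 531.0, 528.0])   # mean correct RTs in ms
rt_happy = np.array([540.0, 610.0, 672.0, 745.0])

for label, rts in (("angry target", rt_angry), ("happy target", rt_happy)):
    slope, intercept = np.polyfit(set_sizes, rts, 1)   # linear fit: RT = slope*N + intercept
    print(f"{label}: {slope:.1f} ms/item (intercept {intercept:.0f} ms)")
```

In this toy example, the nearly flat angry-target function (about 0.8 ms/item) would be read as pop-out, while the happy-target function (about 17 ms/item) would not.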
Article
Research indicates that humans orient attention toward facial expressions of emotion. Orienting to facial expressions has typically been conceptualised as due to bottom-up attentional capture. However, this overlooks the contributions of top-down attention and selection history. In the present study, across four experiments, these three attentional processes were differentiated using a variation of the dot-probe task, in which participants were cued to attend to a happy or angry face on each trial. Results show that attention toward facial expressions was not exclusively driven by bottom-up attentional capture; instead, participants could shift their attention toward both happy and angry faces in a top-down manner. This effect was not found when the faces were inverted, indicating that top-down attention relies on holistic processing of the face. In addition, no evidence of selection history was found (i.e., no improvement on repeated trials or blocks of trials in which the task was to orient to the same expression). Altogether, these results suggest that humans can use top-down attentional control to rapidly orient attention to emotional faces.
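As context for the dot-probe results summarized above, the following sketch shows how an attention-bias index is conventionally computed in dot-probe designs; it is not necessarily the exact measure used in this study, and the reaction times are invented placeholders.

```python
# Conventional dot-probe bias index (illustrative numbers, not study data).
# Congruent trials: the probe replaces the emotional face; incongruent trials:
# the probe replaces the other (e.g., neutral) face. A positive bias means
# faster responses when the probe appears where the emotional face was,
# i.e., attention had been oriented toward that face.
import numpy as np

rt_congruent = np.array([410.0, 395.0, 422.0, 405.0])    # ms
rt_incongruent = np.array([438.0, 450.0, 441.0, 455.0])  # ms

bias_ms = rt_incongruent.mean() - rt_congruent.mean()
print(f"Attention bias toward the emotional face: {bias_ms:.0f} ms")
```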
... The second result is even more interesting considering this hypothesis: stimuli conveying sadness do not show an Exclusion × Task interaction effect. While all the other emotions we considered have strong social meanings and are perceived either as a reward (happiness) or a possible threat (anger and fear), the processing of sadness does not bring an evolutionary advantage or have any survival value (i.e., avoiding dangers or obtaining rewards) and is thus less socially relevant [63]. Given the social value of the other emotions, they show the "social cognition impairment", whereas sadness does not convey socially relevant information and thus does not show any specific advantage or disadvantage for excluded participants. ...
Article
Full-text available
Social exclusion is a painful experience that is felt as a threat to the human need to belong, can lead to increased aggressive and anti-social behaviours, and results in emotional and cognitive numbness. Excluded individuals also seem to show an automatic tuning to positivity: they tend to increase their selective attention towards social acceptance signals. Despite these known effects, the consequences of social exclusion for social information processing still need to be explored in depth. The aim of this study was to investigate the effects of social exclusion on the processing of two features that are closely bound in the appraisal of the meaning of facial expressions: gaze direction and emotional expression. In two experiments (N = 60, N = 45), participants were asked to identify gaze direction or emotional expressions from facial stimuli in which both of these features were manipulated. They performed these tasks in a four-block crossed design after being socially included or excluded using the Cyberball game. Participants' empathy and self-reported emotions were recorded using the Empathy Quotient (EQ) and PANAS questionnaires. The Need Threat Scale and three additional questions were also used as manipulation checks in the second experiment. In both experiments, excluded participants were less accurate than included participants in gaze direction discrimination. Modulatory effects of direct gaze (Experiment 1) and sad expression (Experiment 2) on the effects of social exclusion were found on response times (RTs) in the emotion recognition task. Specific differences in the reaction to social exclusion between males and females were also found in Experiment 2: excluded male participants tended to be less accurate and faster than included male participants, while excluded females showed a more accurate and slower performance than included female participants. No influence of social exclusion on PANAS or EQ scores was found. Results are discussed in the context of the importance of identifying gaze direction in appraisal theories.
... Walking gait is a high-dimensional biological movement that can potentially express emotional states through many different partial movement patterns. Emotional states can also be identified by perceivers from a range of different biological movements, such as facial expressions (Adolphs, 2006; Ekman & Oster, 1979; Krems, Neuberg, Filip-Crawford, & Kenrick, 2015; Lipp, Price, & Tellegen, 2009), dancing (Dittrich, Troscianko, Lea, & Morgan, 1996; Walk & Homan, 1984), door knocking (Pollick, Paterson, Bruderlin, & Sanford, 2001), drinking (Pollick et al., 2001), and dyadic non-verbal communication (Zibrek, Hoyet, Ruhland, & Mcdonnell, 2015). It could reasonably be argued that all human behaviour is partially influenced by the actor's felt and presumably perceived emotional state. ...
Article
Perceiving emotions from gait can serve numerous socio-environmental functions (e.g. perceiving threat, sexual courting behaviours). Participant perceivers were asked to report their strategies for identifying happiness, sadness, anger and fear in point-light walkers. Perceivers claimed they identified happiness by a bouncing gait with increased arm movement, sadness by a slow slouching gait, anger by a fast stomping gait and fear by both fast and slow gaits. The emotion-specific point-light walker stimuli were kinematically analysed to verify the presence of the gait cues perceivers reported using to identify each emotion. Happy and angry walkers both displayed long strides with increased arm movement though angry strides had a faster cadence. Fearful walkers walked with fast short strides reminiscent of a scurrying gait. Sad walkers walked with slow short strides consequently creating the slowest walking pace. However, fearful and sad walkers showed less arm movement in their gait in different ways. Sad walkers moved their entire arms whilst fearful walkers primarily moved their lower arms throughout their gait.
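To make the kinematic analysis referred to above concrete, the sketch below computes two of the gait summaries mentioned (stride length and cadence) from heel-strike events of a point-light walker. The event times and positions are invented placeholders, not values from the study.

```python
# Hypothetical gait-kinematics summary: stride length and cadence from
# heel-strike events of one foot (illustrative numbers only).
import numpy as np

strike_times = np.array([0.00, 1.10, 2.18, 3.30])   # heel strikes of the same foot (s)
strike_pos_x = np.array([0.00, 1.32, 2.61, 3.95])   # horizontal heel position (m)

stride_durations = np.diff(strike_times)   # one stride spans two steps
stride_lengths = np.diff(strike_pos_x)

cadence_steps_per_min = 2 * 60.0 / stride_durations.mean()
print(f"Mean stride length: {stride_lengths.mean():.2f} m")
print(f"Cadence: {cadence_steps_per_min:.0f} steps/min")
```

With these toy values, the walker takes strides of roughly 1.3 m at about 109 steps per minute; faster cadence and longer strides would be the kind of signature the abstract associates with angry gait.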
... Lipp, Price and Tellegen (2009) looked at the detection of angry, sad and happy faces and found a bias for the detection of angry faces [12]. Participants took part in three smaller experiments which used both schematic faces and real-life photographs. ...
... Although the real-life photographs still provided evidence for faster detection of negative emotions, the effect was not as large as that obtained with the schematic faces. It remains an open question whether an eye-tracking study using the same stimuli as Lipp, Price and Tellegen (2009) would produce similar results [12]. ...
Article
Full-text available
The Anger Superiority Effect refers to an individual’s tendency to avoid an angry face after locating it in a crowd situation. Previous literature has used different methodologies, consisting of cartoon-like facial images and real-life photographs, to look at this effect. The aim of the current study was also to examine the Anger Superiority Effect, but in a different way from past research. The researchers aimed to use eye tracking to provide a new line of evidence on Anger Superiority. 20 participants (7 male and 13 female) from the area of Newcastle-upon-Tyne were asked to complete a simple emotive face memory task whilst having their eye movements tracked. The researchers wanted to find out whether participants avoided angry faces after locating them, and also whether any areas of the angry faces themselves were of particular interest. Results demonstrated that participants did not avoid the angry face after focusing on it, contradicting the previous literature. It was also demonstrated that angry faces were fixated less quickly than other emotions, again not supporting past literature. All results were discussed in relation to past Anger Superiority studies, and improvements for future research were suggested.
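To illustrate one of the eye-tracking measures reported above, the sketch below computes a simple time-to-first-fixation proxy for an area of interest (AOI): the latency of the first gaze sample that falls inside the AOI. This is a simplified stand-in for a proper fixation-detection algorithm, and the AOI bounds, sample rate, and gaze trace are invented.

```python
# Simplified time-to-first-fixation proxy for an angry-face AOI
# (toy gaze trace; a real analysis would first detect fixations).
import numpy as np

SAMPLE_RATE_HZ = 250
aoi = {"x_min": 300, "x_max": 500, "y_min": 200, "y_max": 400}   # pixels

# Hypothetical gaze samples (x, y) from stimulus onset.
gaze = np.array([[120, 350], [180, 360], [260, 330], [340, 310], [360, 305]])

inside = ((gaze[:, 0] >= aoi["x_min"]) & (gaze[:, 0] <= aoi["x_max"]) &
          (gaze[:, 1] >= aoi["y_min"]) & (gaze[:, 1] <= aoi["y_max"]))

if inside.any():
    latency_ms = 1000.0 * np.argmax(inside) / SAMPLE_RATE_HZ
    print(f"First gaze sample inside the AOI at {latency_ms:.0f} ms")
else:
    print("AOI was never looked at on this trial")
```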