Article

Gender differences in the activation of inferior frontal cortex during emotional speech perception


Abstract

We investigated the brain regions that mediate the processing of emotional speech in men and women by presenting positive and negative words that were spoken with happy or angry prosody. Hence, emotional prosody and word valence were either congruous or incongruous. We assumed that an fMRI contrast between congruous and incongruous presentations would reveal the structures that mediate the interaction of emotional prosody and word valence. The left inferior frontal gyrus (IFG) was more strongly activated in incongruous as compared to congruous trials. This difference in IFG activity was significantly larger in women than in men. Moreover, the congruence effect was significant in women whereas it only appeared as a tendency in men. As the left IFG has been repeatedly implicated in semantic processing, these findings are taken as evidence that semantic processing in women is more susceptible to influences from emotional prosody than is semantic processing in men. Moreover, the present data suggest that the left IFG mediates increased semantic processing demands imposed by an incongruence between emotional prosody and word valence.


... Past research on the processing of emotional prosody has contributed to this change in how we think about the relationship between feelings and thoughts. For example, there is evidence that listeners recruit brain structures implicated in verbal processing and/or working memory when judging the emotional content of prosody [15,27]. However, even if not intentionally encoding emotional prosody, listeners easily pick up on the emotions of their interaction partners, and these emotions guide verbal processing [25,24]. ...
... For example, the measurement of event-related potentials (ERPs) through scalp electrodes placed on the listener's head shows a larger negativity to positive words (e.g., success) spoken with an angry as compared to a happy voice [24]. Furthermore, functional magnetic resonance imaging (fMRI) revealed increased activity in the bilateral inferior frontal gyrus to words spoken with incongruous as compared to congruous emotional prosody [27]. ...
... While the work on vocal emotional processing clearly demonstrates a link between emotion and cognition, it also suggests that this link may vary between individuals. Specifically, there is evidence suggesting that language processing is more susceptible to influences from speaker emotion in female as compared to male listeners [25,24,27]. Furthermore, female listeners are more likely than male listeners to show a larger mismatch negativity to emotional as compared to neutral prosody if these stimuli are presented outside of attentional focus. ...
... There is evidence that women use prosodic cues more automatically than men, while men rely more on the emotional content of the lexicon [26,14,9]. Moreover, prosodic cues and content might influence semantic processing, with additionally enhanced activity in the left inferior frontal gyrus for women [27]. ...
... There is a wealth of findings on differences between men and women in brain activation during emotional processing. During emotional speech processing, the activation of brain areas engaged in semantic processing is affected by sex, whereas brain areas involved in emotional prosodic processing are not [27]. It has been reported that although common neural resources are recruited for emotional processing, women engage emotion automatically in paralinguistic decoding, whereas men do so in a task-dependent manner [27]. ...
... The background literature in the context of emotion perception in speech can be divided into two groups. One group of studies focuses on how the acoustic characteristics of voice affect the perception of emotion, examples of which include [46,25,28,49,6,39]. In these studies, the characteristics of the voice (e.g. ...
... In terms of the studies on differences in speech emotion perception between genders, most of the studies [46,25,28,49,6,39] focus on the acoustic properties and content of the vocal stimuli. [39] analyses the lateralization of word processing and emotional prosody processing between the two genders when provided with speech with emotional content. ...
Preprint
Full-text available
Do men and women perceive emotions differently? Popular convictions place women as more emotionally perceptive than men. Empirical findings, however, remain inconclusive. Most prior studies focus on visual modalities. In addition, almost all of the studies are limited to experiments within controlled environments. The generalizability and scalability of these studies have not been sufficiently established. In this paper, we study the differences in perception of emotion between genders from speech data in the wild, annotated through crowdsourcing. While we limit ourselves to a single modality (i.e. speech), our framework is applicable to studies of emotion perception from all such loosely annotated data in general. Our paper addresses multiple serious challenges related to making statistically viable conclusions from crowdsourced data. Overall, the contributions of this paper are twofold: a reliable novel framework for perceptual studies from crowdsourced data; and the demonstration of statistically significant differences in speech-based emotion perception between genders.
... The background literature in the context of emotion perception in speech can be divided into two groups. One group of studies focuses on how the acoustic characteristics of voice affect the perception of emotion, examples of which include [13]- [18]. In these studies, the characteristics of the voice (e.g. ...
... In terms of the studies on differences in speech emotion perception between genders, most of the studies [13]- [18] focus on the acoustic properties and content of the vocal stimuli. [18] analyses the lateralization of word processing and emotional prosody processing between the two genders when provided with speech with emotional content. ...
Preprint
Full-text available
Do men and women perceive emotions differently? Many evolutionary scientists and psychologists have tried to answer this question for years. Popular convictions place women as more emotionally perceptive than men. Empirical findings, however, remain inconclusive. Most prior studies focus on emotion perception among genders in the context of visual modalities. In addition, almost all of the studies are limited to experiments carried out within controlled environments. The generalizability and scalability of these studies have not been sufficiently established. In this paper, we study the differences in the perception of emotion between genders from speech, using data in the wild, annotated through crowdsourcing. While we limit ourselves to a single modality (i.e. speech), our framework is applicable to studies of emotion perception from all such loosely annotated data in general. Our paper addresses multiple serious challenges related to making statistically viable conclusions from crowdsourced data. Overall, the contributions of this paper are twofold: a reliable novel framework for perceptual studies from data in the wild; and the demonstration of statistically significant differences in speech-based emotion perception between genders.
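The gender comparison described above ultimately rests on a between-group significance test over crowdsourced emotion ratings. As a minimal sketch of that kind of analysis (the preprint's actual statistical framework is not detailed in this excerpt, and the ratings below are simulated, not the study's data), a nonparametric Mann-Whitney U test can compare ordinal ratings between annotator groups:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
# Hypothetical crowdsourced valence ratings (1-5 Likert scale) for one
# speech clip, split by annotator gender; real data would come from a
# crowdsourcing platform and require the reliability filtering the
# authors emphasize.
ratings_female = rng.integers(1, 6, size=200)
ratings_male = rng.integers(1, 6, size=200)

# Rank-based test: appropriate for ordinal ratings, no normality assumption.
stat, p_value = mannwhitneyu(ratings_female, ratings_male,
                             alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")
```

Because Likert-style ratings are ordinal, a rank-based test avoids assuming normality; with these uniformly simulated groups, no significant difference is expected.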
... Multiple studies have investigated the brain regions and networks involved in processing conflict between emotional word or sentence meaning and prosody (Mitchell, 2006; Schirmer et al., 2004; Wittfoth et al., 2010). Interaction between emotional word meaning and prosody was associated with activations in bilateral inferior frontal gyrus (Schirmer et al., 2004). ...
... Multiple studies have investigated the brain regions and networks involved in processing conflict between emotional word or sentence meaning and prosody (Mitchell, 2006; Schirmer et al., 2004; Wittfoth et al., 2010). Interaction between emotional word meaning and prosody was associated with activations in bilateral inferior frontal gyrus (Schirmer et al., 2004). They also found that women were more sensitive to conflict than men. ...
... The happy prosody interfered more with the angry words in our study. This is consistent with the accuracy results obtained from an earlier study (Schirmer et al., 2004). Others have found a significant effect of word valence with better accuracy for negative words. ...
Article
Prosody processing is an important aspect of language comprehension. Previous research on emotional word-prosody conflict has shown that participants perform worse when emotional prosody and word meaning are incongruent. Studies with event-related potentials have shown a congruency effect in the N400 component. There has been no study on emotional processing in the Hindi language in the context of conflict between emotional word meaning and prosody. We used happy and angry words spoken with happy and angry prosody. Participants had to identify whether the word had a happy or angry meaning. The results showed a congruency effect, with worse performance in incongruent trials, indicating an emotional Stroop effect in Hindi. The ERP results showed that prosody information is detected very early, as seen in the N1 component. In addition, there was a congruency effect in the N400. The results show that prosody is processed very early and that an emotional meaning-prosody congruency effect is obtained in Hindi. Further studies would be needed to investigate similarities and differences in cognitive control associated with language processing.
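The behavioral congruency effect reported in such studies is the performance difference between incongruent and congruent trials. A hypothetical illustration with simulated reaction times (the numbers are invented for illustration, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated reaction times in ms for 80 trials per condition; incongruent
# trials are generated slower, mirroring the emotional Stroop pattern
# described in the abstract.
rt_congruent = rng.normal(620, 60, size=80)
rt_incongruent = rng.normal(670, 60, size=80)

# The congruency (Stroop) effect is the mean RT cost of incongruence.
stroop_effect = rt_incongruent.mean() - rt_congruent.mean()
print(f"Congruency effect: {stroop_effect:.1f} ms")
```

In a real analysis this difference would be tested per participant (e.g., with a paired test across subjects) rather than across pooled trials.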
... Another area, the IFG, has been reported to activate when the emotional and linguistic information of speech are not congruent (i.e., the speaker says "you are nice" with unpleasant prosody) [16,17]. These findings indicate that the IFG may be associated with the integration of two competing sources of information [16]. ...
... Regarding prosody perception, some studies found that women were more accurate than men in the recognition of vocally expressed emotions (Mill et al., 2009; Schirmer et al., 2002, 2005; Sen et al., 2018), a meta-analysis of 75 studies (Hall, 1978) reported an extremely weak female advantage, and other studies failed to find any gender effects at all (Fecteau et al., 2005; Orbelo et al., 2003; Paulmann et al., 2008; Raithel & Hielscher-Fastabend, 2004). Lastly, two fMRI studies also found no gender differences in overall performance, despite different activation patterns based on gender (Schirmer et al., 2004; Wildgruber et al., 2002). To the best of our knowledge, there are no data comparing women's and men's ability to resolve sentence ambiguity via the use of prosody. ...
... Lastly, (e) although females are expected to perform better on both affective and linguistic prosody tasks compared to male participants, this difference is not expected to reach significance since most previous studies found only a slight or no female advantage (e.g. Paulmann et al., 2008; Raithel & Hielscher-Fastabend, 2004; Schirmer et al., 2004; Wildgruber et al., 2002). ...
Article
Full-text available
Investigations of affective prosodic processing have demonstrated a decline with aging. It is unclear, however, whether this decline affects all or only specific emotions. Also, little is known about the ability to resolve syntactic ambiguity with the use of prosody in aging. Twenty older (age range = 70–75) and 20 younger adults (age range = 20–25) performed an affective (happiness, neutrality, sadness, surprise, fear, and anger) and a linguistic (subject/object ambiguities) prosody task. Relative to young participants, older participants had difficulty decoding affective prosody, particularly negative emotions, and syntactic prosody, in particular the subject reading condition. A marginally positive correlation was found between the affective and syntactic prosody tasks in the group of older individuals, but no gender differences in either prosodic task. The findings of the affective prosody task are discussed under the prism of the Socioemotional Selectivity Theory, whereas general parsing strategies can account for the preference for the object reading condition.
... To fill this research gap, the present study employed functional Near-infrared Spectroscopy (fNIRS) to measure the brain signals of co-parents when they were exposed to salient infant and adult vocalisations either together (in the same room at the same time) or separately (in different rooms at different times). We decided to focus on the prefrontal cortical (PFC) region of the brain due to its integral role in both attentional regulation and social cognition, making it a likely region to be implicated when partners jointly attend to salient emotive vocalisations [23][24][25][26] . First, we aimed to investigate whether synchrony was significantly higher when couples were in the physical presence of one another compared to when they listened to vocalisations separately. ...
... In this line of reasoning, synchronous activation of the MFG indicates coupled attentional regulation between co-parenting spouses. Moreover, we observed synchronous activation in the left MFG and IFG, areas in the brain that preferentially regulate attention during processing of affective emotional information 25,26 . Besides the MFG and IFG, synchrony also emerged in the bilateral aPFC, which constitutes a component of the frontoparietal control network 24,35,36 . ...
Article
Full-text available
Co-parenting spouses who live together remain in close physical proximity to each other and regularly engage in reciprocal social interactions in joint endeavors to coordinate their caregiving. Although bi-parental rearing is a common occurrence in humans, the influence of the physical presence of a co-parenting spouse on parental brain responses remains largely unknown. Synchrony is conceptualized as the matching of behavioral and physiological signals between two individuals. In this study, we examined how the presence of a co-parenting spouse influences brain-to-brain synchrony when attending to salient infant and adult vocalizations. We hypothesized that brain-to-brain synchrony would be greater in the presence of a spousal partner. Functional Near-infrared Spectroscopy (fNIRS) was used on 24 mother-father dyads (N = 48) to measure prefrontal cortical (PFC) activities while they listened to infant and adult vocalizations in two conditions, together (in the same room at the same time) and separately (in different rooms at different times). Couples showed greater synchrony in the together condition; when comparing fNIRS data between true couples and randomly matched controls, this synchronous effect was only seen in true couples, indicating a unique effect of spousal co-regulation toward salient stimuli. Our results indicate that the physical presence of the spouse might establish synchrony in attentional regulation mechanisms toward socially relevant stimuli. This finding holds implications for the role of the co-parenting spouse in influencing social and parental brain mechanisms.
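Brain-to-brain synchrony metrics of the kind used in this study quantify how similarly two people's signals evolve over time, and are compared against randomly matched control pairings. fNIRS hyperscanning pipelines often use measures such as wavelet transform coherence; as a simplified stand-in on simulated data (all signals below are synthetic), a Pearson correlation between two channels' time series captures the idea:

```python
import numpy as np

def channel_synchrony(sig_a, sig_b):
    """Pearson correlation between two channel time series: a simple
    stand-in for the synchrony measures used in hyperscanning studies."""
    return float(np.corrcoef(sig_a, sig_b)[0, 1])

rng = np.random.default_rng(1)
shared = rng.standard_normal(1000)           # common stimulus-driven component
mother = shared + 0.5 * rng.standard_normal(1000)
father = shared + 0.5 * rng.standard_normal(1000)
stranger = rng.standard_normal(1000)         # randomly matched control signal

print(channel_synchrony(mother, father))     # true dyad: substantial correlation
print(channel_synchrony(mother, stranger))   # control pairing: near zero
```

The true-dyad versus shuffled-pair contrast mirrors the comparison between real couples and randomly matched controls described in the abstract.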
... In such cases, the emotional interference is based on incompatible response tendencies between emotional expressions and word labels (Krug and Carter 2012; Etkin et al. 2006; Bang et al. 2016; Stenberg et al. 1998; Haas et al. 2006). Similarly, another paradigm requires participants to indicate the valence of emotional prosody in the presence of congruent or incongruent semantic cues (Mitchell 2006a, 2006b, 2013; Rota et al. 2008; Schirmer et al. 2004; Wittfoth et al. 2010). Furthermore, in the emotion-word Flanker task, participants are required to indicate the valence (positive or negative) of a central target word while ignoring flanking emotional words mapped onto either the congruent or incongruent valence of the target (Ochsner et al. 2009; Samanez-Larkin et al. 2009). ...
... By overlapping these clusters with the seven functional networks, it was revealed that the identified areas were primarily distributed in the FPN (relative: 35.04%; absolute: …). [A table is interleaved in this excerpt listing the contributing experiments (Krug and Carter, 2012; Chechko et al., 2009, 2012, 2013, 2014; Schirmer et al., 2004; Bang et al., 2016; Ovaysikia et al., 2011; Godinez et al., 2016; Comte et al., 2014; Ochsner et al., 2009; Fleury et al., 2014; Rey et al., 2014), each with its sample size, the incongruent > congruent contrast, stimulus type, and contribution percentage.] ...
Article
Full-text available
The inability to control or inhibit emotional distractors characterizes a range of psychiatric disorders. Despite the use of a variety of task paradigms to determine the mechanisms underlying the control of emotional interference, a precise characterization of the brain regions and networks that support emotional interference processing remains elusive. Here, we performed coordinate-based and functional connectivity meta-analyses to determine the brain networks underlying emotional interference. Paradigms addressing interference processing in the cognitive or emotional domain were included in the meta-analyses, particularly the Stroop, Flanker, and Simon tasks. Our results revealed a consistent involvement of the bilateral dorsal anterior cingulate cortex, anterior insula, left inferior frontal gyrus, and superior parietal lobule during emotional interference. Follow-up conjunction analyses identified correspondence in these regions between emotional and cognitive interference processing. Finally, the patterns of functional connectivity of these regions were examined using resting-state functional connectivity and meta-analytic connectivity modeling. These regions were strongly connected as a distributed system, primarily mapping onto fronto-parietal control, ventral attention, and dorsal attention networks. Together, the present findings indicate that a domain-general neural system is engaged across multiple types of interference processing and that regulating emotional and cognitive interference depends on interactions between large-scale distributed brain networks.
... Why did women do better than men in identifying the intended affect? According to the studies by Schirmer & Kotz (2002), Schirmer et al. (2004), and Imaizumi et al. (2004a, 2004b, 2005), females showed an interaction of affective prosody and word meaning, whereas males showed independent effects of word meaning and affective prosody; we might therefore surmise that men process both types of affective information independently. As a result, men may have been more sensitive to the anomaly between the neutral word meaning of "banana" and the various affective meanings of "banana", and consequently did less well in identifying the intended affective meaning. ...
... They tested the influence of emotionally loaded speech on brain activity by manipulating the prosody of positive and negative words. The results revealed a difference between the two genders in the perception and expression of emotions, explained by the fact that semantic processing in women is more sensitive to emotional influences than it is in men (Schirmer et al., 2004). ...
Article
Full-text available
Introduction: Emotions, whether positive or negative emotions, are considered fundamental psychological states in humans. They have a central place in the modulation of cognitive activities from percept to decision. The emotional valence of words influences their understanding and their use in communication. Objective: First, this work aims to evaluate the effect of language on emotional appraisal through the variation of the emotional valence (EV) and the arousal evoked by a corpus of 123 Arabic words and 123 equivalent French words. Second, we will study the impact of age and gender on the evaluation of these two elements of the emotional experience. Method: A sample of 414 Tunisian children of school age assessed the emotional characteristics of the lexical corpus translated from French to Arabic, according to standard instructions. Results: The predicted effect of language on the emotional appraisal is confirmed, with variations between different ages and according to gender in the evaluation of EV and arousal. Thus, this study provides a lexical norm of 123 Arabic words (EmoLex123) evaluated on EV and arousal by school-aged children. Our research builds on existing studies on the influence of age and gender on the appraisal of emotional characteristics of a lexical corpus. It enriches the limited work on the interactions between language and emotion. Keywords: Activation value, Arousal, Age, Cognition, Emotion, Gender, Emotional Valence.
... Similarly, ref. [27] found gender differences in brain activation during the processing of negatively-valenced words compared to non-words, with higher activation in the perirhinal cortex and hippocampus in females but higher activation in the right supramarginal gyrus in males. Gender differences in emotional speech processing were also found: significantly higher activation in the left IFG was observed when emotional prosody and word valence were incongruous, suggesting gender differences in semantic processing [28]. With regards to emotion regulation, results from studies on explicit emotion regulation have not been consistent (see [29]). ...
Article
Full-text available
Sexism is a widespread form of gender discrimination which includes remarks based on gender stereotypes. However, little is known about the neural basis underlying the experience of sexist-related comments and how perceptions of sexism are related to these neural processes. The present study investigated whether perceptions of sexism influence neural processing of receiving sexist-related comments. Participants (N = 67) read experimental vignettes describing scenarios of comments involving gender stereotypes while near-infrared spectroscopy recordings were made to measure the hemodynamic changes in the prefrontal cortex. Results found a significant correlation between participants’ perceptions of sexism and brain activation in a brain cluster including the right dorsolateral prefrontal cortex and inferior frontal gyrus. There was a significant gender difference where female participants showed a stronger negative correlation compared to male participants. Future research can expand on these initial findings by looking at subcortical structures involved in emotional processing and gender stereotype application as well as examining cultural differences in perceptions of gender stereotypes and sexism.
... higher-order cognitive processes 18 , such as attentional regulation 19 , emotional perception and modulation [20][21][22] and social cognition. As these processes are integral to effective parenting (in terms of attending to and socialising the child to the external environment), the PFC thus represents a key area for investigation in parent-child interactions. ...
Article
Full-text available
The term “hyperscanning” refers to the simultaneous recording of multiple individuals’ brain activity. As a methodology, hyperscanning allows the investigation of brain-to-brain synchrony. Despite being a promising technique, there is a limited number of publicly available functional Near-infrared Spectroscopy (fNIRS) hyperscanning recordings. In this paper, we report a dataset of fNIRS recordings from the prefrontal cortical (PFC) activity of 33 mother-child dyads and 29 father-child dyads. Data was recorded while the parent-child dyads participated in an experiment with two sessions: a passive video attention task and a free play session. Dyadic metadata, parental psychological traits, behavioural annotations of the play sessions and information about the video stimuli complementing the dataset of fNIRS signals are described. The dataset presented here can be used to design, implement, and test novel fNIRS analysis techniques, new hyperscanning analysis tools, as well as investigate the PFC activity in participants of different ages when they engage in passive viewing tasks and active interactive tasks.
... Another moderating factor could be sex, although this is still a debated issue. Indeed, data in the literature show that female participants perform better than male participants in tasks assessing certain skills related to communicative-pragmatic ability, such as verbal fluency [40], emotional prosody [41,42], and social appropriateness [43]. However, other research does not support these findings and has reported that there are no structural or functional differences between the female and the male brain [22,44]. ...
Article
Full-text available
Background Ageing refers to the natural and physiological changes that individuals experience over the years. This process also involves modifications in terms of communicative-pragmatics, namely the ability to convey meanings in social contexts and to interact with other people using various expressive means, such as linguistic, extralinguistic and paralinguistic aspects of communication. Very few studies have provided a complete assessment of communicative-pragmatic performance in healthy ageing. Methods The aim of this study was to comprehensively assess communicative-pragmatic ability in three samples of 20 (N = 60) healthy adults, each belonging to a different age range (20–40, 65–75, 76–86 years old) and to compare their performance in order to observe any potential changes in their ability to communicate. We also explored the potential role of education and sex on the communicative-pragmatic abilities observed. The three age groups were evaluated with a between-study design by means of the Assessment Battery for Communication (ABaCo), a validated assessment tool characterised by five scales: linguistic, extralinguistic, paralinguistic, contextual and conversational. Results The results indicated that the pragmatic ability assessed by the ABaCo is poorer in older participants when compared to the younger ones (main effect of age group: F(2,56) = 9.097; p < .001). Specifically, significant differences were detected in tasks on the extralinguistic, paralinguistic and contextual scales. Whereas the data highlighted a significant role of education (F(1,56) = 4.713; p = .034), no sex-related differences were detected. Conclusions Our results suggest that the ageing process may also affect communicative-pragmatic ability and a comprehensive assessment of the components of such ability may help to better identify difficulties often experienced by older individuals in their daily life activities.
... There is evidence supporting a sex-specific role of the IFG in emotional regulation in females only. 67,68 Finally, the current findings from two cohorts suggest that sleep problems might precede mood problems in AUD females, which will need to be confirmed by future longitudinal studies. ...
Article
Full-text available
Growing evidence suggests greater vulnerability of women than men to the adverse effects of alcohol on mood and sleep. However, the underlying neurobiological mechanisms are still poorly understood. Here we examined sex differences in resting state functional connectivity in alcohol use disorder using a whole-brain data-driven approach and tested for relationships with mood and self-reported sleep. To examine whether sex effects vary by severity of alcohol use disorder, we studied two cohorts: n = 141 non-treatment-seeking participants with alcohol use disorder (low severity; 58 females) from the Human Connectome Project, and n = 102 recently detoxified, treatment-seeking participants with alcohol use disorder (high severity; 34 females) at the National Institute on Alcohol Abuse and Alcoholism. For both cohorts, participants with alcohol use disorder had greater sleep and mood problems than healthy controls, whereas the sex by alcohol use effect varied by severity. Non-treatment-seeking females with alcohol use disorder showed significantly greater impairments in sleep but not mood compared to non-treatment-seeking males with alcohol use disorder, whereas treatment-seeking females with alcohol use disorder reported greater negative mood but not sleep problems than treatment-seeking males with alcohol use disorder. Greater sleep problems in non-treatment-seeking females with alcohol use disorder were associated with lower cerebello-parahippocampal functional connectivity, while greater mood problems in treatment-seeking females with alcohol use disorder were associated with lower fronto-occipital functional connectivity during rest. The current study suggests that changes in resting state functional connectivity may account for sleep and mood impairments in females with alcohol use disorder. The effect of severity on sex differences might reflect neuroadaptive processes with progression of alcohol use disorder and needs to be tested with longitudinal data in the future.
... gunshots), elicit increased responses in the temporal auditory cortex when compared with neutral stimulation (Belin et al. 2004; Grandjean et al. 2005; Wiethoff et al. 2008; Ethofer et al. 2012; Fruhholz et al. 2012; Arnal et al. 2015). Emotion detection in sounds is also reflected in the modulations of the event-related potential (ERP) responses, with differential effects starting ∼100-200 ms after stimulus-onset (De Baene et al. 2004; Goydke et al. 2004; Schirmer et al. 2004, 2005; Sauter and Eimer 2010; Liu et al. 2012; Hung and Cheng 2014; Pannese et al. 2015), or in modulations of oscillatory neural activity in specific frequency ranges, such as in the delta, theta, beta, or gamma bands (Symons et al. 2016; Blume et al. 2017). The limbic system, including the amygdala, is believed to play a pivotal role in such amplification of sensory responses (Vuilleumier 2005; Adolphs 2008; see Pannese et al. 2015 for review). ...
Article
Full-text available
The waking brain efficiently detects emotional signals to promote survival. However, emotion detection during sleep is poorly understood and may be influenced by individual sleep characteristics or neural reactivity. Notably, dream recall frequency has been associated with stimulus reactivity during sleep, with enhanced stimulus-driven responses in high versus low recallers. Using electroencephalography (EEG), we characterized the neural responses of healthy individuals to emotional and neutral voices and to control stimuli, both during wakefulness and NREM sleep. We then tested how these responses varied with individual dream recall frequency. Event-related potentials (ERPs) differed for emotional versus neutral voices, both in wakefulness and NREM sleep. Likewise, EEG arousals (sleep perturbations) increased selectively after the emotional voices, indicating emotion reactivity. Interestingly, sleep ERP amplitude and arousals after emotional voices increased linearly with participants’ dream recall frequency. Similar correlations with dream recall were observed for beta and sigma responses, but not for theta. In contrast, dream recall correlations were absent for neutral or control stimuli. Our results reveal that brain reactivity to affective salience is preserved during NREM sleep and is selectively associated with individual memory for dreams. Our findings also suggest that emotion-specific reactivity during sleep, rather than generalized alertness, may contribute to the encoding and retrieval of dreams.
... The probability that such beliefs constitute a dysfunctional coping mechanism here, as they likely do in the general population, is strong (Fountoulakis et al., 2020a; Freyler et al., 2019; Tomljenovic et al., 2020). This probability of an affective component within a dysfunctional coping mechanism accords with the finding that believing was more frequent in females and could be explained by higher temperamental levels of anxiety and harm avoidance (Sacher et al., 2013; Aleman and Swart, 2008; Fischer et al., 2004; Lee et al., 2005; Lee et al., 2002; McClure et al., 2004; Schirmer et al., 2004; Schroeder et al., 2004; Fusar-Poli et al., 2009). ...
Article
Introduction The aim of the study was to investigate mental health and conspiracy theory beliefs concerning COVID-19 among Health Care Professionals (HCPs). Material and Methods During lockdown, an online questionnaire gathered data from 507 HCPs (432 females aged 33.86±8.63 and 75 males aged 39.09±9.54). Statistical Analysis A post-stratification method was used to transform the study sample, and descriptive statistics were calculated. Results Anxiety and probable depression were increased 1.5- to 2-fold and were higher in females and nurses. A previous history of depression was the main risk factor. Rates of belief in conspiracy theories concerning COVID-19 were alarming, with the majority of individuals (especially females) endorsing some theory to at least some extent. Conclusions The current paper reports high rates of depression, distress, and suicidal thoughts in HCPs during the lockdown, together with a high prevalence of beliefs in conspiracy theories. Female gender and a previous history of depression acted as risk factors, while belief in conspiracy theories might act as a protective factor. The results should be considered with caution due to the nature of the data (an online survey of a self-selected but stratified sample).
... words denoting emotions). Interestingly, neuroimaging research has also suggested that semantic processing in men is not as susceptible to influences from emotional prosody as semantic processing in women during emotional speech perception [34]. Emotional semantics have been reported to elicit bilateral activation of the inferior frontal gyrus in women but only left-hemisphere activation in men. ...
... Furthermore, the results of a neuroimaging study by Schirmer et al. (2004) on emotional speech processing suggest that male listeners are less susceptible to intonation manipulations as evidenced by reduced IFG (inferior frontal gyrus) contrast activations for incongruent versus congruent speech presentation. ...
Article
Focus highlights the fact that contextual alternatives are relevant for the interpretation of an utterance. For example, if someone says: “The meeting is on TUESDAY,” with focus marked by a pitch accent on “Tuesday,” the speaker might want to correct the assumption that the meeting is on Monday (an alternative date). Intonation as one way to signal focus was manipulated in a delayed-recall paradigm. Recall of contextual alternatives was tested in a condition where a set of alternatives was evoked by contrastive intonation. A control condition used intonation contours reported for broad focus in German. It was hypothesized that contrastive intonation improves recall, just as focus-sensitive particles (e.g., ‘only’) do, compared to sentences without such particles. Participants listened to short texts introducing a list of three elements from taxonomic categories. One of the three elements was re-mentioned in a subsequent critical sentence, realized with either a broad (H+!H*) or with a contrastive intonation contour (L+H*). Cued recall of the list elements was tested block-wise. Results show that contrastive intonation enhances recall for focus alternatives. In addition, it was found that the observed recall benefit is predominantly driven by females. The results support the assumption that contextual alternatives are better encoded in memory irrespective of whether focus is expressed prosodically or by a focus-sensitive particle. The results further show that females are more sensitive to pragmatic information conveyed through prosody than males.
... Audiovisual integration of emotional signals was observed within the first 200 ms or less (Jessen and Kotz, 2011b; Kokinous et al., 2015), and involves areas in the right ST (Awwad Shiekh Hasan et al., 2016; Hagan et al., 2013; Young, 2016), both in pST (Kreifelts et al., 2007; Perrodin et al., 2015; Watson et al., 2014) and in more aST regions (Gainotti et al., 2008; Perrodin et al., 2015), as well as the amygdala (Milesi et al., 2014). Emotional nonverbal voice signal information can influence not only the way we visually scan emotional faces, but also the way we auditorily perceive verbal emotional information (Schirmer et al., 2004). Integrating vocal and facial signal information also concerns more complex socio-affective information, such as attractiveness (Groyecka et al., 2017), dominance, and trustworthiness (Mileva et al., 2018). ...
Article
Full-text available
While humans have developed a sophisticated and unique system of verbal auditory communication, they also share a more common and evolutionarily important nonverbal channel of voice signaling with many other mammalian and vertebrate species. This nonverbal communication is mediated and modulated by the acoustic properties of a voice signal, and is a powerful – yet often neglected – means of sending and perceiving socially relevant information. From the viewpoint of dyadic (involving a sender and a signal receiver) voice signal communication, we discuss the integrated neural dynamics in primate nonverbal voice signal production and perception. Most previous neurobiological models of voice communication modelled these neural dynamics from the limited perspective of either voice production or perception, largely disregarding the neural and cognitive commonalities of both functions. Taking a dyadic perspective on nonverbal communication, however, it turns out that the neural systems for voice production and perception are surprisingly similar. Based on the interdependence of both production and perception functions in communication, we first propose a re-grouping of the neural mechanisms of communication into auditory, limbic, and paramotor systems, with special consideration for a subsidiary basal-ganglia-centered system. Second, we propose that the similarity in the neural systems involved in voice signal production and perception is the result of the co-evolution of nonverbal voice production and perception systems promoted by their strong interdependence in dyadic interactions.
... They tested the influence of emotionally charged speech on brain activity by manipulating the prosody of positive and negative words. The results revealed a difference between the two genders in the perception and expression of emotions, explained by the fact that semantic processing in women is more sensitive to emotional influences than it is in men (Schirmer, Zysset, Kotz, & von Cramon, 2004). Syssau & Monnier (2009), for their part, reported gender differences in children's perception of emotional valence and arousal value. ...
Preprint
Full-text available
Accepted paper for publication in Revue Tunisienne de Sciences Sociales. Introduction: Emotions, whether positive or negative, are considered fundamental psychological states in human beings. They occupy a central place in modulating cognitive activity, from perception to decision-making. The emotional valence of words influences their comprehension and their use in communication. Objective: This work aims to assess, on the one hand, the variation in emotional valence (EV) and arousal value elicited by a corpus of 123 Arabic words and 123 equivalent French words, and, on the other hand, the impact of age and gender on the evaluation of these two constituents of emotional experience. Method: A sample of 414 Tunisian school-age children rated the emotional characteristics of this lexical corpus, translated from French into Arabic, following standard instructions. Results: The predicted effect of language on emotional evaluation was confirmed, with variations across ages and genders in judgments of EV and arousal value. This study thereby provides a lexical norm of 123 Arabic words (EmoLex123) rated on EV and arousal by school-age children. Our research adds to existing studies on the influence of age and gender on the emotional characteristics of a lexical corpus and enriches the limited body of work on the interactions between language and emotion. Keywords: Cognition, Emotion, Emotional valence, Arousal value, Age, Gender
... Some studies suggest that women could have an advantage in processing pragmatic information conveyed by prosody. Gender differences in the interpretation of intonation have been examined most frequently in investigations on emotional speech processing (e.g., Hung & Cheng, 2014;Schirmer et al., 2002;Schirmer, Striano, & Friederici, 2005;Schirmer, Zysset, Kotz, & von Cramon, 2004;Wildgruber et al., 2002). More closely related to our experiment are findings of gender differences obtained from studies on the interpretation or production of coherent narratives (e.g., Frank, Baron-Cohen, & Ganzel, 2015;Kaiser, Kuenzli, Zappatore, & Nitsch, 2007;Kansaku, Yamaura, & Kitazawa, 2000). ...
Article
Full-text available
In tonal languages, the role of intonation in information-structuring has yet to be fully investigated. Intuitively, one would expect intonation to play only a small role in expressing communicative functions. However, experimental studies with Vietnamese native speakers show that intonation contours vary across different contexts and are used to mark certain types of information, for example, focus (Jannedy, 2007). In non-tonal languages (e.g., English), the marking of focus by intonation can influence the processing of focus alternatives (Fraundorf, Watson, & Benjamin, 2010). If Vietnamese also uses intonation to mark focus, the question arises whether the behavioral consequences of prosodic focus marking in Vietnamese are comparable to languages such as English or German. To test this, we replicate a study on memory for focus alternatives, originally carried out in German (Koch & Spalek, in progress), with Vietnamese language stimuli. In the original study, memory for focus alternatives was improved in a delayed recall task for focused elements produced with contrastive intonation in female speakers. Here, we replicate this finding with Northern Vietnamese native speakers: Contrastive intonation seems to improve later recall for focus alternatives in Northern Vietnamese, but only for female participants, in line with the findings by Koch and Spalek (in progress). These results indicate that prosodic focus marking in Vietnamese makes alternatives to the focused element more salient.
... Second, this study did not investigate sex-based differences in participants. Past studies have attested to sex differences in processing of emotional speech and emotional prosody of vocalisations that are reflected in the brain [46,47]. Moreover, across both auditory and visual modalities, women have consistently been shown to be more adept at emotional recognition compared to men [48][49][50][51]. ...
Article
Full-text available
The social context in which a salient human vocalisation is heard shapes the affective information it conveys. However, few studies have investigated how visual contextual cues lead to differential processing of such vocalisations. The prefrontal cortex (PFC) is implicated in the processing of contextual information and the evaluation of the saliency of vocalisations. Using functional Near-Infrared Spectroscopy (fNIRS), we investigated PFC responses of young adults (N = 18) to emotive infant and adult vocalisations while they passively viewed scenes from two categories of environmental contexts: a domestic environment (DE) and an outdoors environment (OE). Compared to a home setting (DE), which is associated with a fixed mental representation (e.g., one expects to see a living room in a typical house), the outdoor setting (OE) is more variable and less predictable, and thus might demand greater processing effort. In our previous study (Azhari et al., 2018), which employed the same experimental paradigm, the OE context elicited greater physiological arousal than the DE context. Similarly, we hypothesised that greater PFC activation would be observed when salient vocalisations were paired with the OE rather than the DE condition. Our finding supported this hypothesis: the left rostrolateral PFC, an area of the brain that facilitates relational integration, exhibited greater activation in the OE than in the DE condition, which suggests that greater cognitive resources are required to process outdoor situational information together with salient vocalisations. The results of this study deepen our understanding of how contextual information differentially modulates the processing of salient vocalisations.
... Similar to studies of affective congruence for discrete emotions using prosody and content in spoken sentences, we identified the inferior frontal gyrus (Wittfoth et al., 2009; Mitchell, 2006; Schirmer, Zysset, Kotz, & von Cramon, 2004). Previous studies have suggested that the inferior frontal cortex is associated with domain-general conflict resolution and cognitive control processes (Novick, Trueswell, & Thompson-Schill, 2010; Nelson, Reuter-Lorenz, Persson, Sylvester, & Jonides, 2009; Derrfuss, Brass, Neumann, & von Cramon, 2005; Novick, Trueswell, & Thompson-Schill, 2005). ...
Article
Full-text available
Evaluating multisensory emotional content is a part of normal day-to-day interactions. We used fMRI to examine brain areas sensitive to congruence of audiovisual valence and their overlap with areas sensitive to valence. Twenty-one participants watched audiovisual clips with either congruent or incongruent valence across visual and auditory modalities. We showed that affective congruence versus incongruence across visual and auditory modalities is identifiable on a trial-by-trial basis across participants. Representations of affective congruence were widely distributed with some overlap with the areas sensitive to valence. Regions of overlap included bilateral superior temporal cortex and right pregenual anterior cingulate. The overlap between the regions identified here and in the emotion congruence literature lends support to the idea that valence may be a key determinant of affective congruence processing across a variety of discrete emotions.
... Structural brain imaging studies in healthy adults have revealed larger gray matter volumes in the left IFG (opercularis and triangularis) in females compared to males (Im et al., 2006;Ruigrok et al., 2014;Kurth et al., 2017). Functional brain imaging studies in healthy adults have also indicated that females show greater activation in the IFG than males during emotional speech perception (Schirmer et al., 2004) and mental rotation (Hugdahl et al., 2006) tasks. Our findings are consistent with these previous reports suggesting sex differences in the function of the left IFG. ...
Article
Full-text available
Differences between males and females in brain development and in the organization and hemispheric lateralization of brain functions have been described, including in language. Sex differences in language organization may have important implications for language mapping performed to assess, and minimize neurosurgical risk to, language function. This study examined the effect of sex on the activation and functional connectivity of the brain, measured with presurgical functional magnetic resonance imaging (fMRI) language mapping in patients with a brain tumor. We carried out a retrospective analysis of data from neurosurgical patients treated at our institution who met the criteria of pathological diagnosis (malignant brain tumor), tumor location (left hemisphere), and fMRI paradigms [sentence completion (SC); antonym generation (AG); and resting-state fMRI (rs-fMRI)]. Forty-seven patients (22 females, mean age = 56.0 years) were included in the study. Across the SC and AG tasks, females relative to males showed greater activation in limited areas, including the left inferior frontal gyrus classically associated with language. In contrast, males relative to females showed greater activation in extended areas beyond the classic language network, including the supplementary motor area (SMA) and precentral gyrus. The rs-fMRI functional connectivity of the left SMA in the females was stronger with inferior temporal pole (TP) areas, and in the males with several midline areas. The findings are overall consistent with theories of greater reliance on specialized language areas in females relative to males, and generalized brain areas in males relative to females, for language function. Importantly, the findings suggest that sex could affect fMRI language mapping. Thus, considering sex as a variable in presurgical language mapping merits further investigation.
... Ricoeur, 2003), and how basic demographics affect emotional speech perception (e.g. Paulmann et al., 2008;Schirmer et al., 2004). By contrast, there has been comparatively little empirical research examining why emotional reactions to music vary across situations and people. ...
Conference Paper
Full-text available
The capacity of listeners to perceive or experience emotions in response to music depends on many factors including dispositional traits, empathy, and musical enculturation. Emotional responses are also known to be mediated by pharmacological factors, including both legal and illegal drugs. Existing research has established that acetaminophen, a common over-the-counter pain medication, blunts emotional responses (e.g., Durso, Luttrell, & Way, 2015). The current study extends this research by examining possible effects of acetaminophen on both perceived and felt responses to emotionally-charged sound stimuli. Additionally, it tests whether acetaminophen effects are specific for particular emotions (e.g. sadness, fear) or whether acetaminophen blunts emotional responses in general. The experiment employs a randomized, double-blind, parallel-group, placebo-controlled design. Participants are randomly assigned to ingest acetaminophen or a placebo. Then, they are asked to complete two experimental blocks regarding musical and non-musical sounds. The first block asks participants to judge the extent to which a sound conveys a certain affect (on a Likert scale). The second block aims to examine a listener's emotional responses to sound stimuli. The study is currently in progress; here, preliminary results are reported for 19 participants of a planned 200 cohort. In light of the fact that some 50 million Americans take acetaminophen each week, if the final results prove consistent with existing research on the emotional blunting of acetaminophen, this suggests that future studies in music and emotion might consider controlling for the pharmacological state of participants.
... The voice is likely the most important sound category in a social environment. It carries not only verbal information, but also socially relevant cues about the speaker, such as his/her identity, sex, age, and emotional state (Belin et al. 2004; Schirmer et al. 2004). Vocal emotions can be communicated either through suprasegmental modulations of speech prosody (Schirmer et al. 2005) or through short nonverbal vocalizations (Schröder 2003), also known as affective bursts (Belin et al. 2008). ...
Article
Full-text available
How much acoustic signal is enough for an accurate recognition of nonverbal emotional vocalizations? Using a gating paradigm (7 gates from 100 to 700 ms), the current study probed the effect of stimulus duration on recognition accuracy of emotional vocalizations expressing anger, disgust, fear, amusement, sadness and neutral states. Participants (n = 52) judged the emotional meaning of vocalizations presented at each gate. Increased recognition accuracy was observed from gates 2 to 3 for all types of vocalizations. Neutral vocalizations were identified with the shortest amount of acoustic information relative to all other types of vocalizations. A shorter acoustic signal was required to decode amusement compared to fear, anger and sadness, whereas anger and fear required equivalent amounts of acoustic information to be accurately recognized. These findings confirm that the time course of successful recognition of discrete vocal emotions varies by emotion type. Compared to prior studies, they additionally indicate that the type of auditory signal (speech prosody vs. nonverbal vocalizations) determines how quickly listeners recognize emotions from a speaker’s voice.
... The electrophysiological effect of gender on N2ac ERPs is consistent with prior studies showing that female participants were more sensitive to vocal emotions than male participants when vocal emotions were task-irrelevant (for instance Schirmer et al., 2013;Schirmer, Kotz, & Friederici, 2002;Schirmer, Striano, & Friederici, 2005;Schirmer, Zysset, Kotz, & von Cramon, 2004). However, the current study was not designed to address gender differences, which explains why there were fewer male than female participants (Nmale= 13 vs. ...
Article
Full-text available
Salient vocalizations, especially aggressive voices, are believed to attract attention due to an automatic threat detection system. However, studies assessing the temporal dynamics of auditory spatial attention to aggressive voices are missing. Using event-related potential markers of auditory spatial attention (N2ac and LPCpc), we show that attentional processing of threatening vocal signals is enhanced at two different stages of auditory processing. As early as 200 ms post stimulus onset, attentional orienting/engagement is enhanced for threatening as compared to happy vocal signals. Subsequently, as early as 400 ms post stimulus onset, the reorienting of auditory attention to the center of the screen (or disengagement from the target) is enhanced. This latter effect is consistent with the need to optimize perception by balancing the intake of stimulation from left and right auditory space. Our results extend the scope of theories from the visual to the auditory modality by showing that threatening stimuli also bias early spatial attention in the auditory modality. Attentional enhancement was only present in female and not in male participants.
... During implicit tasks, sex differences are more robust. A greater female sensitivity to emotional expressions has been established with a range of paradigms (Proverbio, Adorni, Zani, & Trestianu, 2009;Proverbio & Galli, 2016;Proverbio, Zani, & Adorni, 2008) including behavioral interference tasks (Schirmer, Chen, Ching, Tan, & Hong, 2013;Schirmer, Seow, & Penney, 2013;Schirmer, Zysset, Kotz, & Yves von Cramon, 2004) as well as passive change detection assessed with online measures of brain activity (Fan, Hsu, & Cheng, 2013;Schirmer, Striano, & Friederici, 2005). Additionally and of particular relevance here, eye tracking of passive gazing at social interactions indicates that women are more likely than men to fixate on instances of touch suggesting that such touch is more readily noticed and perhaps more salient to them (Schirmer, Ng, & Ebstein, 2018). ...
Article
Observers can simulate aspects of other people's tactile experiences. We asked whether they do so when faced with full-body social interactions, whether emerging representations go beyond basic sensorimotor mirroring, and whether they depend on processing goals and inclinations. In an EEG/ERP study, we presented line-drawn, dyadic interactions with and without affectionate touch. In an explicit and an implicit task, participants categorized images into touch versus no-touch and same versus opposite sex interactions, respectively. Modulations of central Rolandic rhythms implied that affectionate touch displays engaged sensorimotor mechanisms. Additionally, the late positive potential (LPP) being larger for images with as compared to without touch pointed to an involvement of higher order socio-affective mechanisms. Task and sex modulated touch perception. Sensorimotor responding, indexed by Rolandic rhythms, was fairly independent of the task but appeared less effortful in women than in men. Touch induced socio-affective responding, indexed by the LPP, declined from explicit to implicit processing in women and disappeared in men. In sum, this study provides first evidence that vicarious touch from full-body social interactions entails shared sensorimotor as well as socio-affective experiences. Yet, mental representations of touch at a socio-affective level are more likely when touch is goal relevant and observers are female. Together, these results outline the conditions under which touch in visual media may be usefully employed to socially engage observers.
... functional activations (Blanton et al., 2004;Schirmer et al., 2004;Hofer et al., 2006;Im et al., 2006a;Koch et al., 2007;Wang et al., 2007;Schulte-Rüther et al., 2008;Brun et al., 2009;Li et al., 2014d;Meng et al., 2014). Specifically, the cortical volume of IFG, the cortical thickness of IFG and STG, and the surface areas of auditory structure (parts of the STG) and cingulate region, are larger in females than in males, which might be related to the general higher language skills in females (Blanton et al., 2004;Im et al., 2006a;Brun et al., 2009). ...
Article
Full-text available
The highly convoluted cortical folding of the human brain is intriguingly complex and variable across individuals. Exploring the underlying representative patterns of cortical folding is of great importance for many neuroimaging studies. At term birth, all major cortical folds are established and are minimally affected by the complicated postnatal environments; hence, neonates are the ideal candidates for exploring early postnatal cortical folding patterns, which yet remain largely unexplored. In this paper, we propose a novel method for exploring the representative regional folding patterns of infant brains. Specifically, first, multi-view curvature features are constructed to comprehensively characterize the complex characteristics of cortical folding. Second, for each view of curvature features, a similarity matrix is computed to measure the similarity of cortical folding in a specific region between any pair of subjects. Next, a similarity network fusion method is adopted to nonlinearly and adaptively fuse all the similarity matrices into a single one for retaining both shared and complementary similarity information of the multiple characteristics of cortical folding. Finally, based on the fused similarity matrix and a hierarchical affinity propagation clustering approach, all subjects are automatically grouped into several clusters to obtain the representative folding patterns. To show the applications, we have applied the proposed method to a large-scale dataset with 595 normal neonates and discovered representative folding patterns in several cortical regions, i.e., the superior temporal gyrus (STG), inferior frontal gyrus (IFG), precuneus, and cingulate cortex. Meanwhile, we have revealed sex difference in STG, IFG, and cingulate cortex, as well as hemispheric asymmetries in STG and cingulate cortex in terms of cortical folding patterns. 
Moreover, we have also validated the proposed method on a public adult dataset, i.e., the Human Connectome Project (HCP), and revealed that certain major cortical folding patterns of adults are largely established at term birth.
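The pipeline described above (per-view similarity matrices, fusion into a single matrix, then affinity propagation clustering on the fused similarities) can be sketched in miniature. Everything below is illustrative: synthetic data stands in for curvature features, and plain averaging stands in for the paper's similarity network fusion step, which is a richer nonlinear iterative scheme.

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
n = 20  # number of subjects in this toy example

# Two hypothetical "views" of cortical-folding similarity between subjects
# (stand-ins for the paper's multi-view curvature features).
views = []
for _ in range(2):
    x = rng.normal(size=(n, 5))
    x[n // 2:] += 3.0  # build in two groups so clustering has structure
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    views.append(np.exp(-d**2 / d.mean()**2))  # Gaussian similarity kernel

# Naive fusion: average the per-view similarity matrices (a placeholder
# for similarity network fusion, used here only for illustration).
fused = np.mean(views, axis=0)

# Cluster subjects directly on the fused similarity matrix.
labels = AffinityPropagation(affinity="precomputed",
                             random_state=0).fit_predict(fused)
print(labels)
```

The key design point carried over from the abstract is that clustering operates on a precomputed subject-by-subject similarity matrix rather than on raw features, so any fusion scheme that outputs such a matrix can be swapped in.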
... Briefly, an example of a perceptual sex difference is an olfaction study finding that women showed greater activation than men in the left orbitofrontal cortex when presented with pleasant or unpleasant stimuli (Royet, Plailly, Delon-Martin, Kareken, & Segebarth, 2003). An example of a perceptual sex difference that also relates to psychosocial functioning is the finding from speech processing research that women recruited the inferior frontal gyri to a greater extent than did men in response to hearing emotional prosody that was incongruous with semantic content (Schirmer, Zysset, Kotz, & Yves von Cramon, 2004). In terms of cognitive sex differences, executive and visuospatial functioning appear to be lateralized in opposite hemispheres for men and women: based on findings of impaired capacity for activities of daily living associated with lesions of the ventromedial prefrontal cortex and amygdala, Koscik, Bechara, and Tranel (2010) concluded that executive function in healthy individuals is right-lateralized in men and left-lateralized in women. ...
Thesis
Full-text available
Current developmental models of gender identity and gender dysphoria (GD) lack sex-specific profiles of brain function that differentiate between typically-developing and cross-gender identified youth, as postulated by models like the unified theory of the origins of sex differences (Arnold, 2009) and the neurobiological theory of the origins of transsexuality (Swaab & Garcia-Falgueras, 2009). Previously, investigators have used brain imaging modalities such as resting-state functional magnetic resonance imaging (R-fMRI) to demonstrate differences in resting-state functional connectivity (RSFC) between typically-developing male and female youth, and between typically-developing and GID-diagnosed youth. In the present pilot study, I used R-fMRI to investigate differences in RSFC between typically-developing and cross-gender identified male and female youth subgroups, with the hypothesis that GID-diagnosed subgroups would demonstrate connectivity patterns in between those of typically-developing males and females. Eleven youth diagnosed with gender identity disorder (four males, ages 9 to 20 years; seven females, ages 12 to 20 years) were matched on age and assigned gender with 11 typically-developing youth. All participants provided written informed consent to undergo the IRB-approved research procedures. R-fMRI data were collected while the participants were lying down and resting, with their eyes closed. Primary analyses focused on 14 brain regions selected because they showed sex differences most frequently or reliably in previous studies of R-fMRI in typically-developing youth. Statistical analysis used a 2 x 2 mixed effects analysis (assigned female versus assigned male x typically-developing versus GID-diagnosed), with individual-level connectivity maps as the dependent variable. Results showed that significant interaction effects of functional connectivity patterns were associated with 6 of the 14 selected brain regions.
GID-diagnosed assigned females exhibited connectivity patterns similar to those of typically-developing males associated with the right medial superior frontal gyrus, right supplementary motor area, left lingual gyrus, right lingual gyrus, left middle frontal gyrus, left medial superior frontal gyrus, left cuneus, right thalamus, left dorsolateral superior frontal gyrus, and left inferior frontal gyrus, triangular part. GID-diagnosed assigned males exhibited functional connectivity patterns similar to those of typically-developing females associated with the right medial superior frontal gyrus and right supplementary motor area; in between those of typically-developing females and males associated with left lingual gyrus, right lingual gyrus, left middle frontal gyrus, left medial superior frontal gyrus, right medial superior frontal gyrus, left dorsolateral superior frontal gyrus, and left inferior frontal gyrus, triangular part; and similar to typically-developing males associated with the right lingual gyrus and left middle frontal gyrus. The right precuneus, hypothesized to show robust findings, did not reveal any effects. In the current study, GID-diagnosed assigned males tended toward demasculinized effects (quantitative interactions showing differences of magnitude), whereas GID-diagnosed assigned females tended toward masculinized effects (qualitative interactions showing differences in direction of correlation). The current findings support the view that brain development associated with gender dysphoria proceeds along separate but overlapping sex-related regions for GID-diagnosed assigned females and males and provide further evidence of greater cross-gender brain differentiation in assigned females at an earlier age than in assigned males (possibly due to earlier onset of puberty in females). 
These data suggest that any future use of patterns of brain function for diagnosing gender dysphoria may require separate criteria (e.g., different sets of brain regions) for assigned males and assigned females but will require replication on larger samples.
... This finding accords with neuroanatomical evidence that males tend to engage bilateral prefrontal regions whereas females tend to engage the bilateral amygdala when faced with emotional information (AlRyalat, 2017; Filkowski et al., 2017). Indeed, females have long been believed to outperform males at recognizing emotional expressions (McClure, 2000; Li et al., 2008; Yuan et al., 2009; Donges et al., 2012; Erol et al., 2013; Lee et al., 2013; Weisenbach et al., 2014; Mason et al., 2016) and to be more prone to influence from emotional information (Schirmer et al., 2002, 2004; Kim and Son, 2015). The current conspicuous female advantage in emotion decoding during both the early stage of motivational salience monitoring (RewP) and the late stage of cognitive appraisal processing (feedback P300) is in line with findings that gender differences in emotion processing emerge at both the early stage of emotion extraction (Lee et al., 2017) and the late stage of in-depth emotion processing (Orozco and Ehlers, 1998). ...
Article
Full-text available
It is widely believed that females outperform males in emotional information processing. The present study tested whether this female superiority exists in a naturalistic social-emotional context and, if so, what temporal dynamics underlie it. Behavioral and electrophysiological responses were recorded while participants performed an interpersonal gambling game with opponents' facial emotions given as feedback. The results showed that emotional cues modulated the influence of monetary feedback on outcome valuation. Critically, this modulation was more conspicuous in females: opponents' angry expressions increased females' risky tendency and decreased the amplitude of the reward positivity (RewP) and feedback P300. These findings indicate that females are more sensitive to emotional expressions in real interpersonal interactions, which is manifested in both the early motivational salience detection and the late conscious cognitive appraisal stages of feedback processing.
... Supplementary analyses revealed an interaction between gender and vocal signal on the N2ac, with a larger N2ac in female than in male participants, but not on the LPCpc (see Supplementary Material). Consistent with this, prior studies have found that female participants were more sensitive to aggressive signals than male participants were (for instance Schirmer, Chen, Ching, Tan, & Hong, 2013; Schirmer, Kotz, & Friederici, 2002; Schirmer, Striano, & Friederici, 2005; Schirmer, Zysset, Kotz, & Yves von Cramon, 2004). However, while we acknowledge the relevance of this result, the current study was not designed to address gender differences. ...
Preprint
Salient vocalizations, especially aggressive voices, are believed to attract attention due to an automatic threat detection system. However, studies assessing the temporal dynamics of auditory spatial attention to aggressive voices are missing. Using event-related potential markers of auditory spatial attention (N2ac and LPCpc), we show that attentional processing of threatening vocal signals is enhanced at two different stages of auditory processing. As early as 200 ms post stimulus onset, attentional orienting/engagement is enhanced for threatening as compared to happy vocal signals. Subsequently, as early as 400 ms post stimulus onset, the reorienting of auditory attention to the center of the screen (or disengagement from the target) is enhanced. This latter effect is consistent with the need to optimize perception by balancing the intake of stimulation from left and right auditory space. Our results extend the scope of theories from the visual to the auditory modality by showing that threatening stimuli also bias early spatial attention in the auditory modality. Although not the focus of the present work, we observed that the attentional enhancement was more pronounced in female than male participants.
Chapter
Neuroscientific research on emotion has developed dramatically over the past decade. The cognitive neuroscience of human emotion, which has emerged as the new and thriving area of 'affective neuroscience', is rapidly rendering existing overviews of the field obsolete. This handbook provides a comprehensive, up-to-date and authoritative survey of knowledge and topics investigated in this cutting-edge field. It covers a range of topics, from face and voice perception to pain and music, as well as social behaviors and decision making. The book considers and interrogates multiple research methods, among them brain imaging and physiology measurements, as well as methods used to evaluate behavior and genetics. Editors Jorge Armony and Patrik Vuilleumier have enlisted well-known and active researchers from more than twenty institutions across three continents, bringing geographic as well as methodological breadth to the collection. This timely volume will become a key reference work for researchers and students in the growing field of neuroscience.
Preprint
Full-text available
Background: Effective doctor-patient interaction plays a crucial role in fostering trust between doctors and patients. This study aimed to construct an interactive positive evolution model of doctor-patient trust by incorporating key influencing factors (interaction needs and interaction modes) in the doctor-patient interaction context, drawing from the interpersonal trust model. This new model includes three trust stages: the bounded rational trust stage, the knowledge-based trust stage, and the identification-based trust stage. Methods: To validate the efficacy of the new model, interactive empirical experiments were conducted using functional near-infrared spectroscopy (fNIRS). Results: The behavioral and fNIRS results revealed that the knowledge-based trust stage, characterized by mutual understanding, exhibited higher levels of doctor-patient trust, accompanied by increased inter-brain synchronization in the left inferior frontal gyrus, compared to the bounded rational trust stage. Furthermore, the identification-based trust stage, characterized by mutual positive feedback, demonstrated the highest levels of doctor-patient trust, with increased inter-brain synchronization observed in the bilateral temporoparietal junction and the right inferior frontal gyrus. Notably, the inter-brain synchronization results in the left temporoparietal junction region were positively correlated with the doctor's trust. Conclusions: We confirmed the effectiveness of the interactive positive evolution model of doctor-patient trust and provided neural evidence that predicts the development of doctor-patient trust. This trust model offers valuable guidance for both medical student education and patient education, illuminating the path toward more effective and empathetic healthcare practices.
Chapter
Differences between the two genders are widespread in nature (Fig. 17.1) and concern a variety of characteristics, including not only the external appearance but also inclinations and skills, as well as the way of thinking and the overall behavior.
Chapter
The origins of sex differences in human behavior have been extensively studied from various theoretical perspectives, and a growing body of evidence has suggested that organization of the brain, and subsequent sex-typed behaviors, are influenced by exposure to sex hormones and the expression of specific genes during early development. Methodological advances in the study of biological bases of sex differences have shed light on mechanisms that influence sex development across the life span, though many questions remain. This chapter provides a general overview of biological approaches to the study of sex differences, with summaries of findings to date and future directions. The emphasis of the chapter is on hormones because that has been the major focus of biological approaches to sex development for decades. The chapter also touches on genetics toward the end, given some important emerging work disentangling hormonal effects from genetic effects. Keywords: Androgens; Brain function; Brain structure; Complete androgen insensitivity syndrome; Congenital adrenal hyperplasia; Gender identity; Gender role; Sexual differentiation; Sex determination; Sexual orientation
Article
Full-text available
Purpose: This study aimed to examine the Stroop effects of verbal and nonverbal cues and their relative impacts on gender differences in unisensory and multisensory emotion perception. Method: Experiment 1 investigated how well 88 normal Chinese adults (43 women and 45 men) could identify emotions conveyed through face, prosody and semantics as three independent channels. Experiments 2 and 3 further explored gender differences during multisensory integration of emotion through a cross-channel (prosody-semantics) and a cross-modal (face-prosody-semantics) Stroop task, respectively, in which 78 participants (41 women and 37 men) were asked to selectively attend to one of the two or three communication channels. Results: The integration of accuracy and reaction time data indicated that paralinguistic cues (i.e., face and prosody) of emotions were consistently more salient than linguistic ones (i.e., semantics) throughout the study. Additionally, women demonstrated advantages in processing all three types of emotional signals in the unisensory task, but only preserved their strengths in paralinguistic processing and showed greater Stroop effects of nonverbal cues on verbal ones during multisensory perception. Conclusions: These findings demonstrate clear gender differences in verbal and nonverbal emotion perception that are modulated by sensory channels, which have important theoretical and practical implications. Supplemental Material: https://doi.org/10.23641/asha.16435599
Article
Accurate interpretation of speech requires the integration of verbal and nonverbal signals. This study investigated sex differences in behavior and neural activities associated with the integration of semantic content and emotional speech prosody, while the level of autistic traits was controlled for. Adults listened to Cantonese words spoken with happy and sad prosody, and made judgments on semantic valence while event-related potentials (ERPs) were recorded. Behaviorally, men were slower than women in making semantic valence judgments. At the neural level, men had a greater congruity effect in the N400 component, whereas women had a greater congruity effect in the 1150–1300 ms time window for happy prosodies. There was no effect of sex in the case of sad prosody. Our study reveals novel findings on sex differences in the timing of the integration between verbal and non-verbal signals that cannot be explained by differences in autistic traits.
Article
The neurobiology of sex differences during language processing has been widely investigated in the past three decades. While substantial sex differences have been reported, empirical findings appear largely equivocal. The present systematic review of the literature and meta-analysis aimed to determine the degree of agreement among studies reporting sex differences in cortical activity during language processing. Irrespective of the modality and the specificity of the language task, sex differences in the BOLD signal or cerebral blood flow were highly inconsistent across fMRI and PET studies. On the temporal side, earlier latencies of auditory evoked responses in female compared to male participants were consistently observed in EEG studies during both listening and speaking. Overall, the present review and meta-analysis support the theoretical assumption that there are many more similarities than differences between men and women in the human brain during language processing. Subtle but consistent temporal differences are, however, observed in the auditory processing of phonetic cues during speech perception and production.
Article
Background: Arterial disruption during brain surgery can cause devastating injuries to wide expanses of white and gray matter beyond the tumor resection cavity. Such damage may occur as a result of disrupting blood flow through en passage arteries. Identification of these arteries is critical to prevent unforeseen neurologic sequelae during brain tumor resection. In this study, we discuss one such artery, termed the artery of aphasia (AoA), which when disrupted can lead to receptive and expressive language deficits. Methods: We performed a retrospective review of all patients undergoing an awake craniotomy for resection of a glioma by the senior author from 2012 to 2018. Patients were included if they experienced language deficits secondary to postoperative infarction in the left posterior temporal lobe in the distribution of the AoA. The gross anatomy of the AoA was then compared with activation likelihood estimations of the auditory and semantic language networks using coordinate-based meta-analytic techniques. Results: We identified 4 patients with left-sided posterior temporal artery infarctions in the distribution of the AoA on diffusion-weighted magnetic resonance imaging. All 4 patients developed substantial expressive and receptive language deficits after surgery. Functional language improvement occurred in only 2/4 patients. Activation likelihood estimations localized parts of the auditory and semantic language networks in the distribution of the AoA. Conclusions: The AoA is prone to blood flow disruption despite benign manipulation. Patients seem to have limited capacity for speech recovery after intraoperative ischemia in the distribution of this artery, which supplies parts of the auditory and semantic language networks.
Chapter
Men and women differ, not only in their anatomy but also in their behavior. Research using animal models has convincingly shown that sex differences in the brain and behavior are induced by sex hormones during a specific, hormone-sensitive period during early development. Thus, male-typical psychosexual characteristics seem to develop under the influence of testosterone, mostly acting during early development. By contrast, female-typical psychosexual characteristics may actually be organized under the influence of estradiol during a specific prepubertal period. The sexual differentiation of the human brain also seems to proceed predominantly under the influence of sex hormones. Recent studies using magnetic resonance imaging have shown that several sexually differentiated aspects of brain structure and function are female-typical in women with complete androgen insensitivity syndrome (CAIS), who have a 46 XY karyotype but a female phenotype due to complete androgen resistance, suggesting that these sex differences most likely reflect androgen action, although feminizing effects of estrogens or female-typical socialization cannot be ruled out. By contrast, some male-typical neural characteristics were also observed in women with CAIS suggesting direct effects of sex chromosome genes in the sexual differentiation of the human brain. In conclusion, the sexual differentiation of the human brain is most likely a multifactorial process including both sex hormone and sex chromosome effects, acting in parallel or in combination.
Article
Full-text available
This qualitative study used written narratives to examine gender and age patterns in body image, emotional expression, and self-esteem for a total of 209 boys and girls in the fifth, eighth, and twelfth grades. Seventy-six percent of the sample was Caucasian, 18% African-American, 5% Asian-American, and 0.5% Hispanic. A major finding indicates that boys restrict emotional expression from early adolescence through late adolescence, while girls increase emotional expression during the same age period. Another major finding suggests that girls in late childhood and adolescence are both more negatively and more positively influenced than boys by body image. Both boys' and girls' feelings about themselves are primarily influenced in gender-stereotyped ways.
Article
Full-text available
Summarizes results of 75 studies that reported accuracy for males and females at decoding nonverbal communication. The following attributes of the studies were coded: year, sample size, age of judges, sex of stimulus person, age of stimulus person, and the medium and channel of communication (e.g., photos of facial expressions, filtered speech). These attributes were examined in relation to 3 outcome indices: direction of effect, effect size (in standard deviation units), and significance level. Results show that more studies found a female advantage than would occur by chance, the average effect was of moderate magnitude and was significantly larger than zero, and more studies reached a conventional level of significance than would be expected by chance. The gender effect for visual-plus-auditory studies was significantly larger than for visual-only and auditory-only studies. The magnitude of the effect did not vary reliably with sample size, age of judges, sex of stimulus person, or age of stimulus person.
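The meta-analytic bookkeeping described above, per-study effect sizes in standard-deviation units combined across studies, can be illustrated with a toy fixed-effect average. The effect sizes, sample sizes, and the inverse-variance approximation below are all hypothetical, not taken from the review:

```python
import numpy as np

# Hypothetical per-study effect sizes (female minus male decoding accuracy,
# in SD units) and total sample sizes -- illustrative numbers only.
d = np.array([0.5, 0.2, 0.4, -0.1, 0.6, 0.3])
n = np.array([40, 60, 25, 80, 30, 50])

# Fixed-effect weighted mean: weight each study by an approximate inverse
# variance of d, here taken as n/4 for a balanced two-group design.
w = n / 4.0
d_mean = np.sum(w * d) / np.sum(w)
print(round(d_mean, 2))
```

With these weights the pooled estimate is simply the sample-size-weighted mean, which is why large null-ish studies (like the fourth entry above) pull the average down more than small positive ones pull it up.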
Article
Full-text available
We used event-related functional magnetic resonance imaging to identify brain regions involved in syntactic and semantic processing. Healthy adult males read well-formed sentences randomly intermixed with sentences which either contained violations of syntactic structure or were semantically implausible. Reading anomalous sentences, as compared to well-formed sentences, yielded distinct patterns of activation for the two violation types. Syntactic violations elicited significantly greater activation than semantic violations primarily in superior frontal cortex. Semantically incongruent sentences elicited greater activation than syntactic violations in the left hippocampal and parahippocampal gyri, the angular gyri bilaterally, the right middle temporal gyrus, and the left inferior frontal sulcus. These results demonstrate that syntactic and semantic processing result in nonidentical patterns of activation, including greater frontal engagement during syntactic processing and larger increases in temporal and temporo-parietal regions during semantic analyses.
Article
Full-text available
This study examined the neural areas involved in the recognition of both emotional prosody and phonemic components of words expressed in spoken language using echo-planar, functional magnetic resonance imaging (fMRI). Ten right-handed males were asked to discriminate words based on either expressed emotional tone (angry, happy, sad, or neutral) or phonemic characteristics, specifically, initial consonant sound (bower, dower, power, or tower). Significant bilateral activity was observed in the detection of both emotional and verbal aspects of language when compared to baseline activity. We found that the detection of emotion compared with verbal detection resulted in significant activity in the right inferior frontal lobe. Conversely, the detection of verbal stimuli compared with the detection of emotion activated left inferior frontal lobe regions most significantly. Specific analysis of the anterior auditory cortex revealed increased right hemisphere activity during the detection of emotion compared to activity during verbal detection. These findings illustrate bilateral involvement in the detection of emotion in language while concomitantly showing significantly lateralized activity in both emotional and verbal detection, in both the temporal and frontal lobes.
Article
Full-text available
In the companion to this paper (E. Zarahn, G. K. Aguirre, and M. D'Esposito, 1997, NeuroImage, 179–197), we describe an implementation of a general linear model for autocorrelated observations in which the voxel-wise false-positive rates in fMRI "noise" datasets were stabilized and brought close to theoretical values. Here, implementations of the model are tested for use with statistical parametric mapping analysis of spatially smoothed fMRI data. Analyses using varying models of intrinsic temporal autocorrelation and either including or excluding a global signal covariate were conducted upon human subject data collected under null hypothesis as well as under experimental conditions. We found that smoothing with an empirically derived impulse response function (IRF), combined with a model of the intrinsic temporal autocorrelation in spatially smoothed fMRI data, resulted in a map-wise false-positive rate which did not exceed a 5% level when a nominal α = 0.05 tabular threshold was applied. Use of other models of intrinsic temporal autocorrelation resulted in map-wise false-positive rates that significantly exceeded this level. fMRI data collected while subjects performed a behavioral task were used to examine (a) task-dependent global signal changes and (b) the dependence of sensitivity on the temporal smoothing kernel and inclusion/exclusion of a global signal covariate. The global signal changes within an fMRI dataset were shown to be influenced by the performance of a behavioral task. However, the inclusion of this measure as a covariate did not have an adverse effect on our measure of sensitivity. Finally, use of an empirically derived estimate of the IRF of the system was shown to result in greater map-wise sensitivity for signal changes than the use of a broader (in time) Poisson (parameter = 8 s) kernel.
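The core problem this abstract addresses, fitting a general linear model when fMRI noise is temporally autocorrelated, can be sketched with the simplest possible case: AR(1) noise handled by Cochrane-Orcutt-style prewhitening. This is a toy stand-in under stated assumptions, not the paper's actual autocorrelation model or smoothing scheme:

```python
import numpy as np

rng = np.random.default_rng(1)
T = 200

# Hypothetical design matrix: an intercept plus one box-car task regressor.
X = np.column_stack([np.ones(T), (np.arange(T) // 20) % 2])

# Simulate AR(1) noise, the simplest model of intrinsic temporal
# autocorrelation (the paper's own noise models are richer).
rho, beta_true = 0.4, np.array([1.0, 2.0])
e = np.zeros(T)
for t in range(1, T):
    e[t] = rho * e[t - 1] + rng.normal()
y = X @ beta_true + e

# Step 1: ordinary least squares, whose residuals estimate rho.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
r = y - X @ b_ols
rho_hat = (r[1:] @ r[:-1]) / (r[:-1] @ r[:-1])

# Step 2: prewhiten both sides with the estimated rho, then refit.
yw = y[1:] - rho_hat * y[:-1]
Xw = X[1:] - rho_hat * X[:-1]
b_gls, *_ = np.linalg.lstsq(Xw, yw, rcond=None)
print(np.round(b_gls, 2))
```

Ignoring the autocorrelation does not bias the OLS estimates here, but it distorts their standard errors, which is exactly why misspecified noise models inflate the false-positive rates the abstract reports.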
Article
Full-text available
A number of neuroimaging findings have been interpreted as evidence that the left inferior frontal gyrus (IFG) subserves retrieval of semantic knowledge. We provide a fundamentally different interpretation, that it is not retrieval of semantic knowledge per se that is associated with left IFG activity but rather selection of information among competing alternatives from semantic memory. Selection demands were varied across three semantic tasks in a single group of subjects. Functional magnetic resonance imaging signal in overlapping regions of left IFG was dependent on selection demands in all three tasks. In addition, the degree of semantic processing was varied independently of selection demands in one of the tasks. The absence of left IFG activity for this comparison counters the argument that the effects of selection can be attributed solely to variations in degree of semantic retrieval. Our findings suggest that it is selection, not retrieval, of semantic knowledge that drives activity in the left IFG.
Article
Full-text available
Neuroimaging and neuropsychological studies have implicated left inferior prefrontal cortex (LIPC) in both semantic and phonological processing. In this study, functional magnetic resonance imaging was used to examine whether separate LIPC regions participate in each of these types of processing. Performance of a semantic decision task resulted in extensive LIPC activation compared to a perceptual control task. Phonological processing of words and pseudowords in a syllable-counting task resulted in activation of the dorsal aspect of the left inferior frontal gyrus near the inferior frontal sulcus (BA 44/45) compared to a perceptual control task, with greater activation for nonwords compared to words. In a direct comparison of semantic and phonological tasks, semantic processing preferentially activated the ventral aspect of the left inferior frontal gyrus (BA 47/45). A review of the literature demonstrated a similar distinction between left prefrontal regions involved in semantic processing and phonological/lexical processing. The results suggest that a distinct region in the left inferior frontal cortex is involved in semantic processing, whereas other regions may subserve phonological processes engaged during both semantic and phonological tasks.
Article
Full-text available
Neuroimaging studies have revealed an association between word generation and activity in the left inferior frontal gyrus (IFG) that is attenuated with item repetition. The experiment reported here examined the effects of repeated word generation, under conditions in which competition was either decreased or increased, on activity measured during whole-brain echoplanar functional magnetic resonance imaging. Activity in left IFG decreased during repetition conditions that reduced competition but increased during repetition conditions that increased competition; this pattern was contrasted with repetition effects observed in other cortical areas, specifically regions of left temporal cortex. The increase in left IFG activity, which is not predicted by a simple semantic retrieval account of prefrontal function, is consistent with the hypothesis that left IFG subserves the selection of semantic knowledge among competing alternatives.
Article
Full-text available
Most neuroimaging studies of language function to date use a block-subtraction paradigm in which images acquired during relatively long periods of target stimuli are compared to those acquired during a control period. These studies typically require an overt response on the part of the subject, usually some type of discrimination or grammatical judgment by button press, or silent word generation. Results from studies of syntactic and semantic processing have generally been compatible with the classical correlation to Broca's area and Wernicke's area, respectively. Recently, functional magnetic resonance imaging (fMRI) studies departing from the block-subtraction paradigm in favor of event-related fMRI paradigms have been reported. We have extended the use of this approach to examine implicit (i.e., without an explicit task on the part of the subject) syntactic and semantic processing at the phrasal level, using visually presented verb phrases. Left BA 44 is more strongly activated for the syntactic condition than the semantic condition. BA 45, 10, and 46 show laterality differences: mostly left-lateralized for the syntactic condition and right-lateralized for the semantic condition. We also find activations of the inferior parietal lobe, consistent with a visual oddball response reported previously, and the anterior cingulate gyrus (BA 32), implicated for attention and memory-related processes in numerous studies.
Article
Full-text available
Two coordinated experiments using functional Magnetic Resonance Imaging (fMRI) investigated whether the brain represents language form (grammatical structure) separately from its meaning content (semantics). While in the scanner, 14 young, unimpaired adults listened to simple sentences that were either nonanomalous or contained a grammatical error (for example, *Trees can grew.), or a semantic anomaly (for example, *Trees can eat.). A same/different tone pitch judgment task provided a baseline that isolated brain activity associated with linguistic processing from background activity generated by attention to the task and analysis of the auditory input. Sites selectively activated by sentence processing were found in both hemispheres in inferior frontal, middle, and superior frontal, superior temporal, and temporo-parietal regions. Effects of syntactic and semantic anomalies were differentiated by some nonoverlapping areas of activation: Syntactic anomaly triggered significantly increased activity in and around Broca's area, whereas semantic anomaly activated several other sites anteriorly and posteriorly, among them Wernicke's area. These dissociations occurred when listeners were not required to attend to the anomaly. The results confirm that linguistic operations in sentence processing can be isolated from nonlinguistic operations and support the hypothesis of a specialization for syntactic processing.
Article
Full-text available
Extracting meaning from speech requires the use of pragmatic, semantic, and syntactic information. A central question is: Does the processing of these different types of linguistic information have common or distinct neuroanatomical substrates? We addressed this issue using functional magnetic resonance imaging (fMRI) to measure neural activity when subjects listened to spoken normal sentences contrasted with sentences that contained either (A) pragmatic, (B) semantic (selection restriction), or (C) syntactic (subcategorical) violations. All three contrasts revealed robust activation of the left-inferior-temporal/fusiform gyrus. Activity in this area was also observed in a combined analysis of all three experiments, suggesting that it was modulated by all three types of linguistic violation. Planned statistical comparisons between the three experiments revealed (1) a greater difference between conditions in activation of the left-superior-temporal gyrus for the pragmatic experiment than the semantic/syntactic experiments; (2) a greater difference between conditions in activation of the right-superior and middle-temporal gyrus in the semantic experiment than in the syntactic experiment; and (3) no regions activated to a greater degree in the syntactic experiment than in the semantic experiment. These data show that, while left- and right-superior-temporal regions may be differentially involved in processing pragmatic and lexico-semantic information within sentences, the left-inferior-temporal/fusiform gyrus is involved in processing all three types of linguistic information. We suggest that this region may play a key role in using pragmatic, semantic (selection restriction), and subcategorical information to construct a higher representation of meaning of sentences.
Article
Full-text available
In this study we have attempted to define the neural circuits differentially activated by cognitive interference. We used event-related functional magnetic resonance imaging (fMRI) to identify areas of the brain that are activated by the Stroop word-color task in two experiments. In the first experiment, we used infrequent, incongruent colored word stimuli to elicit strong Stroop interference (the 'conventional Stroop' paradigm). In the second experiment, we used infrequent, congruent colored words (the 'inverse Stroop' paradigm) to confirm that the regions identified in the first experiment were in fact specifically related to the Stroop effect and not to nonspecific oddball effects associated with the use of infrequent stimuli. Performance of the conventional Stroop specifically activated the anterior cingulate, insula, premotor and inferior frontal regions. These activated regions in the current experiment are consistent with those activated in fMRI experiments that use a more traditional block design. Finally, analysis of the time course of fMRI signal changes demonstrated differential onset and offset of signal changes in these activated regions. The time course results suggest that the action of various brain areas can be temporally dissociated.
Article
Full-text available
Previous neuroimaging studies have shown that activation in left inferior prefrontal cortices (LIPC) is reduced during repeated (primed) relative to initial (unprimed) stimulus processing. These reductions in anterior (approximately BA 45/47) and posterior (approximately BA 44/6) LIPC activation have been interpreted as reflecting implicit memory for initial semantic or phonological processing. However, prior studies do not unambiguously indicate that LIPC priming effects are specific to the recapitulation of higher-level (semantic and/or phonological), rather than lower-level (perceptual), processes. Moreover, no prior study has shown that the patterns of priming in anterior and posterior LIPC regions are dissociable. To address these issues, the present fMRI study examined the nature of priming in LIPC by examining the task-specificity of these effects. Participants initially processed words in either a semantic or a nonsemantic manner. Subsequently, participants were scanned while they made semantic decisions about words that had been previously processed in a semantic manner (within-task repetition), words that had been previously processed in a nonsemantic manner (across-task repetition), and words that had not been previously processed (novel words). Behaviorally, task-specific priming was observed: reaction times to make the semantic decision declined following prior semantic processing but not following prior nonsemantic processing of a word. Priming in anterior LIPC paralleled these results with signal reductions being observed following within-task, but not following across-task, repetition. Importantly, neural priming in posterior LIPC demonstrated a different pattern: priming was observed following both within-task and across-task repetition, with the magnitude of priming tending to be greater in the within-task condition. Direct comparison between anterior and posterior LIPC regions revealed a significant interaction. 
These findings indicate that anterior and posterior LIPC demonstrate distinct patterns of priming, with priming in the anterior region being task-specific, suggesting that this facilitation derives from repeated semantic processing of a stimulus.
Article
Full-text available
Is there a single executive process or are there multiple executive processes that work together towards the same goal in some task? In these experiments, we use counter switching and response inhibition tasks to examine the neural underpinnings of two cognitive processes that have often been identified as potential executive processes: the switching of attention between tasks, and the resolution of interference between competing task responses. Using functional magnetic resonance imaging (fMRI), for both event-related and blocked design tasks, we find evidence for common neural areas across both tasks in bilateral parietal cortex (BA 40), left dorsolateral prefrontal cortex (DLPFC; BA 9), premotor cortex (BA 6) and medial frontal cortex (BA 6/32). However, we also find areas preferentially involved in the switching of attention between mental counts (BA 7, BA 18) and the inhibition of a prepotent motor response (BA 6, BA 10), respectively. These findings provide evidence for the separability of cognitive processes underlying executive control.
Article
A simple procedure for analyzing multi-subject functional magnetic resonance imaging (fMRI) data is proposed. In the first step, a voxel-wise t-test across standardized z-maps is performed to identify areas that are consistently activated across subjects. In the second step, for each area, individual mean z-scores are calculated and subsequently subjected to an analysis of variance. An example is provided. J. Magn. Reson. Imaging 2000;11:61–64.
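The two-step procedure described above can be sketched in Python with simulated data (a hypothetical illustration, not the authors' implementation; subject counts, voxel counts, and thresholds are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated standardized z-maps: 12 subjects x 1000 voxels,
# with one consistently activated region (the first 50 voxels).
n_subjects, n_voxels = 12, 1000
z_maps = rng.normal(0.0, 1.0, size=(n_subjects, n_voxels))
z_maps[:, :50] += 2.0

# Step 1: voxel-wise t-test across subjects' z-maps against zero
# to identify areas consistently activated across subjects.
t_vals, p_vals = stats.ttest_1samp(z_maps, popmean=0.0, axis=0)
active = p_vals < 0.001

# Step 2: for each identified area, compute individual mean z-scores
# and subject them to an analysis of variance (here: a hypothetical
# split into two subject groups of six).
region_means = z_maps[:, active].mean(axis=1)
f_val, p_anova = stats.f_oneway(region_means[:6], region_means[6:])
```

The second-level ANOVA operates on one summary value per subject, which is what makes the procedure a simple random-effects analysis.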
Article
A novel method is presented for acquiring multislice T1-weighted images. The method utilizes non-slice-selective inversion pulses followed by a series of slice-selective excitations. k-space is divided into a number of segments equal to the number of slices. Successive segments of k-space are assigned to successive slice-selective pulses, and the order in which the slices are excited is manipulated to ensure that images of each slice have identical contrast and point spread function (PSF). This method is applied to the MDEFT experiment, a particular version of the inversion recovery experiment. The implications of this acquisition scheme on the PSF are examined, and it is shown that, provided the k-space modulation function does not change sign, a good PSF is achieved. For a given maximum number of slices, the total experimental duration depends only on TR and the number of phase-encoding steps. A method of accelerating the experiment by multiply exciting each slice is described. An experimental demonstration of the proposed sequences is given by imaging the human head at 3 T. J. Magn. Reson. Imaging 2000;11:445–451.
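The slice-ordering idea can be illustrated with a toy schedule (a hypothetical sketch, not the published pulse sequence): rotating the slice order across inversion shots makes every slice occupy every post-inversion position exactly once, so each slice collects its k-space segments with the same timing pattern.

```python
import numpy as np

n_slices = 8  # the number of k-space segments equals the number of slices

# schedule[shot] lists the slices excited after that shot's inversion pulse;
# the j-th excitation in every shot fills k-space segment j of that slice.
schedule = [[(shot + pos) % n_slices for pos in range(n_slices)]
            for shot in range(n_slices)]

# Count how often each slice appears at each post-inversion position.
visits = np.zeros((n_slices, n_slices), dtype=int)
for order in schedule:
    for pos, sl in enumerate(order):
        visits[sl, pos] += 1

# A Latin-square schedule: every slice sees every position exactly once,
# which is the condition for identical contrast and PSF across slices.
uniform = bool((visits == 1).all())
```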
Article
In this paper, we present the concept of diffusing models to perform image-to-image matching. Having two images to match, the main idea is to consider the objects boundaries in one image as semi-permeable membranes and to let the other image, considered as a deformable grid model, diffuse through these interfaces, by the action of effectors situated within the membranes. We illustrate this concept by an analogy with Maxwell's demons. We show that this concept relates to more traditional ones, based on attraction, with an intermediate step being optical flow techniques. We use the concept of diffusing models to derive three different non-rigid matching algorithms, one using all the intensity levels in the static image, one using only contour points, and a last one operating on already segmented images. Finally, we present results with synthesized deformations and real medical images, with applications to heart motion tracking and three-dimensional inter-patients matching.
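A one-dimensional toy version of such a diffusion-driven ("demons") update can be sketched as follows (an illustrative reconstruction; the step size, smoothing kernel, and iteration count are arbitrary choices, not the paper's algorithm):

```python
import numpy as np

def demons_1d(fixed, moving, n_iter=200, step=1.0):
    """Toy 1-D demons-style matching: update a displacement field from
    the image difference and the fixed-image gradient, then smooth it."""
    x = np.arange(fixed.size, dtype=float)
    disp = np.zeros_like(fixed)
    grad = np.gradient(fixed)
    for _ in range(n_iter):
        warped = np.interp(x + disp, x, moving)   # resample the moving image
        diff = warped - fixed
        denom = grad**2 + diff**2                 # regularized denominator
        force = np.where(denom > 1e-8, -diff * grad / denom, 0.0)
        disp += step * force
        # Boxcar smoothing of the field plays the role of the diffusion step.
        disp = np.convolve(disp, np.ones(5) / 5.0, mode="same")
    return disp

x = np.linspace(0.0, 1.0, 100)
fixed = np.exp(-((x - 0.50) ** 2) / 0.01)    # Gaussian bump at 0.50
moving = np.exp(-((x - 0.55) ** 2) / 0.01)   # same bump, shifted
disp = demons_1d(fixed, moving)
warped = np.interp(np.arange(100.0) + disp, np.arange(100.0), moving)
```

After matching, resampling the moving image through the estimated displacement field brings it substantially closer to the fixed image.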
Article
The Stroop and Simon tasks typify a class of interference effects in which the introduction of task-irrelevant stimulus characteristics robustly slows reaction times. Behavioral studies have not succeeded in determining whether the neural basis for the resolution of these interference effects during successful task performance is similar or different across tasks. Event-related functional magnetic resonance imaging (fMRI) studies were obtained in 10 healthy young adults during performance of the Stroop and Simon tasks. Activation during the Stroop task replicated findings from two earlier fMRI studies. These activations were remarkably similar to those observed during the Simon task, and included anterior cingulate, supplementary motor, visual association, inferior temporal, inferior parietal, inferior frontal, and dorsolateral prefrontal cortices, as well as the caudate nuclei. The time courses of activation were also similar across tasks. Resolution of interference effects in the Simon and Stroop tasks engage similar brain regions, and with a similar time course. Therefore, despite the widely differing stimulus characteristics employed by these tasks, the neural systems that subserve successful task performance are likely to be similar as well.
Article
This study compared and contrasted semantic priming in the visual and auditory modalities using event-related brain potentials (ERPs) and behavioural measures (errors and reaction time). Subjects participated in two runs (one visual, one auditory) of a lexical decision task where stimuli were word pairs consisting of “prime” words followed by equal numbers of words semantically related to the primes, words unrelated to the primes, pseudo-words, and nonwords. Subjects made slower responses, made more errors, and their ERPs had larger negative components (N400) to unrelated words than to related words in both modalities. However, the ERP priming effect began earlier, was larger in size, and lasted longer in the auditory modality than in the visual modality. In addition, the lateral distribution of N400 over the scalp differed in the two modalities. It is suggested that there may be overlap in the priming processes that occur in each modality but that these processes are not identical. The results also demonstrated that the N400 component may be specifically responsive to language or potential language events.
Article
Dual functional brain asymmetry refers to the notion that in most individuals the left cerebral hemisphere is specialized for language functions, whereas the right cerebral hemisphere is more important than the left for the perception, construction, and recall of stimuli that are difficult to verbalize. In the last twenty years there have been scattered reports of sex differences in degree of hemispheric specialization. This review provides a critical framework within which two related topics are discussed: Do meaningful sex differences in verbal or spatial cerebral lateralization exist? and, if so, Is the brain of one sex more symmetrically organized than the other? Data gathered on right-handed adults are examined from clinical studies of patients with unilateral brain lesions; from dichotic listening, tachistoscopic, and sensorimotor studies of functional asymmetries in non-brain-damaged subjects; from anatomical and electrophysiological investigations, as well as from the developmental literature. Retrospective and descriptive findings predominate over prospective and experimental methodologies. Nevertheless, there is an impressive accumulation of evidence suggesting that the male brain may be more asymmetrically organized than the female brain, both for verbal and nonverbal functions. These trends are rarely found in childhood but are often significant in the mature organism.
Code
This corpus contains ASCII versions of the CELEX lexical databases of English (Version 2.5), Dutch (Version 3.1) and German (Version 2.0). CELEX was developed as a joint enterprise of the University of Nijmegen, the Institute for Dutch Lexicology in Leiden, the Max Planck Institute for Psycholinguistics in Nijmegen, and the Institute for Perception Research in Eindhoven. Pre-mastering and production was done by the LDC.
Article
Statistical parametric maps are spatially extended statistical processes that are used to test hypotheses about regionally specific effects in neuroimaging data. The most established sorts of statistical parametric maps (e.g., Friston et al. [1991]: J Cereb Blood Flow Metab 11:690–699; Worsley et al. [1992]: J Cereb Blood Flow Metab 12:900–918) are based on linear models, for example ANCOVA, correlation coefficients and t tests. In the sense that these examples are all special cases of the general linear model it should be possible to implement them (and many others) within a unified framework. We present here a general approach that accommodates most forms of experimental layout and ensuing analysis (designed experiments with fixed effects for factors, covariates and interaction of factors). This approach brings together two well established bodies of theory (the general linear model and the theory of Gaussian fields) to provide a complete and simple framework for the analysis of imaging data. The importance of this framework is twofold: (i) Conceptual and mathematical simplicity, in that the same small number of operational equations is used irrespective of the complexity of the experiment or nature of the statistical model and (ii) the generality of the framework provides for great latitude in experimental design and analysis.
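At a single voxel, the core of this framework reduces to ordinary least squares plus a t-contrast. The sketch below uses hypothetical simulated data (a real SPM analysis adds HRF convolution, filtering, and random-field corrections):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-voxel time series: 100 scans, boxcar task regressor.
n = 100
task = np.tile(np.repeat([0.0, 1.0], 10), 5)   # alternating off/on blocks
X = np.column_stack([task, np.ones(n)])        # design matrix: task + mean
y = X @ np.array([2.0, 10.0]) + rng.normal(0.0, 1.0, n)

# Ordinary least-squares parameter estimates (the general linear model).
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# t-statistic for the task effect via a contrast vector c.
c = np.array([1.0, 0.0])
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])      # error variance estimate
se = np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
t_stat = (c @ beta) / se
```

Running the same small set of equations at every voxel, then thresholding the resulting map of t-values, yields the statistical parametric map.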
Article
To study the effects of gender on ability to recognize facial expressions of emotion, two separate samples of male and female undergraduates (727 in Study 1, 399 in Study 2) judged 120 color photographs of people posing one of four negative emotions: anger, disgust, fear, and sadness. Overall, females exceeded males in their ability to recognize emotions whether expressed by males or by females. As an exception, males were superior to females in recognizing male anger. The findings are discussed in terms of social sex-roles.
Article
We present an approach to characterizing the differences among event-related hemodynamic responses in functional magnetic resonance imaging that are evoked by different sorts of stimuli. This approach is predicated on a linear convolution model and standard inferential statistics as employed by statistical parametric mapping. In particular we model evoked responses, and their differences, in terms of basis functions of the peri-stimulus time. This facilitates a characterization of the temporal response profiles that has a high effective temporal resolution relative to the repetition time. To demonstrate the technique we examined differential responses to visually presented words that had been seen prior to scanning or that were novel. The form of these differences involved both the magnitude and the latency of the response components. In this paper we focus on bilateral ventrolateral prefrontal responses that show deactivations for previously seen words and activations for novel words.
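The idea of capturing latency as well as magnitude with temporal basis functions can be sketched as follows (a hypothetical double-gamma response with illustrative parameter values, not the authors' exact basis set): adding the temporal derivative as a second basis function lets the model absorb small latency shifts.

```python
import numpy as np
from scipy.stats import gamma

def hrf(t):
    """Double-gamma hemodynamic response (illustrative parameters)."""
    return gamma.pdf(t, 6.0) - gamma.pdf(t, 16.0) / 6.0

t = np.arange(0.0, 30.0, 0.5)
h = hrf(t)
dh = np.gradient(h, t)        # temporal-derivative basis function

shifted = hrf(t - 1.0)        # the same response delayed by one second

# Fit the delayed response with the canonical shape alone,
# and with canonical shape plus derivative.
X1 = h[:, None]
X2 = np.column_stack([h, dh])
r1 = shifted - X1 @ np.linalg.lstsq(X1, shifted, rcond=None)[0]
r2 = shifted - X2 @ np.linalg.lstsq(X2, shifted, rcond=None)[0]
```

The residual of the two-basis fit is much smaller, because to first order a latency shift is exactly a scaled temporal derivative.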
Article
Three experiments are reported on visual field asymmetries in the perception of emotional expressions on the face. In experiment I full faces expressing six different emotions were presented unilaterally for exposure durations, allowing the subject to judge whether the facial expression was positive or negative. Right-handed subjects judged all expressions except happiness as more negative when presented in the left visual field (LVF). This effect was smaller for left-handers and was absent in left-handers who use the non-inverted writing posture. In experiment II subjects were presented with happy, sad and "mixed" chimeric faces, projected to each visual field, for durations allowing only the detection of the existence of a face. LVF presentations produced greater differential rating of emotional valence for the three types of stimuli. In experiment III chimeric faces containing happy and sad expressions were presented unilaterally for durations allowing the subject to perceive the existence of two expressions on the face. The subjects were required to decide whether the mood expressed in the face was predominantly negative or positive. RVF presentations resulted in a bias toward positive judgments. These results indicate right hemispheric superiority for the perception and processing of emotional valence and a left hemispheric perceptual bias toward positive aspects of emotional stimuli.
Article
In a sentence reading task, words that occurred out of context were associated with specific types of event-related brain potentials. Words that were physically aberrant (larger than normal) elicited a late positive series of potentials, whereas semantically inappropriate words elicited a late negative wave (N400). The N400 wave may be an electrophysiological sign of the "reprocessing" of semantically anomalous information.
Article
The contents of six neuropsychology journals (98 volumes, 368 issues) were screened to identify visual half-field (VHF) experiments. Of the 516 experiments identified, 42% provided information about sex differences. Sixty-eight experiments yielded a total of 92 sex differences, 23 of which met stringent criteria for sex differences in laterality. Of the 20 sex differences satisfying stringent criteria and lending themselves to interpretation in terms of the differential lateralization hypothesis, 17 supported the hypothesis of greater hemispheric specialization in males than in females. The 17 confirmatory outcomes represent 7.8% of the informative experiments. When less stringent criteria were invoked, 27 outcomes (12.3% of the informative experiments) were found to be consistent with the differential lateralization hypothesis. Six findings were contrary to the hypothesis. The results, which closely resemble results for auditory laterality studies, are compatible with a population-level sex difference that accounts for 1 to 2% of the variance in laterality.
Article
The entire contents of six neuropsychology journals (98 volumes, 368 issues) were screened to identify auditory laterality experiments. Of the 352 dichotic and monaural listening experiments identified, 40% provided information about sex differences. Among the 49 experiments that yielded at least one significant effect or interaction involving the sex factor, 11 outcomes met stringent criteria for sex differences in laterality. Of those 11 positive outcomes, 9 supported the hypothesis of greater hemispheric specialization in males than in females. The 9 confirmatory outcomes represent 6.4% of the informative experiments. When less stringent criteria were invoked, 21 outcomes (14.9% of the informative experiments) were found to be consistent with the differential lateralization hypothesis. The overall pattern of results is compatible with a weak population-level sex difference in hemispheric specialization.
Article
This article reviews the preliminary experiences and the results obtained on the human brain at 4 T at the University of Minnesota. Anatomical and functional images are presented. Contrary to initial expectations and the early results, it is possible to obtain high-resolution images of the human brain with exquisite T1 contrast, delineating structures especially in the basal ganglia and thalamus, which were not observed clearly in 1.5-T images until now. These 4-T images are possible using a new approach that achieves maximal contrast for different T1 values at approximately the same repetition time and has built-in tolerance to variations in B1 magnitude. For functional images, the high field provides increased contribution from the venules and the capillary bed because the susceptibility-induced alterations in 1/T2* from these small-diameter vessels increase quadratically with the magnitude of the main field. Images obtained with short echo times at 4 T, and by implication at lower fields with correspondingly longer echo times, are expected to be dominated by contributions from large venous vessels or in-flow effects from the large arteries; such images are undesirable because of their poor spatial correspondence with actual sites of neuronal activity.
Article
Defects in expressing or understanding the affective or emotional tone of speech (aprosodias) have been associated with right hemisphere dysfunction, while defects of propositional language have been linked to left hemisphere disease. The brain regions involved in recognition of emotional prosody in healthy subjects are less clear. To investigate the brain regions involved in understanding emotional prosody and to determine whether these differ from those involved in understanding emotion based on propositional content, we studied 13 healthy subjects using water labeled with radioactive oxygen 15 and positron emission tomography while they listened to 3 similar sets of spoken English sentences. In different tasks, their responses were based on the emotional propositional content, on the emotional intonation of the sentence (prosody), or on their ability to repeat the second word in the sentence (control). Understanding propositional content activated the prefrontal cortex bilaterally, on the left more than on the right. In contrast, responding to the emotional prosody activated the right prefrontal cortex. Neurologically healthy subjects activate right hemisphere regions during emotional prosody recognition.
Article
Friston et al. (1995, NeuroImage 2:45-53) presented a method for detecting activations in fMRI time-series based on the general linear model and a heuristic analysis of the effective degrees of freedom. In this communication we present corrected results that replace those of the previous paper and solve the same problem without recourse to heuristic arguments. Specifically we introduce a proper and unbiased estimator for the error terms and provide a more generally correct expression for the effective degrees of freedom. The previous estimates of error variance were biased and, in some instances, could have led to a 10-20% overestimate of Z values. Although the previous results are almost correct for the random regressors chosen for validation, the present theoretical results are exact for any covariate or waveform. We comment on some aspects of experimental design and data analysis, in the light of the theoretical framework discussed here.
Article
In the companion to this paper (E. Zarahn, G. K. Aguirre, and M. D'Esposito, 1997, NeuroImage, 179-197), we describe an implementation of a general linear model for autocorrelated observations in which the voxel-wise false-positive rates in fMRI "noise" datasets were stabilized and brought close to theoretical values. Here, implementations of the model are tested for use with statistical parametric mapping analysis of spatially smoothed fMRI data. Analyses using varying models of intrinsic temporal autocorrelation and either including or excluding a global signal covariate were conducted upon human subject data collected under null hypothesis as well as under experimental conditions. We found that smoothing with an empirically derived impulse response function (IRF), combined with a model of the intrinsic temporal autocorrelation in spatially smoothed fMRI data, resulted in a map-wise false-positive rate which did not exceed a 5% level when a nominal alpha = 0.05 tabular threshold was applied. Use of other models of intrinsic temporal autocorrelation resulted in map-wise false-positive rates that significantly exceeded this level. fMRI data collected while subjects performed a behavioral task were used to examine (a) task-dependent global signal changes and (b) the dependence of sensitivity on the temporal smoothing kernel and inclusion/exclusion of a global signal covariate. The global signal changes within an fMRI dataset were shown to be influenced by the performance of a behavioral task. However, the inclusion of this measure as a covariate did not have an adverse effect upon our measure of sensitivity. Finally, use of an empirically derived estimate of the IRF of the system was shown to result in greater map-wise sensitivity for signal changes than the use of a broader (in time) Poisson (parameter = 8 s) kernel.
Article
Temporal autocorrelation, spatial coherency, and their effects on voxel-wise parametric statistics were examined in BOLD fMRI null-hypothesis, or "noise," datasets. Seventeen normal, young subjects were scanned using BOLD fMRI while not performing any time-locked experimental behavior. Temporal autocorrelation in these datasets was described well by a 1/frequency relationship. Voxel-wise statistical analysis of these noise datasets which assumed independence (i.e., ignored temporal autocorrelation) rejected the null hypothesis at a higher rate than specified by the nominal alpha. Temporal smoothing in conjunction with the use of a modified general linear model (Worsley and Friston, 1995, NeuroImage 2: 173-182) brought the false-positive rate closer to the nominal alpha. It was also found that the noise fMRI datasets contain spatially coherent time signals. This observed spatial coherence could not be fully explained by a continuously differentiable spatial autocovariance function and was much greater for lower temporal frequencies. Its presence made voxel-wise test statistics in a given noise dataset dependent, and thus shifted their distributions to the right or left of 0. Inclusion of a "global signal" covariate in the general linear model reduced this dependence and consequently stabilized (i.e., reduced the variance of) dataset false-positive rates.
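The inflation of voxel-wise false-positive rates when temporal autocorrelation is ignored can be demonstrated with a small simulation (AR(1) noise serves as a crude stand-in for the 1/frequency structure reported here; all parameter values are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# AR(1) "noise" time series for 2000 independent voxels, 120 scans,
# with no time-locked experimental behavior (a null dataset).
n_scans, n_voxels, rho = 120, 2000, 0.5
noise = np.empty((n_scans, n_voxels))
noise[0] = rng.normal(size=n_voxels)
for i in range(1, n_scans):
    noise[i] = rho * noise[i - 1] + rng.normal(size=n_voxels)

# Correlate each voxel with a slow boxcar regressor, wrongly assuming
# temporally independent observations.
task = np.tile(np.repeat([-1.0, 1.0], 15), 4)
tc = task - task.mean()
r = (tc @ noise) / (np.linalg.norm(tc)
                    * np.linalg.norm(noise - noise.mean(axis=0), axis=0))
t_vals = r * np.sqrt((n_scans - 2) / (1.0 - r**2))
p_vals = 2.0 * stats.t.sf(np.abs(t_vals), df=n_scans - 2)

# Under the null this should be ~0.05; autocorrelation inflates it.
false_positive_rate = float((p_vals < 0.05).mean())
```

The low-frequency power of the noise overlaps the low-frequency task regressor, so the independence assumption understates the variance of the test statistic and the empirical false-positive rate exceeds the nominal alpha.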
Article
A novel method is presented for acquiring multislice T1-weighted images. The method utilizes non-slice-selective inversion pulses followed by a series of slice-selective excitations. k-space is divided into a number of segments equal to the number of slices. Successive segments of k-space are assigned to successive slice-selective pulses, and the order in which the slices are excited is manipulated to ensure that images of each slice have identical contrast and point spread function (PSF). This method is applied to the MDEFT experiment, a particular version of the inversion recovery experiment. The implications of this acquisition scheme on the PSF are examined, and it is shown that, provided the k-space modulation function does not change sign, a good PSF is achieved. For a given maximum number of slices, the total experimental duration depends only on TR and the number of phase-encoding steps. A method of accelerating the experiment by multiply exciting each slice is described. An experimental demonstration of the proposed sequences is given by imaging the human head at 3 T.
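The bookkeeping behind the segment-to-slice assignment can be shown schematically. This is a hypothetical sketch of the ordering logic only (the function name and the rotate-by-one scheme are assumptions, not the published pulse sequence): after each non-selective inversion ("shot"), k-space segment p is always filled at excitation position p, and the slice order is cyclically rotated between shots, so every slice acquires every segment exactly once and each segment is always sampled at the same delay after inversion, which is what gives all slices identical contrast and PSF.

```python
def mdeft_schedule(n_slices):
    """Schematic acquisition table, one row per excitation:
    (shot, position, slice, segment). The segment index equals the
    excitation position within the shot; the slice order is rotated
    by one slice per shot."""
    table = []
    for shot in range(n_slices):
        for pos in range(n_slices):
            slc = (pos + shot) % n_slices   # rotated slice order
            table.append((shot, pos, slc, pos))
    return table

schedule = mdeft_schedule(6)
# Every (slice, segment) pair is acquired exactly once ...
pairs = {(slc, seg) for _, _, slc, seg in schedule}
# ... and a given segment is always sampled at the same
# post-inversion position, matching contrast across slices.
delays = {seg: {pos for _, pos, _, s2 in schedule if s2 == seg}
          for seg in range(6)}
```

The two derived sets make the key property explicit: full coverage of all slice/segment combinations, with a fixed inversion-to-segment timing shared by every slice.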
Article
The Stroop interference task requires a person to respond to a specific dimension of a stimulus while suppressing a competing stimulus dimension. Previous PET and fMRI studies using the Color Stroop paradigm have shown increased activity in the "cognitive division" of the cingulate cortex. In our fMRI study with nine subjects, we used a Color-Word Matching Stroop task. A frontoparietal network, including structures in the lateral prefrontal cortex, the frontopolar region, the intraparietal sulcus, as well as the lateral occipitotemporal gyrus, was activated when contrasting the incongruent vs the neutral condition. However, no substantial activation in either the right or left hemisphere of the anterior cingulate cortex (ACC) was detected. In accordance with a series of recent articles, we argue that the ACC is not specifically involved in interference processes. Rather, the ACC seems to be involved in motor preparation processes, which were controlled for in the present Color-Word Matching Stroop task. We argue that the region around the banks of the inferior frontal sulcus is required to solve interference problems, a concept which can also be seen as a component of task set management.
Article
Prefrontal cortex plays a central role in mnemonic control, with left inferior prefrontal cortex (LIPC) mediating control of semantic knowledge. One prominent theory posits that LIPC does not mediate semantic retrieval per se, but rather subserves the selection of task-relevant knowledge from amidst competing knowledge. The present event-related fMRI study provides evidence for an alternative hypothesis: LIPC guides controlled semantic retrieval irrespective of whether retrieval requires selection against competing representations. With selection demands held constant, LIPC activation increased with semantic retrieval demands and with the level of control required during retrieval. LIPC mediates a top-down bias signal that is recruited to the extent that the recovery of meaning demands controlled retrieval. Selection may reflect a specific instantiation of this mechanism.
Article
This paper describes the non-commercial software system LIPSIA that was developed for the processing of functional magnetic resonance images (fMRI) of the human brain. The analysis of fMRI data comprises various aspects including filtering, spatial transformation, statistical evaluation as well as segmentation and visualization. In LIPSIA, particular emphasis was placed on the development of new visualization and segmentation techniques that support visualizations of individual brain anatomy so that experts can assess the exact location of activation patterns in individual brains. As the amount of data that must be handled is enormous, another important aspect in the development of LIPSIA was the efficiency of the software implementation. Well established statistical techniques were used whenever possible.
Article
While numerous studies have implicated both anterior cingulate and prefrontal cortex in attentional control, the nature of their involvement remains a source of debate. Here we determine the extent to which their relative involvement in attentional control depends upon the levels of processing at which the conflict occurs (e.g., response, non-response). Using a combination of blocked and rapid presentation event-related functional magnetic resonance imaging techniques, we compared neural activity during incongruent Stroop trial types that produce conflict at different levels of processing. Our data suggest that the involvement of anterior cingulate and right prefrontal cortex in attentional control is primarily limited to situations of response conflict, while the involvement of left prefrontal cortex extends to the occurrence of conflict at non-response levels.
Article
In this chapter, we discuss our research that reveals how attentional mechanisms can modulate activity of posterior brain regions responsible for processing the unattended attribute of a stimulus. To do so, we utilized fMRI to reveal patterns of regional brain activity for variants of the Stroop task that differ in the nature of the task-irrelevant stimulus attribute. In all variants, individuals had to identify the ink color in which an item was presented. To vary attentional demands, we manipulated whether or not the task-irrelevant information contained conflicting color information. The variants differed in whether the conflicting color information was contained in a word naming a color (e.g. the word 'red' in blue ink), a word naming an object highly associated with a specific color (e.g. the word 'frog' in red ink), or a line drawing of an object highly associated with a specific color (e.g. a drawing of a frog in red ink). When the unattended stimulus attribute contained color information that conflicted with an item's ink color, increased activity was observed in the posterior brain region that processes the aspect of the task-irrelevant attribute related to color. Increased activity was observed in the left precuneus and left superior parietal cortex when the conflicting information arose from a color word; in the middle temporal gyrus and insular cortex when the word named an object highly associated with a specific color; and in extensive early portions of the ventral visual processing stream when a line drawing was highly associated with a specific color. These areas have been implicated in word processing, semantic processing, and visual processing, respectively.
Our results suggest that attentional selection can occur by: (1) increasing the gain on all posterior regions responsible for processing information related to the task demands, regardless of whether that information is contained in the task-relevant or task-irrelevant dimension; (2) limiting the processing of task-irrelevant information in order to reduce interference; and (3) modulating the processing of representations varying from those of a low-level perceptual nature up through those of a higher-order semantic nature.
Article
Appreciation of the emotional tone of verbal utterances represents an important aspect of social life. It is still unsettled, however, which brain areas mediate processing of intonational information and whether the presumed right-sided superiority depends upon acoustic properties of the speech signal. Functional magnetic resonance imaging was used to disentangle brain activation associated with (i) extraction of specific acoustic cues and (ii) detection of specific emotional states. Stimulus material comprised pairs of emotionally intonated utterances, exclusively differing either in pitch range or in the length of stressed vowels. Hemodynamic responses showed a dynamic pattern of cerebral activation including sequenced bilateral responses of various cortical and subcortical structures. Activation associated with discrimination of emotional expressiveness predominantly emerged within the right inferior parietal lobule, within the bilateral mesiofrontal cortex and--with an asymmetry toward the right hemisphere--at the level of bilateral dorsolateral frontal cortex. Lateralization did not depend upon acoustic structure or emotional valence of stimuli. These findings might prove helpful in reconciling the controversial previous clinical and experimental data.
Article
The Stroop and Simon tasks typify a class of interference effects in which the introduction of task-irrelevant stimulus characteristics robustly slows reaction times. Behavioral studies have not succeeded in determining whether the neural basis for the resolution of these interference effects during successful task performance is similar or different across tasks. Event-related functional magnetic resonance imaging (fMRI) data were obtained in 10 healthy young adults during performance of the Stroop and Simon tasks. Activation during the Stroop task replicated findings from two earlier fMRI studies. These activations were remarkably similar to those observed during the Simon task, and included anterior cingulate, supplementary motor, visual association, inferior temporal, inferior parietal, inferior frontal, and dorsolateral prefrontal cortices, as well as the caudate nuclei. The time courses of activation were also similar across tasks. Resolution of interference effects in the Simon and Stroop tasks engage similar brain regions, and with a similar time course. Therefore, despite the widely differing stimulus characteristics employed by these tasks, the neural systems that subserve successful task performance are likely to be similar as well.
Article
The meaning of a speech stream is communicated by more than the particular words used by the speaker. For example, speech melody, referred to as prosody, also contributes to meaning. In a cross-modal priming study we investigated the influence of emotional prosody on the processing of visually presented positive and negative target words. The results indicate that emotional prosody modulates word processing and that the time-course of this modulation differs for males and females. Women show behavioural and electrophysiological priming effects even with a short interval between the prosodic prime and the visual target word. In men, however, similar effects of emotional prosody on word processing occur only for a longer interval between prime and target. This indicates that women make earlier use of emotional prosody during word processing than men do.
Article
Previous neuroimaging studies of the Stroop task have postulated that the anterior cingulate cortex (ACC) plays a critical role in resolution of the Stroop interference condition. However, activation of the ACC is not invariably seen and appears to depend on a variety of methodological factors, including the degree of response conflict and response expectancies. The present functional MRI study was designed to identify those brain areas critically involved in the interference condition. Healthy subjects underwent a blocked-trial design fMRI experiment while responding to 1 of 3 stimulus conditions: (1) incongruent color words, (2) congruent color words, and (3) color-neutral words. Subjects responded to the printed color of the word via a manual response. Compared to the congruent and neutral conditions, the incongruent condition produced significant activation within the left inferior precentral sulcus (IpreCS) located on the border between the inferior frontal gyrus, pars opercularis (BA 44) and the ventral premotor region (BA 6). Significant deactivations in the rostral component of the ACC and the posterior cingulate gyrus were also observed. Selective activation of the left IpreCS is compatible with findings from previous neuroimaging, lesion, electrophysiological, and behavioral studies and is presumably related to the mediation of competing articulatory demands during the interference condition.
Article
We report a random-effects analysis of an event-related fMRI study (n = 28) of cerebral activity during the reading of sentences that ended with a word that was either congruent or incongruent with the previous sentence context. Event-related potential studies have shown that this task elicits a late negativity peaking around 400 ms poststimulus (N400) that is larger for incongruent than for congruent sentence endings. A direct comparison of the activation for incongruent words versus that for congruent words revealed significantly greater activation for incongruent words than congruent words in bilateral inferior frontal and inferomedial temporal cortex, left lateral frontal cortex, left posterior fusiform gyrus, bilateral motor cortex, and supplementary motor area. These results are consistent with data from intracranial electrical recording studies of the N400 electrical potential. The results are discussed as they relate to the localization of the cerebral sites underlying semantic processing in general and the localization of the scalp recorded N400 event-related potential in particular.
Article
Though lesions to frontal cortex can increase susceptibility to interference from previously established but irrelevant memories ("proactive interference"), the specific regions underlying this problem are difficult to determine because the lesions are typically large and heterogeneous. We used event-related functional magnetic resonance imaging to investigate proactive interference in healthy volunteers performing an "AB-AC" paired-associate cued-recall paradigm. At Study, participants intentionally encoded semantically related visual word pairs, which were changed three times (high interference), repeated three times (low interference), or presented only once. At Test, participants were presented with the first word of each pair and attempted to recall its most recent associate from the Study phase. To overcome the problem of image artifacts caused by speech-related head motion, we cued speech during a gap between image acquisitions. Regions in left inferior frontal cortex and bilateral frontopolar cortex showed interference effects during both Study and Test. The pattern of responses in these regions differed, however. Left inferior frontal regions showed mainly reduced responses associated with low interference, whereas frontopolar regions showed mainly increased responses associated with high interference. When incorrect as well as correct trials were analyzed at Test, additional activation associated with high interference was observed in right dorsolateral prefrontal cortex. These data suggest that distinct regions within prefrontal cortex subserve different functions in the presence of proactive interference during cued recall.
Article
For nearly two decades, functional neuroimaging studies have attempted to shed light on questions about the representation, organization, and retrieval of semantic knowledge. This review examines some of the major findings in this area. For example, functional neuroimaging studies have examined the extent to which there is a unitary semantic system or a series of multiple semantic subsystems organized by input modality, knowledge attribute, and/or taxonomic category. Additionally, functional neuroimaging studies have investigated the contributions of frontal cortex to semantic retrieval and selection. Collectively, these studies demonstrate that functional neuroimaging can offer more than neuroanatomical localization information; in addition, these studies offer new insights into longstanding questions about semantic memory.