Fig. 1
Magnetic resonance images showing encephalomalacia and gliosis in the left temporal region involving the planum polare, the transverse temporal (Heschl's) gyrus, and the planum temporale and subadjacent white matter. Abnormalities extended in the lateromedial axis from the dorsal surface of the superior temporal gyrus to the insula. This study was obtained 4 months post-onset.  

Source publication
Article
Full-text available
Pure word deafness (PWD) is a rare neurological syndrome characterized by severe difficulties in understanding and reproducing spoken language, with sparing of written language comprehension and speech production. The pathognomonic disturbance of auditory comprehension appears to be associated with a breakdown in processes involved in mapping audit...

Context in source publication

Context 1
... density, T1-, and T2-weighted magnetic resonance images were obtained using a 1.5 Tesla scanner (see Fig. 1). These scans revealed encephalomalacia and gliosis involving the dorsal surface of the left temporal lobe extending into subadjacent white matter. This included the planum polare, the transverse temporal (Heschl's) gyrus, and the planum temporale. In the lateromedial plane, the abnormalities extended from the dorsal convexity of ...

Similar publications

Article
Full-text available
Background/aims: We have demonstrated previously that bipolar electrocoagulation on functional cortex (BCFC) is a safe and effective approach for epilepsy involving eloquent areas. Here, we report the results of BCFC with lesionectomy for patients with epileptogenic foci partially overlapping eloquent areas. Methods: Forty patients who had been...
Article
Full-text available
A writing treatment protocol was designed for a 75-year-old man with severe Wernicke's aphasia. Four treatment phases were implemented: (1) a multiple baseline design that documented improvement in single-word writing for targeted words; (2) a clinician-directed home program that increased the corpus of correctly-spelled single words; (3) another m...
Article
Full-text available
Survivors of either a hemorrhagic or ischemic stroke tend to acquire aphasia and experience spontaneous recovery during the first six months. Nevertheless, a considerable number of patients sustain aphasia and require speech and language therapy to overcome the difficulties. As a preliminary study, this article aims to distinguish aphasia caused fr...
Article
Full-text available
Intracranial pure arterial malformation (PAM) is a rare condition defined by the presence of tortuous and dilated arteries forming an arterial mass with a coil-like appearance, devoid of any venous component, as described by McLaughlin et al. [1]. The reported incidence of PAM is low with only a limited number of cases documented in the literature. T...
Article
Full-text available
Background: Recently, verb–noun processing differences were reported in a group of late bilingual speakers with fluent, anomic aphasia in Greek (L1) as well as in English (L2) (Kambanaros & van Steenbrugge, 2006). The findings revealed that verb production was significantly more impaired than noun production in both languages during picture naming...

Citations

... Impairment of auditory phonological recognition associated with dysfunction of the auditory pathway from the brain stem to the cerebral cortex results in word deafness. Word deafness, the inability to process auditory speech input, has been mostly reported in patients with focal brain lesions, such as stroke and tumours (Auerbach, Allard, Naeser, Alexander, & Albert, 1982; Joswig, Schönenberger, Brügge, Richter, & Surbeck, 2015; Saffran, Marin, & Yeni-Komshian, 1976; Shivashankar, Shashikala, Nagaraja, Jayakumar, & Ratnavalli, 2001; Stefanatos, Gershkoff, & Madigan, 2005; Tanaka, Yamadori, & Mori, 1987). Word deafness was more frequent in patients with bilateral damage to the auditory pathway than in those with unilateral damage. ...
... Auditory perceptual impairments in the processing of rapid temporal changes in sound have been considered to lead to impaired speech perception, particularly consonant perception. Patients with word deafness due to focal lesions have difficulty perceiving words temporally (Auerbach et al., 1982; Poeppel, 2001; Slevc, Martin, Hamilton, & Joanisse, 2011; Stefanatos et al., 2005; Tanji et al., 2003). Further investigation of temporal auditory perception in PPA will clarify the relationship between temporal processing and phonological perception. ...
... A dual-stream model of the functional anatomy of language indicates that the bilateral dorsal surface of the superior temporal gyrus is involved in spectrotemporal analysis, whereas the bilateral middle to posterior portions of the superior temporal gyrus are involved in phonological processing (Hickok & Poeppel, 2007). Word deafness in patients with stroke and epilepsy has been reported to involve focal lesions in the bilateral or left superior temporal gyrus (Maffei et al., 2017; Poeppel, 2001; Shijo et al., 2014; Slevc et al., 2011; Stefanatos et al., 2005). Patients with slowly progressive word deafness or auditory agnosia due to neurodegenerative diseases, including PPA, show brain atrophy or reduced rCBF/glucose metabolism in the bilateral superior temporal gyrus, predominantly on the left (Kawakami et al., 2022; Otsuki et al., 1998; Watanabe et al., 2020). ...
... These domains were underpinned by dissociable pLTC regions consistent with previous meta-analytic, theoretical, and neuropsychological work (Bates et al. 2003; Hickok and Poeppel 2004, 2007; Rogers et al. 2004; Vigneau et al. 2006; Binder et al. 2009; Friederici 2009, 2011; Noonan et al. 2013; Mesulam et al. 2014; Jackson 2020). A discrete region of bilateral STG was implicated in phonology; this region is associated with processing speech sounds (Binder et al. 2000; Scott et al. 2000) and damage here is linked to conduction aphasia (Damasio and Damasio 1980; Hickok 2000; Buchsbaum et al. 2011) and pure word deafness (Poeppel 2001; Stefanatos et al. 2005). In contrast, a swathe of the left pSTS, extending dorsally to the ventral edge of AG, was implicated in general semantic processing (due to its involvement in semantics, yet not semantic control), which overlapped only minimally with phonology; further, in both hemispheres, subtraction analyses indicate that the STG shows significantly greater activation likelihood for phonology than semantics. ...
Article
Full-text available
The posterior lateral temporal cortex is implicated in many verbal, nonverbal, and social cognitive domains and processes. Yet without directly comparing these disparate domains, the region’s organization remains unclear; do distinct processes engage discrete subregions, or could different domains engage shared neural correlates and processes? Here, using activation likelihood estimation meta-analyses, the bilateral posterior lateral temporal cortex subregions engaged in 7 domains were directly compared. These domains comprised semantics, semantic control, phonology, biological motion, face processing, theory of mind, and representation of tools. Although phonology and biological motion were predominantly associated with distinct regions, other domains implicated overlapping areas, perhaps due to shared underlying processes. Theory of mind recruited regions implicated in semantic representation, tools engaged semantic control areas, and faces engaged subregions for biological motion and theory of mind. This cross-domain approach provides insight into how posterior lateral temporal cortex is organized and why.
... It is mostly documented in adults with acute stroke or head trauma but has also been reported in children with Landau-Kleffner syndrome (Baynes et al., 1998) and brain tumor (Chou et al., 2011). Literature also reveals cases of PWD caused by osmotic demyelination syndromes (Garde & Cowey, 1998; Zhu et al., 2010), neurodegenerative diseases (Kim et al., 2011; Lizuka et al., 2007; Otsuki et al., 1998), multiple sclerosis (Tabira et al., 1981; Jónsdóttir et al., 1998), encephalitis (Goldstein, 1974; Arias et al., 1995; Kasselimis et al., 2017), seizures (Fung et al., 2000; Stefanatos et al., 2005), mitochondrial encephalomyopathy (Miceli et al., 2008), Creutzfeldt-Jakob disease (Hillis & Selnes, 1999; Tobias et al., 1994), and drug toxicity (Donaldson et al., 1981). ...
... Patients' subjective descriptions are intriguing and seem to also vary in published reports suggesting a continuum of severity (Stefanatos et al., 2005). Some case studies highlight marked impedance in recognizing the acoustic characteristics of a human voice while others have alluded to subtler difficulties mapping the acoustic features of speech to lexical representations (Stefanatos et al., 2005). Historically, speech has been described as "a great noise all the time . . . ...
Article
Here we present the case of a native Greek male patient who presented clinically with sudden-onset pure word deafness after an ischemic stroke in the temporoparietal region of the right hemisphere, but who had suffered an ischemic stroke 9 years previously in an adjacent area of the left hemisphere, causing aphasic symptoms that resolved quickly and almost completely. What makes this case interesting and novel is that it is the first to describe a patient whose ventral language comprehension circuit failed to reorganize successfully because of subsequent damage to the adjacent right-hemisphere language areas.
... (Hagoort et al., 1991; Simons & Ralph, 2008; Simons, Sambeth et al., 2011; Slevc & Shell, 2015; Szirmai, Farsang et al., 2003; Tanaka, Kamo, Yoshida, & Yamadori, 1991; Tanaka, Yamadori, & Mori, 1987; Verma & Post, 2013). There are also case reports of AA (more or less verbal AA) caused by isolated left temporal lobe lesions (Clarke, Bellmann et al., 2000; Gazzaniga, Glass, Sarno, & Posner, 1973; Maffei et al., 2017; Takahashi et al., 1992; Nakakoshi, Kashino et al., 2001; Palma, Lamet et al., 2012; Poeppel, 2001; Saygin, Leech et al., 2010; Slevc, Martin et al., 2011; Stefanatos, Gershkoff et al., 2005), right temporal cortico-subcortical lesions (usually AA for non-verbal sounds, especially music) (Baird, Walker et al., 2014; Griffiths, Rees et al., 1997; Mazzucchi, Marchini et al., 1982; Mendez, 2001; Poeppel, 2001; Steinke, Cuddy et al., 2001; Yamamoto, Kikuchi et al., 2004), purely subcortical lesions unilaterally (Hayashi & Hayashi, 2007; Suh, Shin et al., 2012; Tanaka, Nakano et al., 2002) or bilaterally (Shivashankar, Shashikala, Nagaraja, Jayakumar, & Ratnavalli, 2001; Sugiura & Torii, 2017; Tanaka, Kamo, Yoshida, & Yamadori, 1991; Taniwaki, Tagawa, Sato, & Iino, 2000; Tokida et al., 2017), and even brainstem lesions (Chou, Liao et al., 2011; Joswig, Schonenberger et al., 2015; Kimiskidis, Lalaki et al., 2004; Pan, Kuo et al., 2004; Poliva, Bestelmeyer et al., 2015). If a left-sided lesion was sufficient for clinical expression of generalised AA, one explanation could lie in the subcortical extension of the left-sided lesion, affecting the interhemispheric fibres responsible for the transmission of auditory information from the right to the left hemisphere (Geschwind, 1965; Suh, Shin et al., 2012; Zatorre, Evans et al., 1992). ...
... Among 63 single patients with verbal AA and sufficiently detailed lesion data, almost 70% had bilateral lesions (Slevc & Shell, 2015). In rare cases, verbal AA has been attributed to left hemispheric lesions (Clarke, Bellmann, Meuli, Assal, & Steck, 2000; Maffei et al., 2017; Nakakoshi, Kashino, Mizobuchi, Fukada, & Katori, 2001; Palma, Lamet, Riverol, & Martinez-Lage, 2012; Poeppel, 2001; Saygin, Leech, & Dick, 2010; Slevc, Martin, Hamilton, & Joanisse, 2011; Stefanatos, Gershkoff, & Madigan, 2005; Takahashi, Kawamura et al., 1992), presumably due to disconnection of Wernicke's area from auditory input. There is now considerable evidence, based on functional neuroimaging data of speech processing in healthy individuals, that the early stages of speech processing and speech recognition are bilaterally organised (Hickok & Poeppel, 2007). ...
... d) Pattern of recovery of AA: The symptoms in our patients evolved from generalized AA and followed a distinct pattern of recovery which, to the best of our knowledge, has not been described in similar cases (Bahls, Chatrian et al., 1988; Brody, Nicholas et al., 2013; Chermak GD, 1997; Clark WLG, 1938; Graham, Greenwood et al., 1980; Tanaka, Yamadori et al., 1987). When generalized AA or CD improve, environmental sound agnosia and amusia are usually the first to resolve, with patients retaining verbal agnosia (Bauer & Zawacki, 2001; Engelien et al., 1995; Mendez, 2001; Poeppel, 2001; Stefanatos, Gershkoff, & Madigan, 2005). Also, amusia is usually the least prominent deficit among the three forms of AA, although this might partially be because it goes unrecognised (Slevc & Shell, 2015). ...
Article
A 66-year-old female medical doctor suffered two consecutive cardioembolic strokes, initially affecting the right frontal lobe and the right insula, followed by a lesion in the left temporal lobe. The patient presented with a distinctive phenomenology of generalized auditory agnosia with anosognosia for the deficit. She did not understand verbal commands, and her answers to oral questions were fluent but unrelated to the topic. However, she was able to correctly answer written questions, name objects, and fluently describe their purpose, which is characteristic of verbal auditory agnosia. She was also unable to recognise environmental sounds or to recognise and repeat any melody, suggestive of environmental sound agnosia and amusia, respectively. Surprisingly, she was not aware of these problems: she did not ask any questions regarding her symptoms and avoided discussing her inability to understand spoken language, which is indicative of anosognosia. The deficits in our patient evolved from generalized AA with a distinct pattern of recovery. The verbal auditory agnosia was the first to resolve, followed by environmental sound agnosia; amusia persisted the longest. The patient was clinically assessed from the first day of symptom onset, and the evolution of symptoms was video documented. We give a detailed account of the patient's behaviour and provide results of audiological and neuropsychological evaluations. We discuss the anatomy of auditory agnosia and anosognosia relevant to the case. This case study may serve to better understand auditory agnosia in clinical settings. It is important to distinguish AA from Wernicke's aphasia, because use of written language may enable normal communication.
... Pure word deafness typically arises from bilateral temporal lesions; one such patient was reported to be able to learn over 100 signs of American Sign Language and communicate via visual language modalities (Kirshner & Webb, 1981). Cases with unilateral temporal lesions have also been reported (Stefanatos, Gershkoff, & Madigan, 2005; Takahashi et al., 1992). Geschwind interpreted pure word deafness as a disconnection syndrome, in which both primary auditory cortices ("Heschl's gyrus," part of the superior temporal gyrus) were cut off from Wernicke's area, such that sounds could be heard but not processed as language. ...
Chapter
This chapter is organized along the neurobehavioral features of the major stroke syndromes in relation to the major vascular territories. We describe the major neurobehavioral deficits of each syndrome and associate them with specific distributions and pathological mechanisms of ischemic and hemorrhagic strokes. This chapter will consider intracerebral hemorrhages, but syndromes related to subarachnoid hemorrhage, aneurysms, and arteriovenous malformations will be discussed in Chaps. 4 and 5. It will be apparent that often specific cognitive disturbances are not restricted to injury to one particular vascular territory. For convenience, we will discuss various cognitive disturbances in relation to the vascular territories most commonly associated with these deficits.
... We assessed our patient's ability to recognize non-verbal sounds. Twenty environmental audio recordings consisting of the following four sound categories were presented to both ears: human non-verbal (e.g., baby crying), manmade inanimate (e.g., running water), non-human animate (e.g., dog barking), and natural inanimate (e.g., wind) (12,13). After hearing each sound, she was asked to name the environmental sound. ...
... After the naming task, the patient was asked to match one of the four pictures to a presented sound (12,13). The four controls easily identified the correct answers and achieved a common score of 20.0. ...
Article
Full-text available
Cortical neurodegeneration-induced non-fluent/agrammatic variant of primary progressive aphasia (nfvPPA) is a clinical syndrome characterized by non-fluent speech, such as apraxia of speech or agrammatism. We describe the case of an 80-year-old right-handed woman who exhibited nfvPPA. Atypically, our patient also presented with generalized auditory agnosia. Brain magnetic resonance imaging revealed left-sided predominant atrophy of the bilateral perisylvian area, including the inferior frontal and superior temporal lobes. In a series of auditory tasks assessing generalized auditory agnosia, our patient was unable to accurately identify verbal sounds, environmental sounds, or familiar Japanese songs that she could sing. In the context of recent studies, our study indicates the existence of a clinical syndrome characterized by progressive speech disorder with auditory agnosia. This case report thus provides novel insights into the spectrum of language impairment induced by neurodegenerative disease.
... 36,37 Damage to projections from the secondary auditory cortex into the posterior Sylvian fissure results in pure word deafness with preserved nonspeech hearing. 38,39 In the language-dominant hemisphere, the area receiving these projections, area Spt, plays a role in sensorimotor integration as the beginning of the dorsal language stream. Area Spt is located in the Wernicke area (Fig 1); however, damage to area Spt usually results in conduction aphasia. ...
Article
Functional MR imaging is being performed with increasing frequency in the typical neuroradiology practice; however, many readers of these studies have only a limited knowledge of the functional anatomy of the brain. This text will delineate the locations, anatomic boundaries, and functions of the cortical regions of the brain most commonly encountered in clinical practice: specifically, the regions involved in movement and language.
... It is possible that the presence of the viseme effect in the N1 window allowed us to observe predictive processing more closely (through perturbing the predictive system, so to speak), and that this is why effects of sentence context on predicting word form became evident only in conjunction with the viseme effect. This finding adds to a growing body of research that demonstrates an organization of language processing within the ventral stream which is bilateral, but with a hemispheric asymmetry in activation (65, 66). Behavioural research also supports the idea that the left and right hemisphere both process visual information, but in slightly different manners. ...
Article
Full-text available
In language comprehension, a variety of contextual cues act in unison to render upcoming words more or less predictable. As a sentence unfolds, we use prior context (sentential constraints) to predict what the next words might be. Additionally, in a conversation, we can predict upcoming sounds through observing the mouth movements of a speaker (visual constraints). In electrophysiological studies, effects of visual constraints have typically been observed early in language processing, while effects of sentential constraints have typically been observed later. We hypothesized that the visual and the sentential constraints might feed into the same predictive process such that effects of sentential constraints might also be detectable early in language processing through modulations of the early effects of visual salience. We presented participants with audiovisual speech while recording their brain activity with magnetoencephalography. Participants saw videos of a person saying sentences where the last word was either sententially constrained or not, and began with a salient or non-salient mouth movement. We found that sentential constraints indeed exerted an early (N1) influence on language processing. Sentential modulations of the N1 visual predictability effect were visible in brain areas associated with semantic processing, and were differently expressed in the two hemispheres. In the left hemisphere, visual and sentential constraints jointly suppressed the auditory evoked field, while the right hemisphere was sensitive to visual constraints only in the absence of strong sentential constraints. These results suggest that sentential and visual constraints can jointly influence even very early stages of audiovisual speech comprehension.
... We did not expect to find a hemispheric interaction; however, the topography of the interaction appears convincingly symmetric, suggesting that it is constrained to the same brain area. This adds to a growing body of research that demonstrates an organization of language processing within the ventral stream which is bilateral, but with a hemispheric asymmetry in activation (Hickok & Poeppel, 2007; Stefanatos, Gershkoff, & Madigan, 2005). Behavioural research also supports the idea that the left and right hemisphere both process visual information, but in slightly different manners. ...
Preprint
Full-text available
In language comprehension, a variety of contextual cues act in unison to render upcoming words more or less predictable. As a sentence unfolds, we use prior context (sentential constraints) to predict what the next words might be. Additionally, in a conversation, we can predict upcoming sounds through observing the mouth movements of a speaker (visual constraints). In electrophysiological studies, effects of visual salience have typically been observed early in language processing, while effects of sentential constraints have typically been observed later. We hypothesized that the visual and the sentential constraints might feed into the same predictive process such that effects of sentential constraints might also be detectable early in language processing through modulations of the early effects of visual salience. We presented participants with audiovisual speech while recording their brain activity with magnetoencephalography. Participants saw videos of a person saying sentences where the last word was either sententially constrained or not, and began with a salient or non-salient mouth movement. We found that sentential constraints indeed exerted an early (N1) influence on language processing. Sentential modulations of the N1 visual predictability effect were visible in brain areas associated with semantic processing, and were differently expressed in the two hemispheres. In the left hemisphere, visual and sentential constraints jointly suppressed the auditory evoked field, while the right hemisphere was sensitive to visual constraints only in the absence of strong sentential constraints. These results suggest that sentential and visual constraints can jointly influence even very early stages of audiovisual speech comprehension.