Karin Petrini
University of Bath · Department of Psychology

PhD

About

101 Publications · 11,642 Reads
1,106 Citations

Publications (101)
Article
Research has shown that children on the autism spectrum and adults with high levels of autistic traits are less sensitive to audiovisual asynchrony compared to their neurotypical peers. However, this evidence has been limited to simultaneity judgments (SJ) which require participants to consider the timing of two cues together. Given evidence of par...
Article
Full-text available
With technological developments, individuals are increasingly able to delegate tasks to autonomous agents that act on their behalf. This may cause individuals to behave more fairly, as involving an agent representative encourages individuals to strategise ahead and therefore adhere to social norms of fairness. Research suggests that an audio smilin...
Article
Full-text available
Research has shown that high trait anxiety can alter multisensory processing of threat cues (by amplifying integration of angry faces and voices); however, it remains unknown whether differences in multisensory processing play a role in the psychological response to trauma. This study examined the relationship between multisensory emot...
Article
Full-text available
Attention is the ability to actively process specific information within one’s environment over longer periods of time while disregarding other details. Attention is an important process that contributes to overall cognitive performance, from performing everyday basic tasks to complex work activities. The use of virtual reality (VR) allows study of...
Article
Full-text available
Users’ emotions may influence the formation of presence in virtual reality (VR). Users’ expectations, state of arousal and personality may also moderate the relationship between emotions and presence. An interoceptive predictive coding model of conscious presence (IPCM) considers presence as a product of the match between predictions of interocepti...
Article
Full-text available
Reading music notation is not an easy skill to acquire and can take years of training to master. In addition, it is not strictly necessary to be able to read music to make music. Nevertheless, music teaching and learning in the West has traditionally centered around the skill of reading music. This study explored participants’ reasons for learning...
Article
Full-text available
Music involves different senses and is emotional in nature, and musicians show enhanced detection of audio-visual temporal discrepancies and emotion recognition compared to non-musicians. However, whether musical training produces these enhanced abilities or if they are innate within musicians remains unclear. Thirty-one adult participants were ran...
Article
Full-text available
Assistive technology (AT) devices are designed to help people with visual impairments (PVIs) perform activities that would otherwise be difficult or impossible. Devices specifically designed to assist PVIs by attempting to restore sight or substitute it with another sense have a very low uptake rate. This study, conducted in England, aimed to invest...
Article
Full-text available
The tongue is an incredibly complex sensory organ, yet little is known about its tactile capacities compared to the hands. In particular, the tongue receives almost no visual input during development and so may be calibrated differently compared to other tactile senses for spatial tasks. Using a cueing task, via an electro-tactile display, we exami...
Preprint
Perception of distance in virtual reality (VR) is compressed; that is, objects and the distance between them and the observer are consistently perceived as closer than intended by the designers of the VR environment. Although well documented, this phenomenon is still not fully understood or defined with respect to the factors influencing such compr...
Article
Background: Mild cognitive impairment (MCI) and dementia result in cognitive decline which can negatively impact everyday functional abilities and quality of life. Virtual reality (VR) interventions could benefit the cognitive abilities of people with MCI and dementia, but evidence is inconclusive. Objective: To investigate the efficacy of VR tr...
Article
Full-text available
Individuals are increasingly relying on GPS devices to orient and find their way in their environment and research has pointed to a negative impact of navigational systems on spatial memory. We used immersive virtual reality to examine whether an audio–visual navigational aid can counteract the negative impact of visual only or auditory only GPS sy...
Preprint
Full-text available
Individuals are increasingly relying on GPS devices to orient and find their way in their environment and research has pointed to a negative impact of navigational systems on spatial memory. We used immersive virtual reality to examine whether an audio-visual navigational aid can counteract the negative impact of visual only or auditory only GPS sy...
Article
Full-text available
People with blindness and visual impairments have reduced access to exercise compared to the general population during typical societal functioning. The COVID-19 pandemic completely disrupted daily life for most individuals worldwide, and in the United Kingdom, a stay-at-home order was enforced. One of the few reasons an individual could lea...
Article
Background: Emotion perception is essential to human interaction and relies on effective integration of emotional cues across sensory modalities. Despite initial evidence for anxiety-related biases in multisensory processing of emotional information, there is no research to date that directly addresses whether the mechanism of multisensory integra...
Article
In everyday life, information from multiple senses is integrated for a holistic understanding of emotion. Despite evidence of atypical multisensory perception in populations with socio-emotional difficulties (e.g., autistic individuals), little research to date has examined how anxiety impacts on multisensory emotion perception. Here we examined wh...
Article
Full-text available
Previous studies have revealed that attention and inhibition are impaired in individuals with elevated symptoms of depression and anxiety. Virtual reality (VR)-based neuropsychological assessment may be a valid instrument for assessing attention and inhibition given its higher ecological validity when compared to classical tests. However, it is sti...
Article
Objectives: This is a protocol for a Cochrane Review (intervention). The objectives are as follows: To assess the effects of exergame applications on physical and cognitive outcomes, and activities of daily living (ADL), in people with dementia and mild cognitive impairment (MCI). Copyright © 2021 The Cochrane Collaboration. Published by John Wile...
Article
During self-guided movements, we optimise performance by combining sensory and self-motion cues optimally, based on their reliability. Discrepancies between such cues and problems in combining them are suggested to underlie some pain conditions. Therefore, we examined whether visuomotor integration is altered in twenty-two participants with upper o...
Article
Full-text available
The brain's ability to integrate information from the different senses is essential for decreasing sensory uncertainty and ultimately limiting errors. Temporal correspondence is one of the key processes that determines whether information from different senses will be integrated and is influenced by both experience- and task-dependent mechanisms in...
Article
Full-text available
Human adults can optimally combine vision with self-motion to facilitate navigation. In the absence of visual input (e.g., dark environments and visual impairments), sensory substitution devices (SSDs), such as The vOICe or BrainPort, which translate visual information into auditory or tactile information, could be used to increase navigation preci...
Article
Full-text available
Integrating different senses to reduce sensory uncertainty and increase perceptual precision can have an important compensatory function for individuals with visual impairment and blindness. However, how visual impairment and blindness impact the development of optimal multisensory integration in the remaining senses is currently unknown. Here we f...
Article
Full-text available
Music expertise has been shown to enhance emotion recognition from speech prosody. Yet, it is currently unclear whether music training enhances the recognition of emotions through other communicative modalities such as vision and whether it enhances the feeling of such emotions. Musicians and nonmusicians were presented with visual, auditory, and a...
Preprint
Full-text available
Integrating different senses to reduce sensory uncertainty and increase perceptual precision can have an important compensatory function for individuals with visual impairment and blindness. However, how visual impairment and blindness impact the development of optimal multisensory integration in the remaining senses is currently unknown. Here we f...
Article
In order to increase perceptual precision, the adult brain dynamically combines redundant information from different senses depending on their reliability. During object size estimation, for example, visual, auditory and haptic information can be integrated to increase the precision of the final size estimate. Young children, however, do not integra...
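The "reliability-dependent combination" referred to in this and several related abstracts is usually formalised as maximum-likelihood cue integration. As a reference point, this is the standard textbook expression (e.g. Ernst & Banks, 2002), not an equation quoted from the paper itself:

\[
\hat{S} = \sum_i w_i \hat{S}_i, \qquad
w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2}, \qquad
\sigma_{\hat{S}}^{2} = \left(\sum_i \frac{1}{\sigma_i^{2}}\right)^{-1},
\]

where \(\hat{S}_i\) and \(\sigma_i^2\) are the estimate and variance contributed by sense \(i\) (e.g. vision, audition, haptics); the combined variance is never larger than that of the most reliable single cue.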
Preprint
Previous research using reverse correlation to explore the relationship between brain activity and presented image information found that Fusiform Face Area (FFA) activity could be related to the appearance of faces during free viewing of the Hollywood movie The Good, the Bad, and the Ugly (Hasson et al., 2004). We applied this approach to the natu...
Article
Full-text available
Long-term music training has been shown to affect different cognitive and perceptual abilities. However, it is less well known whether it can also affect the perception of emotion from music, especially purely rhythmic music. Hence, we asked a group of 16 non-musicians, 16 musicians with no drumming experience, and 16 drummers to judge the level of...
Article
Full-text available
Multisensory processing is a core perceptual capability, and the need to understand its neural bases provides a fundamental problem in the study of brain function. Both synchrony and temporal order judgments are commonly used to investigate synchrony perception between different sensory cues and multisensory perception in general. However, extensiv...
Article
Full-text available
To overcome differences in physical transmission time and neural processing, the brain adaptively recalibrates the point of simultaneity between auditory and visual signals by adapting to audiovisual asynchronies. Here, we examine whether the prolonged recalibration process of passively sensed visual and auditory signals is affected by naturally oc...
Poster
Optimal integration of multisensory information has frequently been shown to benefit perception by speeding up responses and increasing perceptual precision and accuracy. These effects, however, are often demonstrated in young adults, and to a lesser extent in older adults or children, with a scarcity of studies examining how optimal integration ch...
Chapter
“What does it mean, to see?” was the question that David Marr used to motivate his computational approach to understanding Vision (Marr, 1982). Marr's answer, building on Aristotle, was that “vision is the process of discovering from images what is present in the world, and where it is” (p. 3). Although we humans might have a preference for visual pe...
Poster
Full-text available
Humans are known to heavily rely on visual cues in detecting sensory information necessary for locomotion. In individuals with healthy vision, this modality is combined with non-visual cues such as self-motion to track their own movements over time and maintain the desired trajectory. This raises the question as to whether the visual modality can b...
Poster
Full-text available
Several behavioural studies have shown that, while adults integrate sensory information in a statistically optimal fashion, children as old as 8–10 years exhibit sensory dominance. When discriminating object size, for example, young children rely more on haptic than visual or auditory information (Gori et al., 2008; Petrini et al., 2014). However,...
Article
Using fMRI decoding techniques we recently demonstrated that early visual cortex contains content-specific information from sounds in the absence of visual stimulation (Vetter, Smith & Muckli, Current Biology, 2014). Here we studied whether the emotional valence of sounds can be decoded in early visual cortex during emotionally ambiguous visual sti...
Article
Full-text available
Human adults can optimally integrate visual and non-visual self-motion cues when navigating, while children up to 8 years old cannot. Whether older children can is unknown, limiting our understanding of how our internal multisensory representation of space develops. Eighteen adults and fifteen 10- to 11-year-old children were guided along a two-leg...
Article
Full-text available
We describe the creation of the first multisensory stimulus set that consists of dyadic, emotional, point-light interactions combined with voice dialogues. Our set includes 238 unique clips, which present happy, angry and neutral emotional interactions at low, medium and high levels of emotional intensity between nine different actor dyads. The set...
Article
To reduce sensory uncertainty, humans combine cues from multiple senses. However, in everyday life, many co-occurring cues are irrelevant to the task at hand. How do humans know which cues to ignore? And does this ability change with development? This study shows the ability to ignore cross-modal irrelevant information develops late in childhood. P...
Article
Full-text available
Human adults with normal vision can combine visual landmark and non-visual self-motion cues to improve their navigational precision. Here we asked whether blind individuals treated with a retinal prosthesis could also benefit from using the resultant new visual signal together with non-visual information when navigating. Four patients (blind for 15...
Article
Full-text available
Audiovisual perception of emotions has been typically examined using displays of a solitary character (e.g., the face-voice and/or body-sound of one actor). However, in real life humans often face more complex multisensory social situations, involving more than one person. Here we ask if the audiovisual facilitation in emotion recognition previousl...
Article
Human beings often observe other people's social interactions without being a part of them. Whereas the involvement of some brain regions (e.g. the amygdala) has been extensively examined, the role of the precuneus remains to be determined. Here we examined the involvement of the precuneus in third-person perspective of social interactio...
Conference Paper
Studying how we retrace our way in darkness provides a means of understanding the internal representations we use during navigation. In adults, path integration in darkness is influenced by previously presented visual information that conflicted with actual motion (Tcheang et al, PNAS, 2011, 108(3): 1152-7). Here we used immersive virtual reality t...
Conference Paper
The ability to integrate auditory and visual information is crucial to everyday life and there are mixed results regarding how Autism Spectrum Disorder (ASD) influences audiovisual integration. The audiovisual Temporal Integration Window (TIW) indicates how precisely sight and sound need to be temporally aligned to perceive a unitary audiovisual ev...
Article
Correctly localising sensory stimuli in space is a formidable challenge for the newborn brain. A new study provides a first glimpse into how human brain mechanisms for sensory remapping develop in the first year of life.
Conference Paper
http://iovs.arvojournals.org/article.aspx?articleid=2271615&resultClick=1
Article
Full-text available
When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g. extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory...
Conference Paper
The ability to integrate auditory and visual information is a crucial part of everyday life. The Temporal Integration Window (TIW) provides a measure of how much asynchrony can be tolerated between auditory and visual streams before one loses the perception of a unitary audiovisual event. Previous investigations of the TIW in individuals with Autis...
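For readers unfamiliar with how a TIW is quantified, the sketch below shows one common analysis: fitting a Gaussian to the proportion of "synchronous" responses across audiovisual asynchronies. This is a minimal illustration under assumed methods; the data, function names and model choice are hypothetical and not taken from the paper.

# Minimal sketch (illustrative only; not the authors' analysis pipeline) of one
# common way a Temporal Integration Window (TIW) is estimated from simultaneity
# judgments: fit a Gaussian to the proportion of "synchronous" responses as a
# function of audiovisual stimulus-onset asynchrony (SOA).
import numpy as np
from scipy.optimize import curve_fit

def sync_gaussian(soa, amplitude, pss, sigma):
    """Predicted proportion of 'synchronous' responses at a given SOA (ms).
    pss = point of subjective simultaneity; sigma indexes the TIW width."""
    return amplitude * np.exp(-((soa - pss) ** 2) / (2 * sigma ** 2))

# Hypothetical example data: SOAs in ms (negative = sound leads) and the
# observed proportion of trials judged 'synchronous' at each SOA.
soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
p_sync = np.array([0.10, 0.35, 0.80, 0.95, 0.85, 0.45, 0.15])

(amplitude, pss, sigma), _ = curve_fit(sync_gaussian, soas, p_sync,
                                       p0=[1.0, 0.0, 100.0])
print(f"PSS = {pss:.1f} ms; TIW width (sigma) = {sigma:.1f} ms")

The fitted width (or the range of SOAs over which responses exceed a criterion) is what would then be compared between groups, e.g. autistic and neurotypical participants.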
Conference Paper
To reduce sensory uncertainty and reach a decision, the brain can use sensory combination and acquired knowledge. Whereas adults can reduce uncertainty by combining sensory estimates, in recent studies children below 8 years failed to do so. Here we ask whether children’s multisensory processing is sensitive to prior knowledge about cue corresponde...
Conference Paper
Although there are major changes in spatial cognition in the first years of life, spatial abilities continue to develop significantly through mid-childhood (e.g. Nardini et al, Current Biology 2008). Research with adults using immersive virtual reality allows for precise cue control, and for use of sensory conflicts to study interactions between cu...
Article
Full-text available
Synchrony judgments involve deciding whether cues to an event are in synch or out of synch, while temporal order judgments involve deciding which of the cues came first. When the cues come from different sensory modalities these judgments can be used to investigate multisensory integration in the temporal domain. However, evidence indicates that th...
Chapter
Many discussions of biological motion perception involve a description of observers' attunements for recognizing gender, emotion, action, and identity from point-light displays. This chapter describes an often-neglected determinant of biological motion perception: the role of expertise. First, the authors describe how variability among observers is...
Conference Paper
In everyday life, localising a sound source is often necessary for our survival. It is now well known that when visual localisation is good and informative, vision dominates and captures sound (e.g. Alais and Burr, 2004). What is still unclear is whether an uninformative visual signal can bias sound localisation. Optimal use of sensory signals for...
Conference Paper
Full-text available
Background / Purpose: We investigated whether emotional context as communicated through auditory stimulation can influence the activity patterns in early visual cortex. Main conclusion: When viewing an emotionally ambiguous interaction between two point-light walkers, activity patterns in early visual areas are influenced by emotional informat...
Article
Full-text available
There is a lack of studies examining the effectiveness of some of the commonly used instruments to detect the presence of social anxiety disorder (SAD) in Arab-speaking populations, such as those in Oman. The aim of this study was to establish the influence of social anxiety and the role of gender among Omani adolescents. Methods: A two-phase prot...
Chapter
Full-text available
The ability to successfully integrate information from different senses is of paramount importance for perceiving the world and has been shown to change with experience. We first review how experience, in particular musical experience, brings about changes in our ability to fuse together sensory information about the world. We next discuss evidence...
Conference Paper
Multimodal perception of emotions has been typically examined using displays of a solitary character (e.g., the face–voice and/or body–sound of one actor). We extend investigation to more complex, dyadic point-light displays combined with speech. A motion and voice capture system was used to record twenty actors interacting in couples with happy, a...
Conference Paper
To perform everyday tasks, such as crossing a road, we greatly rely on our sight. However, certain situations (e.g., an extremely dark environment) as well as visual impairments can either reduce the reliability of or completely remove this sensory information. In these cases, the use of other information is vital. Here we seek to examine the devel...
Book
The ability to successfully integrate information from different senses is of paramount importance for perceiving the world and has been shown to change with experience. We first review how experience, in particular musical experience, brings about changes in our ability to fuse together sensory information about the world. We next discuss evidence...
Data
Example of one region detected by using interaction analysis AV>A+V. The region (left parahippocampal gyrus: x = −25; y = −18; z = −14) is presented on the left and the relative time courses on the right. L = left hemisphere; R = right hemisphere. (TIF)
Data
The axial slices show activation bilaterally in auditory (right: = 6.86, p = 0.00001, two-tailed; 53, −14, 5 (x, y, z); 11336 voxels; left: = 6.23, p = 0.00002, two-tailed; −48, −20, 7 (x, y, z); 6781 voxels) and visual areas (right: = 6.59, p = 0.00002, two-tailed; 32, −76, −8 (x, y, z); 24445 voxels; left: = 6.41, p = 0.00002, two-tailed; −31, −8...
Article
Full-text available
In humans, emotions from music serve important communicative roles. Despite a growing interest in the neural basis of music perception, action and emotion, the majority of previous studies in this area have focused on the auditory aspects of music performances. Here we investigate how the brain processes the emotions elicited by audiovisual music p...
Article
Full-text available
Aim: A temporal relationship exists between the presence of affective disturbance, poor glycaemic control and complications in people with type-2 diabetes. The objective of this study is to compare the performance of patients diagnosed with type-2 diabetes and a normoactive group on indices of mood functioning and indices of health-related quality of...
Conference Paper
Multimodal aspects of non-verbal communication have thus far been examined using displays of a solitary character (e.g. the face-voice and/or body-sound of one actor). We extend investigation to more socially complex dyadic displays using point-light displays combined with speech sounds that preserve only prosody information. Two actors were record...
Article
Single cell data from macaque suggest special processing of the sights and sounds of biological actions (Kohler, Keysers, et al, Science 2002). Recently Arrighi, Alais & Burr (JOV, 2006) have examined this hypothesis using judgments of perceptual synchrony of audio and visual streams of conga drumming as well as with synthetic audio and visual stre...
Article
As an object is always viewed through some media (atmosphere), the intensity of light coming to the eye (L) is affected by both the object reflectance (r) and the media transmittance (t): L = I·t·r + a. Here I is the incident light intensity and a is the scattered light intensity. A perceptual correlate of r is the object's lightness. A perceptual cor...
Article
Full-text available
Biological motion, in the form of point-light displays, is usually less recognizable and coherent when shown from a less natural orientation, and evidence of this disruption was recently extended to audiovisual aspects of biological motion perception. In the present study, eight drummers and eight musical novices were required to judge either the a...
Article
In the present study we applied a paradigm often used in face-voice affect perception to solo music improvisation to examine how the emotional valence of sound and gesture are integrated when perceiving an emotion. Three brief excerpts expressing emotion produced by a drummer and three by a saxophonist were selected. From these bimodal congruent di...
Article
Full-text available
We investigated the effect of musical expertise on sensitivity to asynchrony for drumming point-light displays, which varied in their physical characteristics (Experiment 1) or in their degree of audiovisual congruency (Experiment 2). In Experiment 1, 21 repetitions of three tempos × three accents × nine audiovisual delays were presented to four ja...
Article
The ability to predict the effects of actions is necessary to behave properly in our physical and social world. Here, we describe how the ability to predict the consequence of complex gestures can change the way we integrate sight and sound when relevant visual information is missing. Six drummers and six novices were asked to judge audiovisual syn...
Article
Full-text available
Logvinenko and Maloney (2006) measured perceived dissimilarities between achromatic surfaces placed in two scenes illuminated by neutral lights that could differ in intensity. Using a novel scaling method, they found that dissimilarities between light surface pairs could be represented as a weighted linear combination of two dimensions, "surface li...
Article
This study examined age-related differences in speaker similarity judgments, in which acoustic cues known to be important in speaker and speech recognition and identification were varied. Four groups of listeners of 5- to 6-, 8- to 9-, 10- to 11-, and 25- to 30-year-olds were asked to judge the similarity between an original talker's speech sample...
Article
Two different versions of Adelson's snake lightness illusion are quantitatively investigated. In one experiment an additive version of the illusion is investigated by varying the additive component of the atmosphere transfer function (ATF) introduced by Adelson [2000, in The New Cognitive Neuroscience Ed. M Gazzaniga (Cambridge, MA: MIT Press) pp 3...
Article
Achromatic transparency in 2-D surfaces composed of three adjacent areas, one created from the others, occurs when in the created area it is possible to see the two colours of the adjacent areas. Displays with two white and black intersecting bars were produced to verify the possibility of perceiving transparency in the intersection area when this...
