
Spatial Hearing Perception in Ecologically-Valid Conditions: Multisensory and Motor Contributions in Auditory Perception

MISSONI, FULVIO
2025

Abstract

Spatial hearing allows us to detect and interpret sounds from out of sight, guiding attention and facilitating interaction with the environment. It relies on the brain's capacity to efficiently extract spatial information from sounds and integrate it with signals from other senses (e.g., vision, proprioception) and with actions. While simplified paradigms have advanced our understanding of spatial hearing, their lack of ecological validity limits their applicability to real-life conditions. Sound waves interact with the body and the environment in ways that depend on their spatial origin. The filtering properties of the upper body, described by the head-related transfer function (HRTF), provide the binaural and monaural auditory cues essential for sound localisation. At the same time, acoustic characteristics of the environment, such as its absorption and reflection properties, alter the soundscape as a function of the environment's structure. Despite this intrinsic complexity, the brain has developed different perceptual and behavioural strategies to sustain spatial hearing in a variety of scenarios, using directed actions to exploit the available information optimally. Understanding how we behave in such natural conditions requires taking this complexity into account. This project addresses this gap by exploring spatial auditory perception in ecologically valid contexts, combining spatially tracked head-mounted displays (HMDs) with virtual spatial sounds. More specifically, we investigated the interplay between spatial hearing and actions, gaining insight into how individual listener abilities and the acoustic environment shape these behaviours in realistic experimental setups. In the first experiment, we studied how acoustic conditions influence spatial hearing abilities and listening strategies in virtual reality scenarios.
Two groups were tested in an active localisation task under two different acoustic conditions: an anechoic room and a reverberant one. Participants were instructed to localise the source and point their head toward the perceived location at the end of the stimulus playback. No instructions were given about which listening strategy to use. In the second experiment, we investigated the impact of altered individual auditory cues and of reverberation in an audiovisual search task. Listeners were instructed to localise an audiovisual stimulus in a 3D space filled with visual distractors. They were tested in four acoustic conditions, combining two HRTFs (individual and non-individual) with two levels of reverberation (anechoic and reverberant). Response times and head-trajectory data were collected. Results of the first experiment showed an increased propensity for an active strategy in the reverberant group, suggesting a relation between strategy and acoustic environment. Despite the different strategies, localisation performance was comparable between groups. In the second experiment, we found that listeners were significantly quicker and moved more precisely when auditory cues were individualised and reverberation was absent; altered auditory cues (i.e., a non-individualised HRTF and/or the presence of reverberation) worsened performance. Overall, the results indicate that reverberation affects the strategy used to localise sounds, suggesting an adaptive mechanism by which the brain reduces the uncertainty introduced by reverberation. Furthermore, altered auditory cues were found to affect both localisation and listening strategy. This work offers a new understanding of the multisensory nature of spatial hearing, showing how the acoustic context shapes the listening strategy (i.e., head movements) adopted in realistic settings.
These findings highlight the importance of taking listening strategies and the acoustic context into account in hearing assessment tests.
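As illustrative background for the HRTF-derived binaural cues discussed in the abstract, the dominant low-frequency cue, the interaural time difference (ITD), is often approximated with the classic spherical-head (Woodworth) model. The sketch below is not part of the thesis; the head radius and speed of sound are assumed typical values:

```python
import math

def woodworth_itd(azimuth_deg, head_radius=0.0875, c=343.0):
    """Spherical-head (Woodworth) approximation of the interaural
    time difference (ITD), in seconds, for a far-field source.

    azimuth_deg: source azimuth (0 = straight ahead, 90 = fully lateral)
    head_radius: assumed head radius in metres (~8.75 cm is typical)
    c: speed of sound in air, m/s
    """
    theta = math.radians(azimuth_deg)
    # Path-length difference between the two ears: a*(theta + sin(theta))
    return (head_radius / c) * (theta + math.sin(theta))

# A frontal source produces no ITD; a fully lateral source produces
# roughly 0.66 ms, close to the commonly cited human maximum.
for az in (0, 30, 60, 90):
    print(f"azimuth {az:3d} deg -> ITD {woodworth_itd(az) * 1e6:6.1f} us")
```

This first-order model captures why head rotations (the active listening strategies studied here) are informative: turning the head changes the effective azimuth and hence the ITD, helping to disambiguate source locations that static cues leave uncertain.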
27 May 2025
English
CANESSA, ANDREA
MASSOBRIO, PAOLO
Università degli studi di Genova
File: phdunige_4068406.pdf (open access, 3.39 MB, Adobe PDF)

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/211090
The NBN code of this thesis is URN:NBN:IT:UNIGE-211090