Seeing and hearing emotions: Multimodal affective perception in preverbal infants

JIA, CHUCHU
2026

Abstract

The ability to perceive and integrate affective information across sensory modalities is fundamental to infants' emerging socioemotional development. In the first year of life, preverbal infants rely heavily on nonverbal cues, such as facial expressions and vocal prosody, to interpret emotions and navigate social interactions. Understanding how infants process these multimodal emotional signals provides critical insight into the developmental foundations of emotion perception and social communication. This thesis investigates the developmental trajectory of multimodal emotion perception in infancy through three complementary studies: a systematic review, a behavioral experiment, and a neuroimaging study. Study 1 presents a systematic review synthesizing empirical evidence on infants' multimodal perception of affective information across facial and vocal modalities during the first year of life. Findings from 47 experiments indicate that by 5 to 7 months of age, infants are sensitive to the congruency between facial and vocal expressions, often exhibiting a preference for happy stimuli. The review further highlights that dynamic and socially relevant stimuli enhance intermodal matching performance and that neural responses differentiate between multimodal emotion conditions. Study 2 reports a behavioral experiment in which 35 infants aged 6 to 8 months completed a visual preference-based intermodal matching task assessing their ability to match emotions across sensory modalities. Infants were presented with face–voice emotional pairings under three conditions: happy–angry, happy–neutral, and angry–neutral. Results indicate that infants reliably matched congruent happy expressions, showing a visual preference for happy faces when paired with a happy voice. Moreover, emotion-matching performance correlated significantly with both age and family expressiveness, as measured by parent-report questionnaires, suggesting that developmental stage and environmental factors jointly contribute to early emotion perception. Study 3 investigates the neural mechanisms underlying infants' sensitivity to multimodal emotional congruency using functional near-infrared spectroscopy (fNIRS). Infants aged 6 to 7 months are presented with congruent (e.g., happy face + happy voice) and incongruent (e.g., happy face + angry voice) emotional stimuli while hemodynamic responses are recorded from bilateral temporal and prefrontal regions. Results are expected to show differential neural activation between congruent and incongruent conditions, indicating that infants detect the correspondence between facial and vocal affective cues. This study aims to provide neurophysiological evidence of early affective integration and to identify the cortical regions involved in processing multimodal emotional information during infancy. Taken together, these studies offer converging evidence for the early emergence of multimodal emotion perception in infancy and its modulation by both developmental and social-environmental factors. The findings contribute to a growing understanding of how infants begin to interpret complex affective information and may inform future research into the neural and experiential mechanisms that support early socioemotional development.
Date: 23 March 2026
Language: English
Supervisor: FARRONI, TERESA
Università degli studi di Padova
Files in this record:
Final_Thesis_Chuchu_Jia.pdf (Adobe PDF, 6.61 MB), open access
License: All rights reserved

Documents in UNITESI are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14242/362210
The NBN code of this thesis is URN:NBN:IT:UNIPD-362210