Objective: The study investigated the simultaneous processing of emotional tone of voice and emotional facial expression using event-related potentials (ERPs), across a wide range of emotions (happiness, sadness, fear, anger, surprise, and disgust). Participants and Methods: Auditory emotional stimuli (a neutral word pronounced in an affective tone) and visual stimuli (an emotional facial expression) were matched in congruous (the same emotion in face and voice) and incongruous (different emotions) pairs. Forty-one subjects were presented with these patterns. Results: ERP variations and behavioral data (response times, RTs) were submitted to repeated-measures analysis of variance. We considered the 150-250 ms post-stimulus time window in order to explore the cognitive processes (long-latency ERP variations) in addition to the perceptual processes (early ERP variations) fully investigated by previous research. The ANOVA revealed two distinct ERP effects: a negative deflection (N2) with a more anterior distribution (Fz), and a positive deflection (P2) with a more posterior distribution, each serving different cognitive functions. Conclusions: P200, which is highly sensitive to the congruent/incongruent condition of the pattern (with larger amplitude for congruent than for incongruent stimuli), constitutes a marker of intersensory integration. In contrast, N2, which is more sensitive to the type of emotion, may be a marker of the emotional content displayed. Finally, the behavioral results indicate that congruence produces an RT reduction for some emotions (sadness) and an inverse effect for others (fear, anger, surprise). This result is discussed with reference to the different adaptive functions of the emotional correlates.
|Number of pages||1|
|Journal||Journal of the International Neuropsychological Society|
|Publication status||Published - 2007|
- cross-modal perception